I would be interested in seeing the code churn levels of these PRs. Measuring "number of pull requests" is about as insightful as measuring "number of lines of code". I would have hoped a company specializing in software developer tooling would know that ...
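For what it's worth, a rough sketch of the kind of churn metric I mean (lines added plus lines deleted per commit range, pulled straight from git history; the repo path and revision range below are just placeholders, and this is obviously not what the study measured):

    # Crude churn estimate: sum of lines added + deleted over a revision range.
    import subprocess

    def churn(repo_path: str, rev_range: str = "HEAD~50..HEAD") -> int:
        """Sum lines added and deleted across commits in rev_range."""
        out = subprocess.run(
            ["git", "-C", repo_path, "log", rev_range, "--numstat", "--format="],
            capture_output=True, text=True, check=True,
        ).stdout
        total = 0
        for line in out.splitlines():
            parts = line.split("\t")
            # --numstat lines are "added<TAB>deleted<TAB>path"; binary files show "-"
            if len(parts) == 3 and parts[0].isdigit() and parts[1].isdigit():
                total += int(parts[0]) + int(parts[1])
        return total

    if __name__ == "__main__":
        print(churn(".", "HEAD~20..HEAD"))

It's a blunt proxy, but at least it says something about how much code is actually moving, rather than how many PRs got opened.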
This other study looked a bit at the impact on the code longer term: https://news.ycombinator.com/item?id=45968758
thanks for the link. here are the core findings quoted from the paper, which to me read as "yes, it produces slop fast":
Finding 1: The DiD models suggest that the adoption of Cursor only leads to a significant and large velocity gain in the short term (i.e., first two months) in open-source projects.
Finding 2: The DiD models suggest that the adoption of Cursor leads to a sustained accumulation of static analysis warnings and a sustained increase in code complexity.
Finding 3: The dynamic panel GMM models suggest that: (1) the adoption of Cursor leads to an inherently more complex codebase; (2) the accumulation of static analysis warnings and code complexity decreases development velocity in the future.