Slow code execution is often due to poor data structure choices. A few days ago, I wrote a feature to handle user relationships, and it became painfully slow with large data volumes. While I was struggling with it, I tried the Cursor code analysis feature, and unexpectedly, it quickly identified the problem and taught me several performance optimization techniques.
From Array Lookup to Hash Table
The original code used an array to store user relationships, requiring a full traversal for each lookup, which was extremely slow:
const userRelations = [];
function addRelation(user1, user2) {
  userRelations.push({ user1, user2 });
}
function findRelation(user1, user2) {
  // Linear scan that checks both orderings on every call: O(n) per lookup
  return userRelations.find(
    relation =>
      (relation.user1 === user1 && relation.user2 === user2) ||
      (relation.user1 === user2 && relation.user2 === user1)
  );
}
Cursor shook its head at this and suggested using a hash table for storage:
const userRelations = new Map();
function addRelation(user1, user2) {
  // Sort the pair so (a, b) and (b, a) produce the same key
  const key = [user1, user2].sort().join(':');
  userRelations.set(key, { user1, user2 });
}
function findRelation(user1, user2) {
  const key = [user1, user2].sort().join(':');
  return userRelations.get(key);
}
This change takes each lookup from O(n) to O(1) on average, which makes a huge difference once there are many relationships.
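A quick sanity check (the user IDs here are made up for illustration): because the pair is sorted before building the key, the lookup works no matter which order the arguments come in.
addRelation('alice', 'bob');
findRelation('alice', 'bob'); // { user1: 'alice', user2: 'bob' }
findRelation('bob', 'alice'); // same entry: both orders build the key 'alice:bob'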
Avoid Redundant Calculations
When processing data, I tend to compute everything on the fly, but the AI pointed out that this wastes work whenever the same values get computed more than once:
// Original code
function processUserData(users) {
  return users.map(user => {
    const friends = getFriendCount(user.id); // recalculated each time
    const posts = getPostCount(user.id);     // this too
    return {
      ...user,
      score: calculateScore(friends, posts)
    };
  });
}
Cursor suggested caching the intermediate results, so each user's statistics are computed at most once even if the same id appears again:
function processUserData(users) {
  // Precompute and store each user's statistics exactly once
  const friendsCache = new Map();
  const postsCache = new Map();
  users.forEach(user => {
    if (!friendsCache.has(user.id)) {
      friendsCache.set(user.id, getFriendCount(user.id));
      postsCache.set(user.id, getPostCount(user.id));
    }
  });
  return users.map(user => ({
    ...user,
    score: calculateScore(
      friendsCache.get(user.id),
      postsCache.get(user.id)
    )
  }));
}
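The same idea can also be packaged as a small memoization helper wrapped around the expensive functions themselves. This is just a sketch, and it assumes getFriendCount and getPostCount are synchronous and take a single id:
// Minimal memoization helper: caches each result by its argument,
// so repeated calls with the same id never recompute.
function memoize(fn) {
  const cache = new Map();
  return arg => {
    if (!cache.has(arg)) {
      cache.set(arg, fn(arg));
    }
    return cache.get(arg);
  };
}
const cachedFriendCount = memoize(getFriendCount);
const cachedPostCount = memoize(getPostCount);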
Reduce Memory Usage
When processing large amounts of data, memory usage can also be a concern. The original code stored all data in memory:
function analyzeUserBehavior(logs) {
  const allLogs = [];
  // Parse every log into memory first
  for (const log of logs) {
    allLogs.push(parseLog(log));
  }
  // Then analyze the full array
  return processLogs(allLogs);
}
The AI suggested using a generator so each log is parsed and consumed one at a time, instead of materializing the whole array first:
function* logGenerator(logs) {
  for (const log of logs) {
    yield parseLog(log);
  }
}
function analyzeUserBehavior(logs) {
  const result = {
    totalActions: 0,
    uniqueUsers: new Set()
  };
  for (const log of logGenerator(logs)) {
    result.totalActions++;
    result.uniqueUsers.add(log.userId);
  }
  return result;
}
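From the caller's side nothing changes; rawLogLines and the shape of the parsed entries are assumptions here, just to show the call:
// Hypothetical usage: entries are parsed lazily as the loop consumes them
const stats = analyzeUserBehavior(rawLogLines);
console.log(stats.totalActions, stats.uniqueUsers.size);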
Efficient Bit Manipulation
Bit manipulation is a great technique, both fast and memory-efficient. The AI noticed that I was storing permissions as an array of strings and suggested switching to a bit mask:
// Before: slow and memory-heavy
const userPermissions = ['read', 'write', 'delete'];
// After: each permission is a single bit, fast and space-efficient
const PERMISSION = {
  READ: 1 << 0,   // 1
  WRITE: 1 << 1,  // 2
  DELETE: 1 << 2  // 4
};
let permissions = PERMISSION.READ | PERMISSION.WRITE;
// Check a permission with a bitwise AND
if (permissions & PERMISSION.READ) {
  console.log('Read permission granted');
}
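Granting and revoking work the same way, each as a single bitwise operation; a quick sketch continuing from the mask above:
permissions |= PERMISSION.DELETE;  // grant: set the DELETE bit
permissions &= ~PERMISSION.WRITE;  // revoke: clear the WRITE bit
const canWrite = (permissions & PERMISSION.WRITE) !== 0; // false after the revoke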
Applying these optimization techniques significantly boosted my code's performance. Optimization needs balance, though: readability and maintainability matter just as much, and overly clever optimizations can create their own pitfalls.
When writing code, don't focus only on getting it working; performance problems have to be faced sooner or later. Performance optimization is also genuinely interesting, because every problem solved teaches something new. AI is a great helper, but in the end you have to understand the underlying principles yourself.
Remember: use a hash table instead of an array when you need fast lookups, cache results instead of recalculating them, and stream data with a generator instead of loading everything into memory. Programming is about attention to detail.