"It's like building upon affiliate marketing income until we could reach that budget to spend."

Managing that level of risk with a team of just five people meant every data point mattered, every decision had to be precise, and every dollar had to be tracked. That experience taught him the critical importance of having real-time access to performance data for his newsletter, Stacked Marketer.

When Stacked Marketer grew to 100,000 subscribers across three newsletters (Stacked Marketer, Psychology of Marketing, and [Tactics]) without him becoming a personal brand or influencer ("I'm not sure if 1% of our readers will be able to say that I am the founder"), he knew he needed the same level of data precision for his newsletter business. In particular, he wanted accurate data about the marketing costs of growing it, such as his PPC campaigns.

The $1,000 monthly problem

Cinca's data analyst was doing exactly what he was paid to do: pulling reports from marketing software, cleaning up UTM source data, and creating weekly and monthly reports with Google Data Studio and spreadsheets.

Redacted campaign report provided by a human analyst

The analyst also had to handle data inconsistencies: some traffic sources used different field names, which required manual association to ensure uniform reporting. To work around these limitations, the team relied on manual exports, data manipulation, and pivot tables.

The real problem wasn't just the manual work; it was the frequency. "If you have to put in too much work to get that data, you're less likely to be using the reporting as frequently. And this is possible from having this data always present and easily accessible rather than us having to dig through it all the time." Without easy access to the data, decisions got delayed.

"As AI tools got better and better, I started thinking: can we have a way to just pull this data because it's right there through the API? Just have some code process this for us?"
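The manual association the analyst performed, mapping the different field names each traffic source uses onto one canonical label, is the kind of cleanup a small script can automate. A minimal sketch of the idea; the alias table and row fields here are hypothetical, not taken from Cinca's actual setup:

```python
# Hypothetical sketch: normalizing inconsistent utm_source values
# across traffic sources before aggregating a report.
# The alias mappings and sample rows are illustrative only.

SOURCE_ALIASES = {
    "fb": "facebook",
    "facebook.com": "facebook",
    "goog": "google",
    "google ads": "google",
}

def normalize_source(raw: str) -> str:
    """Map a raw utm_source value onto a canonical channel name."""
    key = raw.strip().lower()
    return SOURCE_ALIASES.get(key, key)

rows = [
    {"utm_source": "fb", "clicks": 120},
    {"utm_source": "Facebook.com", "clicks": 80},
    {"utm_source": "goog", "clicks": 200},
]

# Aggregate clicks per canonical source, instead of hand-matching
# variants in a pivot table.
totals: dict[str, int] = {}
for row in rows:
    source = normalize_source(row["utm_source"])
    totals[source] = totals.get(source, 0) + row["clicks"]

print(totals)  # {'facebook': 200, 'google': 200}
```

Once the mapping lives in code, uniform reporting stops depending on someone remembering which variants belong together.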
His reasoning was logical: "It's always the same."

But Cinca needed more than just code; he needed to know how to run it. "It was like, okay, so take these exact steps, set these things up, then take your code from here to there to use it." The AI provided step-by-step instructions for:

- Setting up hosting on Google Cloud
- Configuring the development environment
- Deploying the code
- Creating authentication systems

"I can explain my problem, or even if I have a decent understanding of the steps that I have to take, I can even tell the steps that I want to take, and then my words will be turned into code by the AI."

The initial result was the "Activate" stage, where Cinca was able to see his first prototype. "This is where AI proved invaluable beyond just writing code."

Step 3: Learning to be precise

Initially, Cinca approached AI like a human conversation. He learned to be explicit instead: "You have to explain, 'look for this specific field.' It's better to repeat yourself and say too much rather than not enough."

Step 4: Debugging with AI as his partner

When code didn't work as expected, Cinca developed a systematic debugging approach with AI. His formal programming background was minimal: "The most I've done in high school was just having your C environment with a compiler, and you just press compile, and then it just pops up your extra window with the mini app that you just wrote." Yet he successfully built:

- Backend system: Python scripts that pull data from the tool's API and process it
- Frontend interface: Web-based dashboard displaying data in interactive tables
- Authentication: Google Workspace integration so team members can log in automatically
- Hosting: Google Cloud deployment with subdomain setup

"So it's kind of two parts. So there's the part of like a backend that pulls the data, processes some of it, and then that gets sent to the front end, which then does a little bit more processing to present it in a nice table."
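The two-part split Cinca describes can be sketched in a few lines of Python: a backend function that pulls raw campaign rows from a reporting API, and a processing step that derives the metrics the frontend table displays. The endpoint URL, token, and field names below are assumptions for illustration, not his actual API:

```python
# Minimal sketch of the backend half of the dashboard: pull raw
# campaign data from a reporting API, derive display metrics, and
# hand the processed rows to the frontend. Endpoint, token, and
# field names are hypothetical.
import json
from urllib.request import Request, urlopen

API_URL = "https://api.example.com/v1/campaigns"  # placeholder endpoint
API_TOKEN = "REPLACE_ME"

def fetch_campaigns() -> list[dict]:
    """Backend: pull raw campaign rows from the reporting API."""
    req = Request(API_URL, headers={"Authorization": f"Bearer {API_TOKEN}"})
    with urlopen(req) as resp:
        return json.load(resp)

def process(rows: list[dict]) -> list[dict]:
    """Backend: compute the metrics the dashboard table needs."""
    out = []
    for row in rows:
        spend = row.get("spend", 0.0)
        clicks = row.get("clicks", 0)
        out.append({
            "campaign": row.get("name", "unknown"),
            "spend": spend,
            "clicks": clicks,
            # Cost per click, guarded against division by zero.
            "cpc": round(spend / clicks, 2) if clicks else None,
        })
    return out

# The processed rows are what the frontend renders as an interactive table.
sample = [{"name": "Newsletter promo", "spend": 150.0, "clicks": 300}]
print(process(sample))
```

Keeping the heavy lifting in the backend, as the quote describes, means the frontend only does light formatting before presenting the table.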
The authentication process to get access to the tool is particularly elegant: "If you use Google Workspace, which many people do, you can just put Goog