AI

    Vibing a wardriving visualizer

    A while ago someone was showing off their LEGO creation in my social media feed: a brick-built QR code containing the credentials to their WiFi network. People rushed to tell them that they shouldn’t share this info publicly, but they appeared unconcerned: who would even know where their network was physically located? You can’t harm a network you can’t find. But could you find it? Surely there are databases of networks available online?

    It turns out there are indeed services like WiGLE that catalogue WiFi access points, Bluetooth devices, and cellphone towers around the world. WiGLE relies on people using its Android app and submitting the networks they find to the site’s database. Out of curiosity I installed the app and did a bike ride of some 25 km around the city, and much to my surprise logged a whopping 5000 WiFi networks and some 10000 Bluetooth devices! A revealing moment came when I stopped on a bridge over a highway and looked at the app: it showed Bluetooth devices with names like Audi, BMW, and Toyota. Almost all new cars can be seen as Bluetooth devices!

    Now, I had collected a sample of networks, but how would I view it? The WiGLE app itself does not provide a map view, so I did what anyone would do these days and vibe-coded my own app. I started by typing a stream of thoughts to ChatGPT: I want it to be a web page, not an app you need to install, and I want it to work fully in the user’s browser, not uploading the data to any servers. The app should provide a map with my route and the network observations, along with some playback controls, etc. Chad then turned all this into a proper requirements document that I handed to the ChatGPT Codex coding agent after creating an empty GitHub repository for the project. Codex crunched the assignment for five minutes and burped out a pull request. I then set the repository up to publish automatically to GitHub Pages and merged the PR, and lo and behold, it actually worked immediately! After exporting my observations out of the WiGLE app I could follow my route and see all the observations on the map.
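    The first thing such a page has to do is parse the WiGLE export. For illustration, here is a minimal sketch of that step; the column names follow a typical WiGLE CSV export (an assumption on my part, and your export may differ), and this is not the actual code Codex generated:

    ```python
    import csv
    import io

    def parse_wigle_export(text):
        """Parse a WiGLE CSV export into a list of observation dicts.

        Assumes the common WiGLE layout: one file-metadata line
        ("WigleWifi-...") followed by a normal CSV header row.
        """
        lines = text.splitlines()
        # Skip the first line: it is file metadata, not CSV data.
        reader = csv.DictReader(io.StringIO("\n".join(lines[1:])))
        observations = []
        for row in reader:
            observations.append({
                "ssid": row.get("SSID", ""),
                "type": row.get("Type", ""),  # e.g. WIFI, BT, BLE
                "lat": float(row["CurrentLatitude"]),
                "lon": float(row["CurrentLongitude"]),
                "rssi": int(row["RSSI"]),
            })
        return observations
    ```

    From there, plotting is just a matter of handing the coordinate pairs to a mapping library and stepping through them by timestamp for playback.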

    Read More →

    Random Things Sunday #15: Vibe coding

    This week’s main topic is vibe coding, but first there’s an aasinsilta, a “donkey’s bridge” as we say in Finnish, to get there:

    • Vincent Ritter put together neat API documentation for Micro.blog at microblog.dev. I’ve been working on my own client, so this came at a very good time. I’ll report on my new client some time in the future.
    • The new client I’ve been working on has been vibe-coded, which brings me to this new site for the Outcome Engineering Manifesto. It lays down some very sensible ground rules for deliberate coding with agents, emphasizing techniques such as measuring, prioritizing, and risk management.
    • Matt Webb calls for better discoverability of vibe-coded apps, which may be hyper-specific to their creators' needs. I think that’s a great point, for I certainly wouldn’t have started creating my own blog client without agents, but now I can customize my writing experience just the way I want. It would also simply be nice to get a picture of what people are building from a single source.
    • Finally, the comic relief: OpenAI reports how their models became overly fascinated with goblins. 👺

    Bluesky bot to report speeding buses

    The people in my neighborhood Facebook group are often worried about cars speeding on the main road that goes through the area. While I can’t do much about that, I realized that I can at least monitor how fast the buses are going and make that visible, thanks to a realtime high-frequency positioning API provided by HSL, which runs the public transport system here. This was around June 2020. I implemented a simple Java application to monitor those buses on the street, but the MQTT library I used proved unreliable, so I put the project on the back burner. My idea was to eventually make it into a Twitter bot, but I never got that far back then.
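    The core of the bot is simple: HSL’s high-frequency positioning feed delivers JSON messages over MQTT, and each vehicle-position message carries the speed in metres per second. A minimal sketch of the speed check, assuming field names from HSL’s HFP v2 format (verify against the current docs; the speed limit here is a made-up example):

    ```python
    import json

    SPEED_LIMIT_KMH = 40  # assumed limit on the monitored street

    def check_speeding(payload, limit_kmh=SPEED_LIMIT_KMH):
        """Inspect one vehicle-position message and report speeding.

        The "VP" object and its "spd" (m/s) and "desi" (route number)
        fields follow HSL's HFP v2 message format.
        """
        vp = json.loads(payload).get("VP", {})
        spd = vp.get("spd")
        if spd is None:
            return None
        kmh = spd * 3.6  # convert m/s to km/h
        if kmh > limit_kmh:
            return f"Bus {vp.get('desi', '?')} doing {kmh:.0f} km/h (limit {limit_kmh})"
        return None
    ```

    Subscribe to the relevant MQTT topics, run each message through a check like this, and the bot has its material.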

    Fast forward a couple of years to September 2022, two months before ChatGPT was launched. TypeScript was all the rage back then, but I had not had the opportunity to use it at work yet, so it occurred to me that I could revisit my old bus speed tracker in TypeScript.

    Read More →

    Using AI at work

    A young person outside the software business recently asked me if I use AI in my work. My knee-jerk reaction was “no”: I don’t trust AI agents running amok in my codebase. I’ve tried it, and it often feels like herding cats, unless your prompt is perfect and you managed to take everything into account, which isn’t possible. But then I started backpedaling, as there are of course nuances to this: it’s not all-or-nothing.

    Read More →

    Random Things Sunday #10

    Some random findings from the internets, this time catering to programmer-minded people:

    • My former colleague Robert turned out to be quite a penman. He wrote a cool short sci-fi story called Null and void and published it in his blog. Check it out!
    • Darwin Awards collects events where people have removed themselves from the gene pool by doing something stupid. The history of the site goes back to the 1990s, so you may have heard about it already, but this year they’ve got a new category for AI Darwin Awards. This new category honours the visionaries who looked at AI and thought, “You know what this needs? Less safety testing and more venture capital!”
    • Finally, you thought you knew what an email address looks like? The E-mail.wtf quiz is here to prove you wrong! I scored 14/21…

    A light exercise in vibe coding

    A few months ago I tried vibe coding an ant colony simulator, but in the end it didn’t work out. I did mention that I’d had successes with vibe coding, though, and lately I had one of those again.

    For my previous post I needed to embed several YouTube videos. That, however, is very cumbersome when you’re blogging from your phone: unlike the desktop version, the mobile version of YouTube offers no easy way to copy the embed code, so this was about to turn into an exercise in frustration.
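    What I needed boils down to one transformation: dig the video id out of whatever link the mobile app shares and wrap it in an iframe. A minimal sketch of that idea (the URL patterns and iframe attributes here are my own assumptions, not the actual generated code):

    ```python
    import re

    def youtube_embed(url, width=560, height=315):
        """Turn a YouTube link (watch, short youtu.be, embed, or shorts
        form) into an <iframe> embed snippet."""
        # YouTube video ids are 11 characters of this alphabet.
        match = re.search(
            r"(?:youtu\.be/|v=|/embed/|/shorts/)([A-Za-z0-9_-]{11})", url)
        if not match:
            raise ValueError(f"no video id found in {url!r}")
        video_id = match.group(1)
        return (f'<iframe width="{width}" height="{height}" '
                f'src="https://www.youtube.com/embed/{video_id}" '
                f'frameborder="0" allowfullscreen></iframe>')
    ```

    A function this small is exactly the kind of thing an agent can produce in one shot from a plain-language prompt.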

    But lo and behold, ChatGPT to the rescue. I gave it this prompt:

    Read More →

    Are you really using AI?

    Yesterday I was scrolling through my feed when I came across a short video by Allie K. Miller, who’s something of a thought leader in the AI business space. She mentioned that you probably use AI already, but asked if you’re really using it? AI can enable so many novel use cases that merely enhancing your old processes is kind of wasting its full potential.

    It made me stop and think. Like many, I’ve been using AI tools here and there - asking ChatGPT to help with writing, generating some ideas, maybe summarizing content. But am I actually reimagining how I work?

    Allie mentioned how after 10 years of creating AI content (7 of those without AI tools), she completely rebuilt her workflow. Now she dictates to Otter AI while walking, runs her content through multiple AI systems, and formats everything in Beehiiv with auto-links. More steps, but apparently 80% time saved. That’s not just enhancement.

    I wonder if most of us are in that “scratching the surface” phase with AI that Allie mentions. We’re asking AI to help with tasks we already do, rather than rethinking what tasks we should be doing in the first place.

    I’m obviously not using AI to its fullest potential either. My current workflows still look pretty much like they did a couple of years ago, with some AI assistance sprinkled in rather than built around them. There’s something to think about there.

    (By the way, I don’t often subscribe to newsletters, but I do follow Allie’s work. If any of this piqued your interest, you might want to check out her newsletter at AI with Allie.)

    Vibe coding an ant colony simulator

    When it comes to household chores, ironing is one of my favorites, as it often allows me to shorten my YouTube “Watch later” playlist (771 videos there as of this writing…). Last time I did ironing I watched this cool video of an ant colony simulator:

    The fun thing is that it creates fascinating emergent behavior from a few simple rules. The ants walk around randomly, leaving behind a trail of pheromones like breadcrumbs pointing back home, and when they encounter food, they pick some up and follow the pheromone trail back to the nest. While carrying food they leave behind another type of pheromone that points the ants to the food source.
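    The rules really are that simple. As an illustration, here is a toy version of just the trail-following step, on a dict-based grid mapping cells to pheromone strength (my own sketch, not the simulator from the video):

    ```python
    import random

    def pick_step(pos, pheromone, rng=random):
        """Choose an ant's next cell: move toward the strongest
        pheromone among the four neighbours, or wander randomly
        if no trail is nearby."""
        x, y = pos
        neighbours = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        # Pair each neighbour with its pheromone strength (0 if unmarked).
        scented = [(pheromone.get(n, 0.0), n) for n in neighbours]
        best_strength, best_cell = max(scented)
        if best_strength > 0.0:
            return best_cell
        return rng.choice(neighbours)  # no trail: random walk
    ```

    Everything else, the depositing and evaporation of pheromones and the food pickup, layers on top of a step rule like this, and the colony-level behavior emerges on its own.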

    This seemed like an interesting exercise in vibe coding, where you just describe your need in your own words and let an AI do the actual programming for you. Vibe coding seems to split opinions sharply: some find that it makes coding more equal and available to anyone; others loathe the idea of an influx of AI-generated trash.

    Read More →