Meet our 2025 speakers: Jess & Eda on Forgetting Machines: AI Coding Tools and Skill
We are welcoming Jess Rose and Eda Eren to talk about "Forgetting Machines: AI Coding Tools and Skill" at FFConf 2025.
We also wanted to thank both Jess and Eda for taking the time to answer our questions so we can all get to know them a little better before 14th November.
About Jess & Eda and their talk

- Title: Forgetting Machines: AI Coding Tools and Skill
- About the talk: AI coding tools are getting sold as an inevitable future, but using them diminishes the processes that drive learning for developers. What's this doing to individual skills and our industry as a whole? Is it hopeless? What can we do to fix this?
- Jess' origin story: I woke up one day and realized that programming languages are like human languages, but with jobs that paid better and wouldn't make me get a PhD, so here I am.
- Eda's origin story: I had an epiphany once that technology is not supposed to be a curse, and programming can be seen as an art form, not just pure science. (Also, that there's always lots to learn, so that's what I decided to do!)
The warm up questions
What's the most gloriously unproductive rabbit hole you've ever gone down in the name of "learning more"?
Jess:
I once spent every free hour for more than a month learning about traditional sago production methods in Eastern Indonesia, because the idea that people were creative enough to make tree pulp into food was inescapably fascinating to me.
Eda:
Oh, that happens a lot! I don't remember a particular case because there are so many.
Usually what happens is that when I list every single resource about something I want to learn more about, I move on to each one's own list of resources, then the next one after that, and so on, ending up with thousands of bookmarks… that are not necessarily about the very first thing I was trying to learn more about.
When tech starts to feel like a curse again, what do you do to lift the spell?
Jess:
Teach! It's so hard to stay pessimistic when I'm confronted with the abundance of hope and enthusiasm learners bring to working with the web.
Eda:
It seems that nowadays tech feels like a curse even more, but in moments like these, I try to remember that there are still a lot of people building amazing things, things that help human flourishing, and that, in the end, tech is pretty great if you let it be and make use of it wisely.
I also try to remember that there is still so much to discover and potential to make good things.
About the work and the talk
You've framed your talk around "forgetting machines." What do you think developers are most at risk of forgetting first - and what might we never get back if we lean too hard on AI?
Jess:
What I'm most worried about is that I see many of my learners and peers jump right to asking an AI for help or support before stopping to consider the question themselves. Losing this processing step risks turning a creative problem-solving process into an exercise in gluing together unconsidered components.
Eda:
I think what happens with relying too much on AI tools is that we might lose a certain kind of agency (maybe an ironic word to use?) over what we control and understand. Or, in other words, we might lose the intentionality that goes into building something and the joy of figuring things out for ourselves at the end of a long frustration. I think there is value in that.
There's also the idea of developing the right kinds of instincts to solve problems that turn out to be more complex than we might have foreseen, so there's a risk of losing that skill too.
Do you see parallels with other moments in computing history where tools seemed to erode skills, but actually changed the industry for the better?
Jess:
The parallel I see most often isn't actually from programming, but from weaving (which I might argue is a physical form of programming). The Luddites protested more efficient textile automation, arguing that it would undercut the labour power of skilled workers.
The textile automation stuck around and did result in increased efficiency, while also lowering the pay and worsening the working conditions of textile workers, just as the Luddites had feared.
This may sound like I'm championing a rejection of automation, but that episode and our AI-hype-filled present both make me wary: productivity improvements that concentrate power in the hands of capital aren't improving an industry if those improvements come at a direct cost to the workers who drive its value.
Eda:
I think when higher-level languages first started to be used, people who were working closer to the machine might have thought that there was a loss of skills with these new abstractions, but it turned out not to be the case, and I think the new abstractions made programming slightly more accessible to newcomers.
There is some debate about whether that's the case with AI coding tools right now, but I don't think they introduce a new abstraction layer, so it doesn't really mirror that situation.
Also, maybe not directly related to computing, but I guess Socrates was said to dislike writing because he thought it would make memorizing obsolete, and people would stop thinking if they wrote things down! Well, I would say that the opposite happened.
What's something you've unlearned - willingly or not - because a tool started doing it for you?
Jess:
Oh gosh, that's such a good question. The dangerous thing is, I wouldn't really know! The slow erosion of skills and abstraction of automation means that the impacts on us are often invisible until the skills are called upon again.
Eda:
Navigation is the number one thing that comes to mind. "Finding your way" is always tough (not just metaphorically!), but I think before relying so much on navigation apps, I used to pay more attention and take more notice of landmarks; now I think the same level of attention isn't there.
Find out more about Jess online on their website, BlueSky and Mastodon.
You can see more of Eda and their work on their website and BlueSky.
Join us in November to see their talk: 2025.ffconf.org