Charles Gutjahr

Melbourne, Australia

March 2026

A short opinion

Not a day for using AI

There are more important things to be said about Atlassian firing 10% of its staff today, but I want to note one small, concerning detail: AI seems to have been used to prepare Atlassian's memo to staff.

AI should not have been used here.

I realise that there are a variety of opinions about AI and what it is appropriate for, and that some people think using AI here is fine. We won't all agree on what the right level of AI use is but hopefully we can all agree that the level is neither never nor always. There are times when AI may be used, and times when it should not.

The question, then, is when should AI not be used? I think that sacking people is one such time. I am confident that Mike Cannon-Brookes genuinely cares about how Atlassian staff feel about the firings and wants to communicate the reasons honestly and sincerely to them. He, his Chief of Staff Amy Glancey, and other good people at Atlassian would surely have put a lot of work and thought into the memo given to staff and the media. There is a part of them in that memo.

And yet the memo comes across with the shallow and generic style of AI writing. It has the stylistic AI tells: unnecessarily short sentences, three-item lists, repetition (ironically on 'decisiveness'), and an impersonal passive voice. It also has the formatting AI tells: boldfaced individual words and increased em dash use. If you don't see these yourself, I suggest you read the memo from Atlassian's first big round of firings, published on 6 March 2023. It says mostly the same things, and yet it feels so much more human than the 2026 memo.

It matters that this memo feels like a machine generated it. Telling someone they are fired is one of the most human things we can do in business. I've had to fire people and so I know how difficult and emotional it is; if I have to do it then I want to show my staff that I take it seriously. I wouldn't risk an AI having any involvement in the process—not in the decision-making nor in the communication of the decision—because that feels cruel and inhuman.

And what do you get from running an AI over your writing? When firing thousands of staff you have already spent days or weeks fretting over what to say, so getting an AI to rewrite it saves precious little time. For that negligible benefit you scrub the humanity from your writing at precisely the time it is needed most. That is a terrible tradeoff that no-one should make; I wouldn't, and Atlassian shouldn't have.

For Atlassian I think it matters more than most. The tech industry is all-in on AI right now, investing historically unprecedented amounts of money into AI, and Atlassian is trying to position itself as a major player there. Meanwhile the general public is not buying it—in either sense of the word. The general sentiment towards AI is negative across a range of surveys, and though there are some AI products which are popular there is also a notable reluctance from people to actually pay for them. It's clear that some AI products have value and a place in our future, but it's not clear that people are willing to pay enough money to cover the immense cost of all this AI investment.

If the AI industry is to avoid collapse it should acknowledge the reality that it is not entirely welcome. It should show us that it understands and respects that AI can't be used everywhere, focussing on the things we accept while drawing strong red lines around the things where AI is simply unacceptable. There has been some recent debate over whether Anthropic's red lines against using AI for mass surveillance and fully autonomous weapons are appropriate, but in my opinion those lines sit far beyond where most people on this planet would actually draw them. People don't want to see AI replace all art and music. People don't want AI to have power over their health and safety. And people don't want to be fired by an AI. Atlassian and the wider tech industry should demonstrate that they understand AI must not be used everywhere, and that includes steering clear of AI when firing staff.

© 2025 Charles Gutjahr