
This Isn’t About AI — It’s About Who Gets to Speak


The Moment We’re In

This is not a reactionary post. It is a statement of philosophy from the founder of High Noon about the direction of the industry we are building in.


The conversation around AI in creative industries has escalated quickly.


Policies are being announced. Lines are being drawn. Standards are being declared.


And increasingly, consequences are being delivered not for actions, but for opinions.


Recently, Ryan Dancey — former executive at Alderac Entertainment Group and a figure instrumental in shaping modern tabletop publishing — found himself out of work after posting his perspective on AI.


You may agree with him.

You may strongly disagree.


That is not the point.


The point is this:

We are now in a moment where expressing an analysis of technological change can carry professional consequences, even when that analysis is framed as prediction rather than policy.

That should concern everyone in this industry.


Not because disagreement is dangerous.


But because fear-driven silence is.


When conversations about emerging tools become socially policed instead of philosophically examined, industries do not grow stronger; they grow narrower.


And narrow industries do not nurture new creators. They gate them.


This is the landscape we are operating in.


And that is why this blog exists.


High Noon’s Flag

Before I defend anyone else’s right to think or build, I want to make my own position clear, and that of High Noon Game, Inc.


Philosophically, I believe AI is a tool.


No more moral than a drill.

No more mystical than a lathe.

No more useful in unskilled hands than any other instrument of craft.


Tools amplify ability.

They do not replace judgment.


For High Noon, our stance is simple:

AI is welcome in exploration, experimentation, and inspiration. But our consumer-facing products, the games people purchase and bring to their tables, will always be 100% human-created.

We will not replace human artists with AI for final paid work. Ever.


That is a boundary I can afford to draw.


And because I have drawn it publicly, I am accountable to it.


That is my line.


It may not be yours.


The Philosophical Error

The mistake being made right now is not fundamentally about AI. It is about enforcement.


There is a growing instinct in our industry to convert personal conviction into universal mandate and to socially punish those who do not immediately conform. That shift, more than the technology itself, is what concerns me.


A boundary is a line you draw for yourself. A barricade is a line you build to control others. Those are not the same thing, and confusing them leads to unnecessary fracture.


When I say, “High Noon will not use AI for final paid art,” I am drawing a boundary. I am defining the identity of my company. I am making a commitment to my customers. I am choosing the standard I will live under.


When someone says, “No one should be allowed to use AI or they are unethical,” that is no longer a boundary. That becomes a barricade. And barricades do not simply define identity; they enforce conformity.


Industries do not mature by erecting barricades around tools. They mature by developing coherent principles about how tools are used, disclosed, and integrated responsibly.


AI is not a moral agent. It is not a person. It is not an ideology. It is a tool, an amplifier of capacity. Like any amplifier, it can magnify skill or magnify incompetence. The ethical questions surrounding it are not about its existence but about intent, representation, and harm.


If a creator publicly promises one thing and secretly does another, that is a problem. If a creator misrepresents their work to their customers, that is a problem. If a company hides process in a way that violates trust, that is a problem.


But if a creator uses a tool within the boundaries of the law and produces a product that customers willingly evaluate, purchase, and enjoy, then we are not dealing with a moral transgression. We are dealing with a market decision.


Right now, we are conflating discomfort with wrongdoing.


And that conflation is dangerous.


Because once discomfort becomes grounds for professional destruction, innovation slows. Conversation narrows. Fear replaces inquiry. People stop asking questions not because they have clarity, but because they have risk.


An industry governed by fear is not principled. It is fragile.


And fragility is not strength.


Scale, Privilege, and Context

When major publishers speak, people listen.


Stonemaier Games published a clear and categorical headline in April 2024: “Generative AI? Not for Us!” 


Catalyst Game Labs issued its own firm policy rejecting generative AI in final products.


These statements are now being cited across forums and social media as case studies. As models. As proof that “the industry” has spoken.


Context matters here.


Stonemaier Games operates at a scale where paying for hundreds of commissioned illustrations per title is not just aspirational; it is standard operating procedure. In the same post, Jamey Stegmaier openly acknowledges this reality, writing about the privilege of being able to fund large volumes of original art and contrasting that with a first-time designer self-publishing on a shoestring budget.


That acknowledgment is important. It is honest.

But the landscape has shifted since April 2024.


AI capabilities have accelerated. The cultural temperature has risen. And the “gray areas” he referenced (prototype art, economic constraints, value propositions for smaller creators) have largely disappeared from the public conversation.


Instead, what remains amplified is the categorical stance.


Meanwhile, Catalyst Game Labs has raised multimillion-dollar crowdfunding campaigns on the strength of an established intellectual property with decades of brand recognition. That scale provides margin. That margin provides insulation. And insulation allows for principled choices that may not be available to someone operating out of a garage with a $7,000 campaign goal, two jobs, three kids, and a mortgage.


This is not criticism. It is structural reality.


There is a difference between drawing a line when you are financially insulated and drawing one when you are financially exposed.


When large publishers establish standards, those standards inevitably become social signals. And social signals, in heated climates, become informal requirements.


An indie creator scraping together $5,000 for a first print run does not operate in the same economic universe as a publisher managing eight-figure revenue cycles.


To pretend otherwise is to flatten reality.


And flattened reality is where good intentions begin to harm the very creators they claim to protect.


Where I Stand in This

High Noon’s first crowdfunding campaign raised roughly $120,000. Our follow-up expansion campaign raised another $40,000. In the indie space, that is considered a success.


I am grateful for it.


But let’s keep that number in perspective.


That is not a war chest.

That is not insulation.

That is not a marketing machine.


That is careful budgeting, disciplined spending, and an enormous amount of personal sacrifice.


We operate lean.

We operate small.

We personally attend conventions whenever we can afford the time and travel.

We rely on grassroots marketing, direct relationships, and a fiercely loyal community that believes in what we are building.


My family is involved. My friends are involved. Our supporters are deeply involved.


We are not a garage startup anymore, but we are still a skeleton crew.


And that scale matters.


Because the decisions we make about tools, hiring, art budgets, marketing spend, and risk tolerance are deeply personal. They are not abstract philosophical exercises. They are tied directly to whether the lights stay on.


For High Noon, I have drawn a line: our final paid products will remain 100% human-created.

That line is not free.

It means slower growth.

It means tighter margins.

It means saying no to efficiencies I know will become standard across the industry within the next five years.


I make that choice because I can... barely.


But I will not pretend that every creator is operating with even that margin.


For someone launching a first game on less than the cost of a minimum order quantity (MOQ) print run, with a day job and a mortgage, the calculus looks different.


And pretending it doesn’t is not principled; it is detached.


The Conversation Is the Frontier

For my entire online life, my north star has been conversation. I don’t mean performative “dialogue” where everyone agrees and calls it unity. I mean real conversation, the kind that risks disagreement, forces clarity, and treats people like adults who can hear something they don’t like without reaching for a torch.


That instinct is older than High Noon. It’s how I’ve always operated. Sometimes I lead with a hot take, sometimes I lead with a question, but the goal is always the same:

open the floor, let ideas collide, and let truth survive the collision.

I believe in the marketplace of ideas, not because it’s always polite, but because it’s the only mechanism I’ve ever seen that reliably separates strong arguments from weak ones over time.


And that’s why the AI debate concerns me for reasons that go beyond art pipelines.


Because what I’m watching in parts of this industry isn’t a debate about tools. It’s a tightening of speech. It’s an environment where asking a question is treated like an admission of guilt. Where curiosity is framed as moral failure. Where creators, especially small creators, learn very quickly that the safest move is not to think out loud.


That is not a mature ecosystem. It’s a brittle one.


The Ryan Dancey situation is a flashing red warning light. Whether you agree with him or not is secondary. The signal is what matters: a respected veteran expressed a viewpoint on a fast-moving technology, and the consequence was professional exile. That doesn’t just punish one man; it teaches everyone else what the new rule is:

don’t talk about this unless you already agree with the loudest voices.

That’s the barricade.


Not AI itself.


The barricade is the shrinking permission to speak.


And here’s why that matters:

AI is coming either way.

The march of events doesn’t pause for our comfort. Within a few years, the tools will be good enough that many people won’t reliably be able to tell what is AI-assisted and what isn’t unless it’s disclosed. The technology will become normal, because that is what technology does when it works.


So if we cannot discuss it openly now, while norms are still forming, then we are surrendering the future of the industry to fear, faction, and enforcement. We will not “protect creativity” that way. We will protect gatekeeping.


And gatekeeping doesn’t just keep bad actors out. It keeps fragile newcomers out. It keeps poor creators out. It keeps unconventional thinkers out. It keeps honest question-askers out.


That is how an industry narrows itself into stagnation.


I’m not writing this to demand that everyone adopt my boundary. I’ve already stated mine:

High Noon’s final paid products will remain 100% human-created.

But I am writing this to defend the space where people can disagree about tools without being treated like heretics and to defend small creators who don’t have the luxury of making their decisions inside an ivory tower.


If we want an industry that survives what’s coming, we have to restore something basic: the right to talk.


What I Refuse to Participate In

For years, the unspoken rule in parts of this industry was simple: keep your head down. Make your games. Don’t bring your personality into it. Don’t talk about religion. Don’t talk about politics. Don’t make waves.


And many people have succeeded that way.

Be a gray rock. Be neutral. Stay safe.

I learned early that speaking openly in this industry carries real consequences. When I spoke up in defense of open tables and open conversation, I paid for it. And I accepted that, because I believed then, and still believe now, that conversation is healthier than enforced silence.


But something has shifted.


What began as “don’t bring unrelated controversy into board games” has quietly evolved into something broader:

Don’t question the consensus.

Don’t challenge the temperature of the room. Don’t ask uncomfortable questions about tools, economics, or future norms.


Now we are at a point where creators can be professionally punished not for misconduct, but for expressing an opinion about a technology that will materially affect how games are made.


That is not a small shift.


It means the silence is no longer about protecting the table from outside politics. It’s about protecting the table from internal debate.


And that is where I draw a line.


I refuse to participate in an environment where survival questions are treated as moral infractions.


I refuse to shame a small creator for exploring tools that might make the difference between launching a game and shelving it indefinitely.


I refuse to equate insulation with virtue.


And I refuse to accept that the only safe posture in a creative industry is quiet compliance.


If we cannot discuss the economic realities of making games, if we cannot ask how new tools affect accessibility, cost structures, and creative opportunity, then we are no longer defending art.


We are defending comfort.


A Call to Builders

This is not a call to abandon standards.

It is a call to build them in the open.


If you are a large publisher, hold your line. Just remember that your line was drawn from a position of insulation. Leave room for those still climbing.


If you are an indie creator, do not let fear silence honest questions. Explore responsibly. Disclose when necessary. Draw your own boundaries. But do not surrender your voice.


If you are a customer, reward integrity. Support craftsmanship. Recognize that the industry you love is built by creators operating at wildly different scales and under wildly different constraints.


Technology will evolve. Tools will improve. Positions will shift. That is inevitable.


What must not shift is our willingness to talk to one another.


An industry that can debate its future openly will survive it.

One that cannot will fracture under its own fear.


High Noon has drawn its line.

We believe in human artists. We believe in paying them well. We believe in craftsmanship and accountability.

And we also believe in conversation.


If you’ve read this far, I’m not interested in outrage. I’m interested in clarity.

So I’ll ask you directly:


  • Can an industry mature if certain topics are too dangerous to discuss?

  • Is there a difference between personal standards and enforceable barricades?

  • How should scale and economic reality factor into moral expectations?

  • What does responsible integration of AI look like five years from now?

  • And most importantly: how do we protect both human artists and fragile creators at the same time?


I don’t expect consensus.


I expect conversation.


Leave your thoughts below. Let’s build something stronger than silence.


The frontier has never belonged to the silent. It belongs to the builders, the ones willing to ask hard questions, plant their flags honestly, and ride into uncertain territory together.

 

 

©2021 by High Noon Game, Inc.