Hacker News

pards · yesterday at 11:20 AM · 28 replies

In my large enterprise world, AI adoption hasn't made it outside of the development teams - only developers have access to GitHub Copilot.

Code takes 6-12 months to make it from commit to production. Development speed was never the bottleneck; it's all the other processes that take time: infra provisioning, testing, sign-offs, change management, deployment scheduling etc.

AI makes these post-development bottlenecks worse. Changes are now piling up at the door waiting to get on a release train.

Large enterprises need to learn how to ship software faster if they want to lock in ROI on their token spend. Unshipped code is a liability, not an asset.


Replies

SlinkyOnStairs · yesterday at 12:13 PM

> Development speed was never the bottleneck; it's all the other processes that take time: infra provisioning, testing, sign-offs, change management, deployment scheduling etc.

So much of management (both middle and executive) still treats software as if it came off an assembly line: "We make software just like Ford makes cars." Code as a product.

Which isn't to say that most software development isn't woefully inefficient, but the important bits aren't even considered. "The Work" is seen as being writing code, not the research that goes into knowing what code has to be written.

And for AI marketing, this is almost a videogame-esque weak spot. Microsoft proclaims "50% faster code!" and every management fool thinks "50% faster product; 50% faster money!"

> Large enterprises need to learn how to ship software faster if they want to lock in ROI on their token spend.

It's going to be a disaster once ROI is demanded. Right now everyone is fine with not measuring it; Investors are drunk on hype and nobody within the company actually wants to admit that properly measuring software development productivity is almost impossible.

But the hype won't last forever. Sooner or later investors will see the "$2M spend" and demand "$4M net profit", and that's not going to materialize.

Copilot and Claude won't be tackling the real bottlenecks. They're not going to dredge up decade-old institutional knowledge, they won't figure out whether code looks bad because it is bad or because it solves a specific undocumented problem, and they won't anticipate future uses.

Code just isn't the product. It's not the real work. Really, if your codebase is in a healthy state, it's often practically a free output of the design and research processes. By the time you've refined "our procurement team finds the search hard to use" into a practical ticket, the React component for the appropriate search filters has basically already been written; typing out the code is just a short formality. Asking Copilot turns a 10-minute job into a 5-minute job. Real impressive, were it not for the 6 hours of meetings and phone calls that went into it.

embedding-shape · yesterday at 11:27 AM

> Large enterprises need to learn how to ship software faster

They haven't even learned that "less code is better" yet; I wouldn't hold my breath waiting for them to suddenly learn "more advanced" things like that before they master the basics.

_pdp_ · yesterday at 11:30 AM

Yep.

I would argue that any sufficiently large system reaches a point where more code is in fact the opposite of what it needs.

Nutrition and calories are only useful up to a point, and then we get diminishing and eventually negative returns.

Even though it's not the best analogy, because we are describing two different systems, it helps put a mental model around the fact that churning out more is often less.

Side note: I got feedback from a customer today that while our documentation is complete and very detailed, they find it too overwhelming. It turns out a few bullet points to get the idea across are better than a 5-page document. Now it seems obvious.

thisisit · yesterday at 3:52 PM

In my large enterprise world, AI adoption today seems to have taken a turn for the worse.

Finance folks reached out asking if they could vibe code their own app using Copilot/Cursor/Claude for financial planning purposes. And because they know my management freezes whenever there are whispers of "our CFO said so", they even paraded that reasoning: "our CFO 'tested' Lovable and he is convinced, and he's asking us to vibe code the app".

If that weren't enough, they ended with a nicely wrapped justification: "we need to try this to be sure that a vibe-coded app can exist in enterprise finance with appropriate data security and maintainability".

And mind you, this is the reasoning at a company with more than $20 billion in revenue.

nazcan · yesterday at 4:49 PM

It may not be the biggest bottleneck, but if you can ship in a similar amount of time while reducing the number of engineers by 30%, that's a huge win.

And having fewer people involved means much less communication and alignment overhead.

Not to say it's a panacea.

giancarlostoro · yesterday at 3:27 PM

> Unshipped code is a liability, not an asset.

"Hey we found this bug and-"

"We already found it with Claude; we're still waiting for our next release."

Or worse, it's a bug that doesn't exist in prod. Since the code keeps changing, you won't know about the bug until it's out there, because there's one niche user with a niche use scenario everyone forgot about or didn't even know existed, and he's going to somehow crash the entire system with your next deployment.

dgellow · yesterday at 1:28 PM

The Mythical Man-Month should really be mandatory reading for anyone working in software… and I don't mean reading a Claude summary.

mattmcknight · yesterday at 12:13 PM

"release train" ... "learn how to ship software faster"

SAFe is poison.

kj4211cash · yesterday at 12:34 PM

We have a "two timelines" approach going on, and I'm curious if others are seeing the same. There are the official "Engineering-supported" services, where development speed is not the bottleneck: engineers demand clean requirements that take forever to show up, and testing and deployment scheduling also take forever post-development. Important people are so fed up that they've started hiring people to vibe code and develop services without going through Engineering. Code ships much faster there, but technical debt accumulates rapidly. The important people are beginning to hire data scientists who sit outside of the tech org to manage the AI code. It's all very interesting.

Mashimo · yesterday at 1:48 PM

Same here, but instead of the developers having access to GitHub Copilot, a select few devs have access to an internal proxy that goes to Amazon Bedrock, where we get "400 requests" per week to Claude Sonnet :))))

ge96 · yesterday at 4:20 PM

> Code takes 6-12 months to make it from commit to production.

That seems wild, niche/highly specialized field?

dev360 · yesterday at 12:56 PM

It's trickling in slowly to non-dev teams. I'm consulting with a large fintech on enterprise-wide AI adoption at the moment, and I'm seeing the same parallels: you have power users who reap disproportionate rewards from it, and then you have the "tab complete" crowd that copy-pastes things into the prompt.

This was a huge motivation behind me trying to design an AI automation platform that comes "batteries included". I also think a lot of orgs, even engineering orgs, don't know how to configure basic things like Claude plugin repositories into their installs.

agloe_dreams · yesterday at 6:03 PM

Meanwhile in the private equity world, they have realized that code is "piling up from 10Xing everyone's performance", and they have solved it by firing all of QA, focusing only on speed to prod, and killing sign-offs, scheduling, and code review. We are probably going to bankrupt ourselves with an idiotic mistake somewhere in here, but nobody will ever know until it happens. Don't take those gates for granted.

chrisss395 · yesterday at 12:34 PM

It's good to know your experience mirrors mine. Developers are moving faster, but the rest of the organization is holding them back because processes and decisions still rely on other parts of the org. Has anyone else observed the same?

Organizations "born in AI" appear to buck this trend for obvious reasons (no legacy org to deal with). My two cents.

wildrhythms · yesterday at 1:52 PM

But then how will all of the know-nothing management types get their fingers in the pie?

razodactyl · yesterday at 11:50 AM

Especially when it waits a month and all the effort is either irrelevant or incompatible with latest changes that finally got through. So much token wastage to top off the recent chaos. Hopefully it improves just as fast as it materialised.

TrackerFF · yesterday at 11:43 AM

Which is why there's currently a gold rush of "Enterprise AI" startups which implement / offer agents to enterprise businesses.

butlike · yesterday at 2:20 PM

Unshipped code can't break. How is it a liability and not an asset? The money maker is making money and the changes that potentially would interfere with that are held up at the gate. Seems like a good thing from a business perspective.

atoav · today at 5:44 AM

I don't know who needs to hear this, but: shipped code is a liability too.

Shipped code can fail. Shipped code can corrupt or delete production data. Shipped code can break your customers' trust. Shipped code can lead to literal court cases. Shipped code can force hasty fixes and all-nighters.

Some people can afford to YOLO their code; others can't. Maybe having programmed expensive hardware that punches a hole in a brick wall, burns up in a cloud of smoke, or hurts people if you make a mistake is why I know this.

The LLM won't go to jail for certain things. The one who signed off on the code might.

perarneng · yesterday at 8:48 PM

This will eventually bubble up and get exposed and then things will start to roll fast.

ericmcer · yesterday at 5:41 PM

That's enterprise tech also?

I wonder what adoption is like at older non-tech companies.

The office next to mine is being used to teach a bunch of 20-30 year olds how to be insurance brokers using a powerpoint presentation. Copy paste the presentation into an LLM and you just replaced them all. It feels like... things might be kind of dark in 10-20 years if we just keep barreling down this road.

__loam · yesterday at 10:04 PM

Every line of code is a liability

kakacik · yesterday at 1:55 PM

Do you work in my company? :)

I've kept saying this since day 1 of LLMs: even a 99% reduction in development time means almost nothing for the speed of delivery of whole projects in our company. And we are introducing a generator of code that semi-randomly performs poorly when there are perf bottlenecks, and that fills the codebase with... sometimes questionable solutions. Sure, one has to check the results all the time, but then the time is spent on code reviews instead, not much less than actual (way more fulfilling, rewarding, and career-boosting) development.

Now, I understand there are many more scenarios where gains are more realistic and sometimes huge, but it certainly ain't my current workplace. So I use it sparingly so as not to atrophy my skillset, but work estimates are so far the same, and nobody questions that.

themafia · yesterday at 6:55 PM

> lock in ROI

On code you cannot possibly copyright. Yea they're all on the verge of "Locking in."

reactordev · yesterday at 1:27 PM

Sounds like the typical ServiceNow paralysis. The “Mother May I” model.

nonameiguess · yesterday at 9:23 PM

There are so many elements to this. I've worked in nearly every part of software orgs. Development, ops, professional services, pre-sales. There are bottlenecks everywhere. Faster shipping gets you nothing if your customers' procurement budgets don't increase and they're not buying anything. You can wow new customers and lure them in with shit you get out the door super quick that appears to work but falls flat after six months of usage, but you damage your reputation in the long run. So how do you guarantee your software will actually work after six months of usage? You have to test it by running it yourself for six months. There is no other way. No automated suite can exercise every single possible customer use case over a long period of time. It's a combinatorial problem.

Just yesterday I was in a meeting with a customer asking if we could make our FOSS virtualization platform work such that if you yank the root disk out of a server and put it in another one, everything will work with no hiccups. Well, provided it's exactly the same model and you're going to put it on the same network with all the same IP assignments, you've got a shot. I've actually tried to do this before for the hell of it and I only needed to account for the MAC addresses of the NICs being different, as long as you have no other drives and everything else is exactly the same. I'm sure I could whip up something that scans for the predictable interface name and changes the old MAC stored in the NetworkManager configuration files (and wherever else they might happen to be) and change them to the newly discovered one before making a DHCP request, and maybe that will work, but how certain can I really be? I can test on servers I have and I don't have every possible combination of data center equipment all of our customers have. There is no feasible way to test every possibility. Having an LLM whip up the code for me instead of writing it myself doesn't change that.
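A minimal sketch of the migration idea described above: the `[ethernet]`/`mac-address=` key is NetworkManager's keyfile convention for a pinned MAC, but the helper function and sample keyfile text here are hypothetical illustrations, not the actual script the commenter describes.

```python
import re

def rewrite_stored_mac(keyfile_text: str, new_mac: str) -> str:
    """Replace any mac-address= value in a NetworkManager-style
    keyfile with the MAC discovered on the replacement NIC.

    Hypothetical helper: a real migration would read the live MAC
    from /sys/class/net/<iface>/address, rewrite each file under
    /etc/NetworkManager/system-connections/, then restart NM.
    """
    return re.sub(
        r"(?m)^(mac-address=)[0-9A-Fa-f:]{17}$",
        lambda m: m.group(1) + new_mac.upper(),
        keyfile_text,
    )

# Sample keyfile fragment with the old server's MAC baked in.
old = "[ethernet]\nmac-address=AA:BB:CC:DD:EE:01\n\n[ipv4]\nmethod=auto\n"
print(rewrite_stored_mac(old, "aa:bb:cc:dd:ee:99"))
```

Even with the stored MAC patched, the commenter's caveat stands: this only helps when everything else (model, network, IP assignments) is identical, and nothing short of testing on the real hardware proves it.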

Ironically enough, that customer is making software for another customer, and their own requirement is that it has to run on very specific hardware on an airplane, which they don't have. So they're working on little NUC clusters in their cubes and at their houses instead, because their company doesn't have extra true server racks for them to use and no budget to acquire them, which probably won't change any time soon given the spike in hardware prices. They're all using AI, but what good is it doing? They're spinning their wheels because they're targeting a runtime environment that doesn't exist that they can't test on.

It's a weird folly of the Internet age that the largest companies in the software world are all web companies. Mostly, they're media companies in disguise. Their only real product is human attention and they sell it to advertisers. Tech is just the vehicle that allows them to deliver it. We've valorized their "ship as fast as possible" ethic, which maybe matters, maybe doesn't, but it was never the source of their value. Nobody spends ad money on Facebook and Google because of the quality or delivery speed of their software. It's the human users and data they've captured, which to be clear, software plays a huge role in, but it's not a model all software companies can follow. We don't earn revenue from half braindead doomscrollers wasting most of their day with a background drip of vaguely dopamine-boosting noise blasting into their senses while they leak every fact about their lives to media companies. Our customers have to make intentional decisions to spend money out of finite budgets.

There's another story on the frontpage right now of Coinbase laying off a bunch of its employees and using AI to write more code. Okay, great, but the best that can do is reduce labor expense. They only earn more revenue if consumers decide they want to buy more Crypto and hold it in Coinbase. If Coinbase is using AI to write their software, so is everyone else, so that doesn't give them any kind of edge on quality or shipping speed. Their success is going to be determined overwhelmingly by whether or not people want to buy crypto, a broad market trend completely out of their control. No one in any business ever wants to admit this, but we're all at the mercy of these broader trends.

People are all over this thread citing Ford. Ford didn't decline because they couldn't ship fast enough. They declined because the market stopped wanting what they were making except their full-size pickups, and it's largely just Americans that want that. I don't blame them or think they did anything wrong exactly. People love to do these post-mortems contemplating a world in which someone like Ford accurately predicts every single shift in consumer sentiment that will ever happen and always stays ahead of the curve. It'll never happen. Everything that goes into style eventually goes out of style, and your ability to ship out-of-style shit faster won't help you.

You said you work for a bank and I'm honestly curious. What causes a customer to choose your bank over another? Do you think it has anything to do with software features? I'm lucky I even got a meeting with the customer I was with yesterday. He told me he loves our product and fought hard for it over a chief architect who wanted something else and made them do a long comparison study to prove our product met their needs better. Why did that chief architect prefer the other product? He plays golf with their CTO.

redsocksfan45 · yesterday at 1:38 PM

[dead]