Security in any corporation is a long, drawn-out balancing act. On one side, you have the security teams, who quite reasonably want to prevent anything unpleasant from happening to the company. They aim for a least-privilege model, where individuals have only the access required to do their job and nothing more.
On the other side, you have developers and users trying to get stuff done. They want flexibility, the freedom to build, test, and promote as fast as possible. Their focus is on outcomes, whether that is generating revenue, delivering features, or improving performance.
Taken to extremes, neither approach works.
If you enforce only the bare minimum access across the board, you cripple progress. Teams slow down, and the company begins to feel rigid and bureaucratic. At the other extreme, leaving everything open in the name of speed is simply inviting trouble, particularly in a world where security threats are constant and evolving. The long-term risks of that approach are difficult to nail down but often expensive to recover from.
So there has to be balance. In theory, this is where corporate standards come in. A need is justified from a business perspective, weighed against security and legal requirements, and a standard is formed that reflects that compromise.
In practice, however, the phrase “corporate standards” is often used to justify decisions that are, in reality, arbitrary.
To someone trying to deliver something new, it can sound like a closed door. It implies that decisions have already been made, that they should not be questioned, and that the reasoning behind them is either unavailable or not open for discussion. When that happens, people do what people always do: they work around it.
Developers, in particular, tend to be driven and resourceful. If they are blocked without explanation, they will look for alternative routes to deliver what they are being paid for. That is how shadow IT appears, not through malice, but through frustration and a desire to deliver.
At the same time, it is not realistic to expose every detail behind a corporate standard. Some of those decisions are based on known vulnerabilities, internal limitations, or security considerations that should not be widely shared.
This is where a more pragmatic approach helps.
When denying a request or limiting access, it is often worth providing a brief, sensible explanation. Not a full breakdown of every risk, but enough context to show that the decision is considered and grounded in real constraints.
This does a few useful things. It treats the other party as a professional rather than an obstacle. It demonstrates that the standard is understood, not just followed blindly. And importantly, it gives people something to work with.
With even a small amount of context, a developer or business owner can frame a proper justification for change. They can ask for something specific rather than excessive. In many cases, they will request far less access than they initially thought they needed, simply because they understand the boundaries they are working with.
Corporate standards should not be a brick wall. They should be a guide that reflects the current balance between risk and progress. And like any balance, they need to be visible enough for people to work with, even if not every detail can be shared.
Get that right, and you deliver to the business without weakening security. Get it wrong, and people will find their own way around it anyway.