
Closing code won't save you

Security expert James Turnbull explains why equating safety with hiding your source code is a fallacy.

How can exposing source code be good for security? Doesn't it follow that if there's less information available to attackers, then apps and systems should be safer?

What you are describing is known in the security world as 'security through obscurity'. The principle is one that many government and intelligence organizations operate by, the NSA being a prime example of that mindset: if no one knows about your potential vulnerabilities or weaknesses, then you are safer.

What concerns me most about 'security through obscurity' is that very assumption -- that no one knows about your vulnerabilities or weaknesses. I think this assumption is a fallacy. First, reverse engineering code has become easier and easier, with automated tools available that perform the work for you. Second, a large percentage of attacks and compromises are launched internally by employees, who are often privy to large amounts of information about an application or system. Closing the source code does not protect you from these internal perpetrators.
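To illustrate how little protection a compiled artifact offers, here is a minimal sketch (my own toy example, not from the article) using Python's standard `dis` module. The "secret" comparison never needs to ship as source code for an attacker to recover it -- the constant and the comparison logic sit in plain view inside the compiled code object:

```python
import dis

# A toy "closed-source" check: imagine the author ships only the
# compiled bytecode of this function, never the source file.
def check_password(attempt):
    return attempt == "s3cret"  # hard-coded credential (hypothetical)

# Anyone holding the compiled artifact can pull constants straight
# out of the code object -- no source required.
print(check_password.__code__.co_consts)  # the string "s3cret" appears here

# And the comparison logic itself is readable in the disassembly.
dis.dis(check_password)
```

Dedicated decompilers for native binaries do the equivalent for C or C++ programs; the point is simply that withholding source raises the effort bar slightly, it does not hide the logic.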

Finally, crackers have proven more than adequate at compromising closed operating systems and applications (Microsoft Windows, for example). Therefore, I don't think obscuring source code provides any greater security.

