The dynamic type in C# provides flexibility that most statically typed languages do not offer. Since its introduction in C# 4.0 (.NET Framework 4.0), we have worked with customers who wanted to know more about how dynamic types are affected by the obfuscation process.
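To make the issue concrete, here is a minimal sketch of why dynamic matters to a renaming obfuscator. The `Report` class and `Describe` method are hypothetical names invented for illustration, not from any customer code; the point is that dynamic member access is resolved by name at runtime, so a member that was renamed at obfuscation time can no longer be found.

```csharp
using System;

public class Report
{
    // Hypothetical member; a renaming obfuscator might turn
    // "Describe" into something like "a" in the shipped assembly.
    public string Describe() => "quarterly report";
}

public static class Program
{
    public static void Main()
    {
        // Statically typed call: the compiler binds Report.Describe at
        // build time, so an obfuscator can rename the method and this
        // call site together without breaking anything.
        Report r = new Report();
        Console.WriteLine(r.Describe());

        // Dynamic call: the member is looked up *by name* at runtime.
        // If the obfuscator renames Describe, this lookup fails at
        // runtime with a RuntimeBinderException.
        dynamic d = new Report();
        Console.WriteLine(d.Describe());
    }
}
```

This is why dynamically accessed members typically need to be excluded from renaming, just as reflection-accessed members do.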
Malicious actors — like any thieves — live by a simple rule: If the front door is locked, break the window.
It’s why threats like fileless malware and cryptojacking have seen substantial gains over the last few years. It’s why, despite increasing employee education and IT training, hackers are still hooking phish by developing more sophisticated and authentic-looking email spoofs. Cybercriminal communities, meanwhile, continue to grow on the dark web, allowing attackers to share information, purchase exploit kits, and identify potential targets.
What does this mean for CISOs? That typical defense efforts are being outpaced as familiar attack vectors are replaced with non-traditional threats. But it’s not all bad news; here are three questions every CISO needs to ask to help close the doors, bolt the windows and leave hackers out in the cold.
Gabriel, you have been in the security industry for over two decades and have seen many different tools and services. Why build a company around something as specific as obfuscation and in-app protection?
Our customers build a lot of really innovative apps that enable their users and customers to do new and cool things. These apps frequently run on untrusted client computers and devices, and they control access to customers’ sensitive data or critical devices.
And after all the effort of designing, building, debugging, and deploying their applications, the last thing they want is for an attacker to steal their work or use it to look for vulnerabilities to break into their system.
Gartner calls In-App Protection “crucial” in its July 2019 Market Guide for In-App Protection. The guide’s summary advises security and risk management leaders to “take due care in protecting their application clients” in order to avoid “security failure.”
This raises the question: what constitutes “due care?” Obviously, no development organization sets out to recklessly expose its applications or sensitive data to attack or compromise. On the other hand, over-engineered (or poorly engineered) security controls can quickly lead to excessive development costs, performance and quality issues, and, ultimately, unacceptable user experiences. While terminology may vary, there is broad consensus on how best to define “due care” for any given application and user scenario.
PreEmptive has partnered with Microsoft for 16 years, and we’ve been involved with .NET since before version 1.0 shipped. From that perspective, it’s hard for us to pinpoint a time when the energy and enthusiasm in the .NET community have been higher than they are now. Over the past few years, Microsoft has put together the annual .NET Conf, a fully virtual event streamed live over three days, with viewers, participants, and local events all over the world.
The big news at this year's .NET Conf was the launch of .NET Core 3, the next step in the continued evolution of .NET. .NET Core 3 is a huge effort, with many moving parts, and Microsoft announced many related releases to support it. Highlights for us included:
Search for lockpicking and you’ll see that there’s no shortage of suppliers ready to serve locksmiths and hobbyists, each community having a perfectly legitimate need. Is there any reason to believe that burglars don’t shop the same sites?
Hackers are developers, and they have a long history of enthusiastically embracing and adapting development (and DevOps) innovations to speed their work, extend their reach, and ship software; the only difference is that they’re more likely to point those development tools and platforms at YOUR software rather than their own. Static code analysis tools, debuggers, and even public bug tracking databases are all go-to hacker resources.
Can you tell the difference? Is this the exception or the norm?
Of course, everyone is “for security” in principle. The hard question each organization has to answer for itself is “how much is enough?” Over-engineering is, by definition, excessive, and over-engineered application security can be devastating: overly complex algorithms, architectures, and processes can compromise user experience, degrade performance, and slow development velocity. On the other hand, punishment is swift for organizations that cut corners and fail to effectively secure their applications, their data, and, most importantly, their users and business stakeholders. Finding and maintaining that balance can be time-consuming and, because you can never be sure you’ve gotten it exactly right, it can also be a thankless job.