As the GDPR go-live date approaches in late May, we’ve heard some buzz as companies examine their data to see whether they’ll meet the new EU requirements. Business is so globally interconnected now that most companies, even American ones, need to be ready for GDPR fallout, and fines for noncompliance will be stiff. For most companies, GDPR readiness comes down to two questions: who will manage compliance, and where those processes will live inside the business. A brute-force approach won’t work, because GDPR can’t be met with infrastructure changes or another layer of security alone. Since GDPR is about compliance, not simply security, it’s not as easy as putting it in the security team’s lap. The regulation also requires a dedicated data protection officer, so there’s people management involved, not just systems changes. And the categories of data GDPR covers touch most departments at an enterprise.
You’ll want to know which applications your company is using for GDPR compliance, and more generally which apps are in use across your organization, to avoid SaaS sprawl. SaaS sprawl grew out of the application sprawl of the pre-cloud days, and it’s all too easy now for IT teams to lose track of which departments are using which cloud applications. A couple of approaches can help. First, BI tools can give you the data to see which applications are actually helping the business’s bottom line. Multicloud management can also help prevent or cut down on SaaS sprawl and assist with data integration, and API expertise will come in handy there too. Finally, plenty of communication and preparation around cloud app deployments can reduce duplicate efforts and potential sprawl.
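As a rough illustration of the BI-style analysis described above, here is a minimal sketch of spotting overlapping SaaS tools. It assumes you can export app-usage records (say, from SSO logs or expense reports) and maintain a simple hand-built category map; the app names, departments and categories below are hypothetical placeholders, not a real inventory.

```python
from collections import defaultdict

# Hypothetical usage records, e.g. exported from SSO logs or expense reports.
records = [
    {"app": "Slack", "dept": "Engineering"},
    {"app": "Slack", "dept": "Marketing"},
    {"app": "HipChat", "dept": "Support"},    # a second, overlapping chat tool
    {"app": "Salesforce", "dept": "Sales"},
]

# Group departments by app to surface who is using what.
depts_by_app = defaultdict(set)
for r in records:
    depts_by_app[r["app"]].add(r["dept"])

# A hand-maintained category map stands in for real BI tooling here.
category = {"Slack": "chat", "HipChat": "chat", "Salesforce": "CRM"}
apps_by_category = defaultdict(set)
for app in depts_by_app:
    apps_by_category[category.get(app, "other")].add(app)

# Categories served by more than one app are candidates for consolidation.
duplicates = {c: apps for c, apps in apps_by_category.items() if len(apps) > 1}
print(duplicates)
```

Running this prints the "chat" category with both tools in it, flagging a spot where two departments may be paying for the same capability twice.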
There are so many clouds, and so many cool things to do with them. Of course, each public cloud you adopt will cost the business money. Managing these multicloud costs is now a big part of IT’s job, so either keeping them down or justifying them is essential. A recent TechTarget survey found that 42% of respondents ranked controlling cloud cost as their top IT ops priority. A few tips can head off costs up front. One is to treat each instance of an application in the cloud as its own app, then link the instances using a VPN. Automation and sound policies can also ease application integration issues in the cloud. Another preventive measure is to understand how your network providers charge for traffic, then evaluate your workflows to see how and where application traffic travels over those networks. You often get what you pay for in IT, so if using multicloud for redundancy will bring real business value, do the math on the costs and fight for that budget.
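To make the "do the math" step concrete, here is a minimal sketch of estimating monthly egress charges across two clouds. Everything in it is assumed for illustration: the per-GB rates are made-up placeholders (not real provider pricing), and the workload names and traffic volumes are hypothetical. The point is the shape of the calculation, not the numbers.

```python
# Illustrative placeholder rates, NOT real provider pricing.
egress_rate_per_gb = {"cloud_a": 0.09, "cloud_b": 0.08}

# Estimated monthly traffic (GB) each workload sends out of its cloud.
workloads = [
    {"name": "reporting", "cloud": "cloud_a", "gb_out": 500},
    {"name": "backup-sync", "cloud": "cloud_b", "gb_out": 2000},
]

def monthly_egress_cost(workloads, rates):
    """Sum per-workload egress charges: GB out times that cloud's per-GB rate."""
    return sum(w["gb_out"] * rates[w["cloud"]] for w in workloads)

total = monthly_egress_cost(workloads, egress_rate_per_gb)
print(f"${total:.2f}")  # 500*0.09 + 2000*0.08 -> prints "$205.00"
```

A spreadsheet does the same job, of course; the value is in gathering real rates and real traffic estimates per workload before the redundancy-versus-cost argument goes to budget review.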
Wherefore open source in this era of cloud? Open source software’s free-labor business model is no longer providing developers the rewards they used to get, namely a sense of community and satisfaction. Cloud providers can now easily find good open source code, incorporate it into their products and sell those products to customers. That raises questions about the obligations of big cloud providers to open source developers, and about the ramifications of open source becoming mainstream and monetized in this way. It’ll be interesting to see how this plays out. One possibility is that large tech companies will direct open source projects themselves, though that would make it harder for startups or individual developers to get airtime for their ideas. Another is to create a new kind of open source license that compensates developers, or a subscription service that pays them.