As the technological art of the possible expands, so must the canvas of ethics
In 2020—and beyond—when customers do not approve of a business's use of technology and data, they will vote with their data and their wallets.
Just because we can, should we?
This is a question that businesses (and governments) the world over should ask themselves when it comes to how they use technology. Without a doubt, the potential for the good offered by technology is huge—from helping optimize the production and distribution of agricultural produce, to discovering new medical treatments, and even spreading the benefits of better educational opportunities.
The potential for questionable applications of technology is equally fertile. We have already seen the emergence of deepfakes, automated and "as a service" cybercrime, and alleged interference with democratic processes.
At an enterprise level, problems start with attitudes toward the use of data and analytics technologies. This is both a technology and a cultural problem, and it requires investment in technology to secure and govern data access and comply with regulatory standards. But this is just the beginning: simply complying with relevant law is not sufficient, because stakeholders' trust in a business's handling of data will become a key factor in its success. Public awareness of the volume of data collected—and what is done with that data—is growing, powered by news of breaches and questionable practices. As a result, customers will vote with their data and their wallets in 2020 and beyond.
But customers are not the only stakeholders; employees are placing greater emphasis on how their employers approach the use of technology and data.
Building a culture of data trust is a critical addition to regulatory compliance and requires the ethical use of technology to be placed firmly at the heart of corporate governance and culture. Everyone working with technology and data should apply this simple but effective test: If the project appeared on the front page of a major publication, would your company, its customers, and stockholders be okay with that? Just because something is technically possible and not expressly prohibited does not mean that it will be met with approval.
Data chaos is fast approaching
2020 will see the reemergence of data virtualization and federation as solutions to the spaghetti data landscape.
A handful of years ago, big data was all the rage. While the hype around it may have ebbed, the pace at which new data is created—and the range of options to store and use it—has not. From big data platforms spawned by the rise of Hadoop, through a much greater range of public cloud data store options, the data landscape is fragmenting—both in terms of sources and their location. Being a data-driven business is a priority for many, further whetting corporate appetites for data. As a result, the user audience and use cases for data are also expanding.
Supporting these expectations strains data architectures that, for many organizations, were forged in a different era—one defined by “small” data: traditional enterprise applications churning out largely predictable transactional data. Considerable resources were invested in these architectures, and rather than discard those investments and start over, big data storage was simply tacked on.
The cost of on-premises data storage, particularly enterprise data warehouses, is high. As more cloud-based storage became available, costs decreased, while spawning more points on the data map. With more people demanding access to data to drive business decisions—against a backdrop of a tightening regulatory landscape—the requirements placed on that data architecture are increasing. Traditional data governance approaches cannot keep up. As a result, data is at best hard to find, and at worst, at risk. The data landscape is grinding toward unmanageable, unavailable chaos while generating serious potential risks.
Technology is a big part of the answer to this chaotic situation, and an old concept needs to be dusted off and updated. Data virtualization and federation will rise up the agenda in 2020, with one important difference—to date, these have been overly technical concepts. While the underlying capabilities are complex, it is important to highlight the business benefits of the approach. By adopting the virtualization and federation approach of using abstraction layers that encompass heterogeneous data stores, regardless of location, data can be both freed for use and—with governance and privacy baked in—shielded from many risks. At G2, we expect to see renewed focus on this approach, powered by the desire to use data more widely and effectively while ensuring it is secured and governed.
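The abstraction-layer idea can be sketched in a few lines of code: callers query one logical catalog, the layer routes the request to heterogeneous backing stores, and governance (here, masking a PII column) is applied on every read. The class names, the two stores, and the masking rule are illustrative assumptions, not any particular product's API.

```python
import sqlite3

class SqliteSource:
    """Stands in for an on-premises relational store."""
    def __init__(self):
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
        self.conn.execute("INSERT INTO customers VALUES (1, 'a@example.com')")

    def rows(self, table):
        cur = self.conn.execute(f"SELECT id, email FROM {table}")
        return [{"id": r[0], "email": r[1]} for r in cur]

class CloudSource:
    """Stands in for a cloud object or NoSQL store returning records."""
    def rows(self, table):
        return [{"id": 2, "email": "b@example.com"}]

class VirtualCatalog:
    """Single logical access point over heterogeneous stores, with
    governance (PII masking) baked into every read."""
    def __init__(self, sources):
        self.sources = sources

    def query(self, table, mask_pii=True):
        results = []
        for source in self.sources:
            for row in source.rows(table):
                if mask_pii:
                    row = {**row, "email": "***"}
                results.append(row)
        return results

catalog = VirtualCatalog([SqliteSource(), CloudSource()])
print(catalog.query("customers"))
# Both stores answer through one interface, with emails masked.
```

The point of the sketch is the shape of the solution: consumers never learn where the data physically lives, and privacy rules run in one place rather than being reimplemented per store.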
Technology is scaling beyond human management
2020 will mark the joining of centralized technology management with distributed intelligence to manage IT everywhere.
Technology moves fast, and it is accelerating. And it isn't simply processing power that is on the rise. Everything that surrounds and connects it is speeding up, too: billions of internet-connected devices, 5G networks rolling out globally, masses of new data, and large-scale public cloud data centers coming online. This technology unveils endless positive and nefarious use cases, a picture completed by the rise in automation including robotic process automation (RPA) and AI-powered capabilities such as machine learning and its sibling deep learning. Not to mention the automation of frequent cybercrimes such as distributed denial of service attacks: machines are managing (and attacking) machines. Information technology has scaled out and up to the point that it is beyond traditional management approaches, and is certainly outside the scope of direct human management.
2020 will see the rise of automated management of technology solutions to address this situation. This trend has already rumbled into life this year with a growing range of management tools—for example, enterprise monitoring, aimed at managing IT assets across on-premises, public, and hybrid cloud environments—offering a single pane of glass to manage assets regardless of their location. Tools that provide visibility like this are the foundation; the next step is enhancing them with intelligence that goes beyond automatic monitoring and escalation to fix issues that once required human intervention. However, it will not just be all-encompassing management solutions; other trends like AIOps will help bake automation in at a micro scale, complementing the top-down approach of management tools with the bottom-up approach of automation by design. As the network of technology continues to expand to the edge, the intelligence to manage it must be distributed, too—connecting centralized and edge assets and all points along the way.
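The monitor-remediate-escalate loop described above can be sketched as follows: automation attempts the fix first, and a human operator is paged only when automation fails. The hosts, the CPU threshold, and the stub remediation are illustrative assumptions, not a real tool's behavior.

```python
# Minimal sketch of self-healing IT management. A real system would pull
# metrics from a monitoring platform and call an orchestration API to
# remediate; here both are simulated.

def check_health(metrics, cpu_threshold=90.0):
    """Return the hosts whose CPU utilization exceeds the threshold."""
    return [host for host, cpu in metrics.items() if cpu > cpu_threshold]

def remediate(host):
    """Stand-in remediation: pretend to restart the offending service.
    Returns False for one host to simulate a failure automation cannot fix."""
    return host != "edge-busted"

def manage(metrics):
    """Triage loop: auto-fix what automation can, escalate the rest."""
    escalations = []
    for host in check_health(metrics):
        if not remediate(host):
            escalations.append(host)  # only these reach a human operator
    return escalations

metrics = {"core-01": 45.0, "edge-17": 97.5, "edge-busted": 99.0}
print(manage(metrics))  # only the host automation could not fix is escalated
```

This is the bottom-up pattern the paragraph describes: most incidents are resolved machine-to-machine, and human attention is reserved for the exceptions.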
Advances in technology are often a double-edged sword: they make new use cases possible and bring benefits such as greater flexibility and lower operating costs. At the same time, these advances create new problems and risks to be overcome, from management complexity to privacy and other ethical dilemmas.
The options available to enterprises today—and the pace at which IT moves—are creating digital agility never before possible. In doing so, however, technology has advanced beyond human ability to manage it directly. As the art of the possible expands with automation technologies, enterprises must use them to augment human control—helping to tame the digital landscape.