Shadow IT-induced anarchy
In a cloud equivalent of the Wild West, shadow IT is a major cause of cloud anarchy facing enterprises today.
It’s not unusual for employees to become frustrated by the IT department’s seemingly slow progress and subscribe to a SaaS offering themselves without considering the impact this decision will have on the rest of the business.
Other problems arise when boards rush to embrace cloud without having defined a comprehensive vision and strategy that takes into account existing business processes, employees’ skills, company culture, and legacy IT infrastructure.
While a cloud-first approach might get off to a cracking start, without that clear company-wide vision and strategy, it is destined to lose momentum fast.
The chaotic environments resulting from these ad-hoc approaches have far-reaching consequences for an organisation’s corporate governance, purchasing, and IT service integration processes.
Good cloud governance
Where governance is concerned, it is unlikely there will be full visibility of what cloud services are being consumed where, and whether appropriate controls and standards are being met.
This problem is exacerbated in highly regulated industries, such as financial services, where organisations are required to demonstrate they are: mitigating risk, managing IT security appropriately, managing audits and suppliers effectively, and putting appropriate controls in place to ensure compliance with regulations around data sovereignty and privacy, such as the EU GDPR.
Financial services firms also need to demonstrate they are managing material outsource risks effectively, in order to comply with FCA regulations.
The uncontrolled purchase and use of SaaS or PaaS services without the appropriate level of IT engagement will also throw up a whole raft of integration, visibility and support headaches.
‘Technology snowflakes’ are another cause for concern. These occur when the same problem is being solved in different ways by different teams, which leads to IT support inefficiencies and additional costs.
Enterprises need to factor in some of the other financial implications of cloud anarchy too. These include a fragmented procurement process that makes it difficult to cut the best deal, as well as questions over how teams consuming their own cloud services manage their budgets in the context of consumption-based services.
Embracing a cloud-shaped future
With a clear cloud strategy underpinned by appropriate controls, everyone will have the tools they need to innovate faster. The final piece in the puzzle is to ensure employees are fully engaged, and have the skills required to take advantage of this new approach and tools.
This requires building a company culture that embraces the cloud in a structured way, and promptly plugging any skills gaps in your employees’ knowledge.
With the Sex Pistols’ anthem still ringing in my ears, it occurs to me that Johnny Rotten was half right when he screamed the immortal lines: “Don’t know what I want, but I know how to get it”.
With cloud adoption, it’s important that everyone within the business pogos to the same tune – and that there is agreement up front on what is required.
Without a strong cloud vision and strategy, it’s impossible to know where you’re heading, how you’re going to get there, and when you’ve arrived.
Case in point: In 2017, the Boston-based company launched its Search with Photo feature, which allows shoppers to use photos of desired items to find the same or similar products on its website. This visual search feature, although easy for consumers to use, needs a complex neural network to power it: one that’s robust and reliable, yet also capable of supporting such innovative functionality.
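Wayfair does not disclose the internals of Search with Photo, but visual search systems of this kind typically embed images into numeric vectors with a neural network and then rank catalog items by vector similarity. The minimal sketch below assumes hypothetical, precomputed embeddings; the names and vectors are invented, not taken from Wayfair’s system.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def visual_search(query_embedding, catalog, top_k=3):
    """Rank catalog items by similarity to the query photo's embedding.

    `catalog` maps product IDs to embeddings that would, in a real system,
    come from the same neural network that embeds the query photo.
    """
    ranked = sorted(
        catalog.items(),
        key=lambda item: cosine_similarity(query_embedding, item[1]),
        reverse=True,
    )
    return [product_id for product_id, _ in ranked[:top_k]]

# Toy catalog: in practice these vectors come from a trained network.
catalog = {
    "mid-century sofa": [0.9, 0.1, 0.0],
    "oak dining table": [0.1, 0.9, 0.2],
    "velvet armchair":  [0.8, 0.2, 0.1],
}
print(visual_search([0.85, 0.15, 0.05], catalog, top_k=2))
```

In production such rankings run over millions of items with approximate nearest-neighbour indexes rather than an exhaustive sort, but the scoring idea is the same.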
Delivering this AI-driven innovation falls to Wayfair’s chief architect, Ben Clark.
Clark joined Wayfair in 2011 as head of customer recommendations. In that position, he worked on the consumer-facing side. He became chief architect in 2014, and, as such, now has responsibility for operational engineering as well as storefront-side and infrastructure-side engineering. He oversees 250 staffers, nearly all of whom are systems engineers and programmers and who work in an environment that spans three colocation facilities and the Google cloud for a few specific workloads.
The company employs about 6,800 people in total, including more than 1,200 engineers and data scientists.
Clark recently shared his insights on his position, the company culture, software tipping points and the pursuit of AI innovation.
What’s your role as chief architect?
Ben Clark: Chief architect is the most fluid of the commonplace chief titles in the industry. It can mean a wide variety of things at different companies, and I have responsibilities not frequently tied to that title. But it’s basically about architectural direction, road-mapping and constant probing for soundness of the ideas that we’re turning into living, breathing systems at Wayfair.
There are chief architects who are very controlling, but that’s not really my style. I think we get a lot of benefits from the bottom-up innovation that comes from all the members of our team. But we do need to get some focus and synergy and make sure all the different threads are coming together to participate in our strategic goals of delivering a terrific experience to all our customers.
Quick aside: IT often uses the word customers for multiple groups of end users, so which group do you mean?
Clark: I mean the people who are shopping on Wayfair. Sure, a lot goes on the back end of that. There are all kinds of things we need to do; it’s not just the [shopping] surface [Wayfair teams] interact with. But we try to make sure what everyone is doing is connected to the [consumer] customer. It might be indirect for some groups, more direct for others. If they don’t understand that, they might miss opportunities to keep their work aligned with that.
Can you explain more about how you achieve “architectural direction, road-mapping and constant probing”?
Clark: This is the way I think about it: As we grow bigger and bigger, our testing efforts become more and more important to everything we do — and we do them on a year-round basis. We run two sets of things: what you could call pressure tests and tests where you remove a piece of technology infrastructure to make sure your systems can handle a situation where a component becomes unavailable. That’s basically how we make sure that we don’t introduce a weakness into the things we’re building. If you don’t test this way, you can be blind to these things and they can bite you in unexpected ways.
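Clark doesn’t name specific tooling, but the component-removal tests he describes can be sketched as a check that a service still gives correct answers when a dependency is taken away. Everything below (the `ProductService` class, its cache fallback) is hypothetical, a sketch of the technique rather than Wayfair’s implementation:

```python
class ProductService:
    """Toy storefront service with a cache dependency that may be removed."""

    def __init__(self, cache_available=True):
        self.cache_available = cache_available
        self.db = {"sku-123": "mid-century sofa"}

    def lookup(self, sku):
        if self.cache_available:
            return ("cache", self.db.get(sku))
        # Degraded path: fall back to the database when the cache is gone.
        return ("db", self.db.get(sku))

def component_removal_test(service_factory, sku):
    """Verify the service still answers when a dependency is unavailable."""
    healthy = service_factory(cache_available=True).lookup(sku)
    degraded = service_factory(cache_available=False).lookup(sku)
    # The answer must match; only the serving path may differ.
    return healthy[1] == degraded[1] and degraded[1] is not None

print(component_removal_test(ProductService, "sku-123"))
```

Run year-round, as Clark describes, checks like this surface hidden single points of failure before a real outage does.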
Are the pressure tests and the tests when you remove a piece of infrastructure all about making sure the Wayfair systems are always on?
Clark: Yes, it is, but the tricky part of that is you can have an extremely available system where you don’t move very fast. If you don’t make any changes, you would be available, but that’s not what you want. The trick is to have a continuously available set of systems, for our customers and our internal people and partners, yet to keep up the pace of releasing software many times a day.
So high availability and innovation?
Wayfair advertises that it has a strong ‘manager-doer culture.’ How does that work in your organization?
Clark: There’s nothing mysterious about the term. It means we have an expectation that our managers can sort of ladder down into what their groups are doing, that they’re able to not just manage by numbers or their interpersonal relationships but also dig deep into the technology stack and everything going on in their groups.
Not all technologists make great managers, so how do you ensure yours make the transition?
Clark: If we’re going to do that manager-doer thing, we start with a pool of people who are excellent technologists. There are some who want to stay on a pure technology track and some who want to go into management over time. We don’t promote someone and expect that they’ll know how to do it. There’s a whole other set of skills that we take seriously, and we have a training program, a curriculum, and informal groups where they learn how to do that [for example] by learning to run a good staff meeting and raise their game in terms of both verbal and written communications.
Boston’s biggest tech company
Wayfair promotes itself as a ‘tech company that happens to sell furniture,’ and the company was recently named Boston’s biggest tech company. Why identify as such?
Clark: First and foremost, it’s very sort of fact-based; Wayfair has been an e-commerce company with no brick-and-mortar stores since 2002. It’s always been this kind of collaborative partnership between the business strategy side and tech side on an equal basis.
In terms of what else makes us a tech company, we have a very strong sense of continuous integration/continuous deployment, of releasing software many times a day and embracing these families of techniques that have become very widespread in recent years. We started doing these very early in our development as a company, and it’s really branded the kind of feel of the place.
I think it is an accurate descriptor and it is certainly true when we’re presenting ourselves to the group of people who might consider taking a job at Wayfair. We want to portray the way things are, because we do think it’s more appealing than if we were a different kind of company behind the scenes.
What kind of workers do you hire to fit your corporate culture?
Clark: We’re looking for people who really feed off of the impact that their code has, or the impact the other work they do will have, and so although we have some technology that’s very much in the forefront, that’s very hot right now, that’s kind of an outgrowth of the focus on impact.
So, we try to attract those people. We try to reward those people when they come to Wayfair and when what they do has a high impact. That’s kind of a virtuous circle: rewarding people for thinking about the work they do and thinking about the impact it will have on the lives of the customers and those at the company and the suppliers.
AI-driven innovation high on 2018 to-do list
What’s on your agenda for 2018?
Clark: We’re certainly going to continue to push the consumer-facing side of things in areas of visual search and AR/VR — augmented reality and virtual reality. But there’s a broad range of things we’re working on [that enable those areas]. And I’m going to continue to push the testing-based program both for capacity planning and soundness of the systems we’re standing up.
Is there a disruptive technology on the horizon you’re watching?
Clark: I already mentioned visual search, and you do that with a deep learning-based approach. That’s a very fast-moving area we’re taking good advantage of already and will continue to keep an eye on and use elements from the research community that become practical.
Speaking of AI-driven innovation and emerging technologies, how do you determine when you need an IT refresh?
Clark: First of all, we’re never done. Do we rest? Maybe the day after Cyber Monday we rest a little, for 24 hours, and then we’re ready to go for another year of intense activity.
If it’s a large effort, you have to be deliberate because any time you spend switching out, it’s time you’re not spending on high-value features. But there are plenty of times when we think about those factors and we make a decision to make a move.
How do you think about legacy in the enterprise?
Clark: I don’t use that word much or think about it that way; I don’t find it adds much to the conversations about technology. The choices you made in the past are either going to continue to line up well with the emerging problems you have or not. There’s some code at Wayfair that’s been here since 2002, and if it ain’t broke, we’re not going to fix it. But on the other hand, the wants and needs of our customers and the competitive landscape change so rapidly that we are constantly adding new things and making significant modifications to other things that have been working well for a long time.
Isaac Sacolick, president and CIO at New York-based consulting firm StarCIO, believes so. “The Kohler Konnect mirror was probably one of the more interesting voice assistants I looked at,” said Sacolick, who, like SearchCIO, monitored the event remotely. The message: AI voice assistants have gone mainstream.
Sacolick, author of Driving Digital: The Leader’s Guide to Business Transformation Through Technology, remembered when smartphones that debuted at CES were dismissed by peers as having little impact on IT strategies. “But, sure enough, people started bringing in smartphones, and you needed to worry about BYOD and putting in MDM [mobile device management] managers and thinking about policy.”
He sees the dominance of voice interfaces at CES 2018 as signaling another gearshift for CIOs, akin to the migration of data centers to the cloud and the move from web-only to mobile apps. “Now, CIOs are going to have to go from mobile user experiences to voice UX and make sure the applications they build out have a voice capability.”
Where it makes sense, said Nigel Fenwick, Forrester Research principal analyst who focuses on CIO issues. “We’re not going to put a voice interface on everything, because there is cost and complexity associated with that, and the return is not necessarily going to be there,” he said. So CIOs “will want to be cautious” and use conversational interfaces where they have a “maximum impact.” Don’t tell that to the vendors: Amazon has a plan to put an Echo in every boardroom.
But Fenwick agreed the migration of AI voice assistants from the consumer market to the workplace is inevitable. CIOs will start seeing demand from millennials. And the technology will evolve from voice interfaces retrofitted on select enterprise applications to AI voice assistants working side by side with employees. “Teenagers growing up are going to be used to having that conversation with a device — and expecting an intelligent response,” he said.
Marriage of IoT and AI
Moreover, voice assistants — of the smart and not-so-smart variety — are just one component of an increasingly complex technology landscape CIOs now have to manage, Fenwick said, as companies like the ones presenting at CES this year outfit the world with a digital skin.
“The big thing for CIOs will be handling all the sensors that are going to be enabled through IoT platforms,” he said, adding that the ability to process and gain insight from internet-of-things data will “separate the winners from the losers” in the next few years. Voice assistant technology that allows users to communicate with IoT devices ups the ante.
“The role of the CIO at once becomes more complicated because of the need to integrate new technology with back-end systems of record,” as well as “understand what’s happening with the customer in order to create unique value for the customer,” Fenwick said.
The marriage of an AI interface and IoT at CES 2018 also struck Sacolick as a game-changer. Many of the CIOs he deals with in his consulting business see IoT devices as vehicles for collecting data, allowing companies “to be smarter about what’s happening out in the field.”
“But as soon as you start thinking about these devices as two-way — instead of just data-collection devices, they are presentation devices or intelligence devices, making decisions for people” — then questions about reliability, performance and analytics arise. How much computing, for example, takes place centrally in the cloud and how much locally?
“I do think, for enterprises, it’s still early,” he said, but noted that when you see AI chipmakers Intel and Nvidia battling it out at CES for supremacy in the autonomous vehicle space, it’s time to pay attention.
‘How, not whether’
Analyst Mike Ramsey, who covers connected vehicles for research outfit Gartner, said what struck him from this year’s huge focus on autonomous cars was a shift in emphasis. “The focus was on how this was going to work — How will we make money? How will the tech be deployed? — not whether the tech will work,” Ramsey said, waiting to board a flight home from the show.
A point of debate in the industry is the integration of virtual assistants, which Ramsey said come in two varieties: the AI voice assistant that can communicate your wishes to the world — order a pizza — and the more “deeply integrated” intelligence embedded in the controls of the car. Google, Amazon and Apple continue to make inroads on this front, but Ramsey said the industry’s embrace of the big tech companies is not universal or without reservations.
“Mercedes announced its own system that has a lot of capability, not just basic things like asking it to change your radio station or call mom, but weird questions, like ‘Can I wear flip-flops tomorrow?'” he said. The ongoing “tussle between the tech giants and the automakers,” he said, is less about who owns the data and more about brand.
“The issue for them is who owns the experience in the car? They don’t want you to get in and feel like what you love about your car is Alexa,” Ramsey said.
Forrester’s Fenwick had something to say about that.
“You see at CES a sort of shift that has happened over the last few years — and continued to accelerate this year — towards the individualization of product or consumer experience. And that reflects the ability of companies to greatly tailor the experience of the product or service to their customers’ needs and desires,” he said. It’s a challenge for brands — and for CIOs.
“How do you build a technology architecture that is flexible and adaptable, that can integrate as yet undeveloped technologies into the architecture quickly in order to create revenue?” he said.
2. Regulations will have an impact on your AI projects. Cybersecurity, data sovereignty and data transfer regulations “influence the space of AI,” Scriffignano said. Today, the General Data Protection Regulation (GDPR), which goes into effect in May, is top of mind for many organizations, and it includes a “right to explanation” mandate, a sticking point for machine learning models that operate inside a so-called black box. But soon enough, GDPR will be replaced by another regulation — and by new forms of malfeasance. “When there’s new regulation being contemplated, there are the people trying to comply with it and the people trying to figure out how to get around it,” he said.
3. Cybercriminals will use AI to sharpen cyberattacks. Scriffignano encouraged CIOs to be on the lookout for “the amalgamation of AI and cyber anything.” Take botnet attacks, for instance. “They’re pretty stupid right now,” he said, and require going out and searching for connected devices with static IP addresses. “I’m not the cyber guy, but I’m pretty sure that if these botnet attacks started to use flocking and swarming algorithms and started learning from their mistakes and redirecting their efforts based on how they’re failing, they could succeed a lot quicker,” he said. “That’s terrifying.”
4. The IoT-AI relationship is a double-edged sword. The internet of things (IoT) is largely a network of things connected to the internet. “These things don’t have a very good ability to discover each other autonomously and have a conversation about what each of them does and how they might help each other,” Scriffignano said. But thanks to AI, the ability for devices to communicate with and even learn from each other is coming, which will present new ways to delight customers but also new security and data privacy obstacles for CIOs. “We’ve got to be smarter about how these things connect to each other,” he said, “and how the unintended consequences of them connecting to each other without us controlling those connections might cause mysterious things to happen.”
5. Cognitive computing is not your run-of-the-mill AI. CIOs should think of cognitive computing as a new field of AI. “The idea behind cognitive computing is an AI agent who works alongside human experts and advises the expert and watches the degree to which the advice is taken or not taken,” he said, “and modifies its advice accordingly.” But today, cognitive is often used as a synonym for AI or slapped on to processes — like your autocorrect function — that don’t rise to the level of cognitive computing. Scriffignano urges CIOs to pay attention to how cognitive develops in 2018 and not use it as a catch-all term for AI.
6. Collaborate without commoditizing. Most CIOs won’t be building an end-to-end AI solution, Scriffignano said. Instead, they’ll use platforms and services that can provide functionality to the business, creating a new kind of partnership between organizations. Adding AI to the mix will “challenge the ways in which we collaborate because it will form new connections that are sort of accidental,” he said. “If everything is formed as a service and ubiquitous and discoverable, then there’s a tendency for things to get commoditized. So, we have to figure out how to collaborate without commoditizing.”
7. Talent will continue to be a challenge. New skills are required to operate today’s data-driven companies, a need that’s exacerbated by AI. That doesn’t mean CIOs can stop hiring for the skills they used to hire for — data curators, analysts, modelers, statisticians and methodologists are still needed in the modern enterprise. “But, increasingly, we need governance experts and problem formulators and detectives and visionaries and storytellers and diplomats,” Scriffignano said. “So, the stakes have gone way up.”
8. Watch out for autonomous AI. Autonomy means an AI agent is disconnected from a human or human-made mechanism that is telling it what to do. “Every AI agent has a goal, something it’s trying to achieve,” Scriffignano said. “If it’s autonomous and the environment changes, it needs to have the ability to modify its goal” to be successful. But AI autonomy could lead companies into dangerous territory.
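On the “right to explanation” point above, one model-agnostic way to generate per-decision explanations for a black-box model is to perturb each input in turn and measure how the score moves. The model and feature names below are invented for illustration; real GDPR compliance work involves far more than this sketch:

```python
def black_box_score(applicant):
    """Stand-in for an opaque model: we can only call it, not inspect it."""
    return 0.6 * applicant["income"] + 0.3 * applicant["tenure"] - 0.1 * applicant["debt"]

def explain(model, applicant, delta=1.0):
    """Perturb each input by `delta` and report how much the score moves.

    A crude, model-agnostic way to produce the per-decision explanations
    that a "right to explanation" mandate asks for.
    """
    base = model(applicant)
    attributions = {}
    for feature in applicant:
        perturbed = dict(applicant)
        perturbed[feature] += delta
        attributions[feature] = round(model(perturbed) - base, 6)
    return attributions

print(explain(black_box_score, {"income": 50.0, "tenure": 4.0, "debt": 10.0}))
```

For a linear model the attributions recover the coefficients exactly; for a genuinely nonlinear black box they give only a local, approximate explanation, which is precisely why black-box models are a sticking point under the mandate.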
“‘Yes, we’ve proved the concept!’ But all they’ve proved is that the technology works. What they haven’t proved is whether there is a business case for automation and will it deliver the scale of improvements the company wants to achieve,” he said. Rather than a POC, companies should insist on POV — proof of value — before embarking on RPA. “That’s the bigger challenge.”
Brain is co-founder and COO of Symphony Ventures Ltd., a consulting, implementation and managed services firm specializing in what the firm dubs “future of work technologies” – RPA technology among them. Founded three years ago, the firm has worked on RPA projects across a broad range of industries and geographies. “We’ve done deployments in five continents so far,” he said.
All work is local
The firm’s projects have also covered a diverse set of business processes. That’s because RPA is not a “process-specific solution,” Brain stressed, but rather the automation of rules-based, manual work not covered by a company’s process-specific technology systems. And that work necessarily varies from company to company.
“You can have five organizations and they each could be running the same ERP system, but the way in which these systems are configured depends on the particular company’s rules and that means there is different work that falls out manually,” Brain said.
At some companies, Symphony experts are called upon to automate the current manual process, using RPA technology to automate the work the same way employees do it. Other companies will want help on optimizing the process first before automating it.
“It really depends on what is driving the business decision,” Brain said. The nature of the work Symphony automates is always rules-based, but those rules can be extremely complex. (The firm has done projects in which it’s taken several months to capture and learn the processes that are eventually automated.)
Proof of value: Five steps
But, whether the RPA work is of the “lift-automate-shift” or “lift-shift-automate” variety, or involves simple or complex rules, companies need to follow certain steps in order to get a “proof of value.” Here is a synopsis of Brain’s five steps for deploying RPA technology:
- Scope the transformation
“RPA is a transformational tool, not a desktop macro builder. Look for pain points within the organization and identify what needs to change. This isn’t just a cost play; rather, it has to do with mitigating the challenges of growing in a linear fashion by increasing the number of full-time employees. For some, it is about improving speed and quality to differentiate in the market. Others are attracted by the insight and analytics that come from consolidating all transactional data into one database for real-time visibility.”
- Capture, map, measure
“The next step is to analyze the business and map processes at keystroke level. To do so, use experts in RPA, as it is important to drill into the areas where configuration will be complex. Standard operating procedures, training materials and system manuals will be great inputs, but not enough by themselves. Have the RPA experts sit with the process experts to map what really happens; afterwards, it will be easier to plot costs and service levels to the processes as a baseline.”
- Analyze and design
“With the scope defined and mapped, identify processes and parts of processes most suitable for automation. Then calculate the time and cost to implement these, as well as the benefits of doing so. Design a target operating model (TOM), which is a graphical depiction of the business structure and processes affected by the RPA implementation; it should detail everything from stakeholders to the applications/systems used by the automation. It’s important to map not just the RPA portions but also the scope of the business to determine how to redeploy resources to drive greater business value.”
- Plan and forecast the journey
“Consider all that is involved in the transformation and don’t underestimate the time required for change management and benefits realization. Create the implementation plan and financial model by looking at the savings and the cost avoidance that this transformation will bring over an estimated three years. Consider the cost of not only implementing RPA but maintaining the solution and updating it to take on additional tasks as needed.”
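Brain’s fourth step, weighing savings and cost avoidance over roughly three years against implementation and maintenance costs, comes down to simple arithmetic. The sketch below is one way to frame that model; all figures are illustrative, not from Symphony:

```python
def rpa_business_case(implementation_cost, annual_maintenance,
                      annual_labor_savings, annual_cost_avoidance, years=3):
    """Net benefit of an RPA rollout over the planning horizon.

    Mirrors the plan-and-forecast step: a one-off implementation cost plus
    ongoing maintenance, set against recurring savings and cost avoidance.
    """
    total_benefit = years * (annual_labor_savings + annual_cost_avoidance)
    total_cost = implementation_cost + years * annual_maintenance
    net_annual_gain = annual_labor_savings + annual_cost_avoidance - annual_maintenance
    return {
        "net_benefit": total_benefit - total_cost,
        "payback_years": implementation_cost / net_annual_gain,
    }

# Illustrative inputs for a mid-sized automation programme.
case = rpa_business_case(
    implementation_cost=250_000,
    annual_maintenance=40_000,
    annual_labor_savings=180_000,
    annual_cost_avoidance=30_000,
)
print(case)
```

A positive net benefit and a payback period well inside the planning horizon is the kind of “proof of value” Brain argues companies should insist on before embarking on RPA.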
Under the Privacy Shield, if a company wants to transfer personal data outside of the European Union, it must be deemed to provide “adequate” privacy protection by certifying to the Commerce Department that it complies with the Privacy Shield Principles. More than 2,400 companies were certified under the Privacy Shield in the first year after it was launched, according to the European Commission’s (EC) first annual report on the framework. That number is greater than all the companies that participated in the Safe Harbor agreement during the final 10 years it was in place.
Why did the Federal Trade Commission charge three U.S. companies with not complying with the EU-U.S. Privacy Shield Framework?
In three separate cases announced in September 2017, the Federal Trade Commission (FTC) alleged that Decusoft LLC, Md7 LLC, and Tru Communication Inc. made false claims about participating in the EU-U.S. Privacy Shield. Specifically, the companies misrepresented their status in regard to the certification process, according to the FTC.
The privacy policies on Decusoft’s website included statements that the company had certified its compliance when it had not. Decusoft is a New Jersey-based business that develops software for human resources applications. Although the company had initiated its Privacy Shield certification application, it did not complete all of the necessary steps.
Md7’s website included the statement that the company “complies with the EU-U.S. Privacy Shield Framework as set forth by the U.S. Department of Commerce regarding the collection, use, and retention of personal information from Individual Customers in the European Union member countries.” However, the California-based company, which works with the wireless industry to manage cellphone tower sites, had not received its Privacy Shield certification.
The website of Tru Communication — a California-based printing company also known as TCPrinting.net — stated that the company “will remain compliant and current with Privacy Shield at all times” when it had not completed the certification.
What is the significance of the FTC’s first three Privacy Shield enforcement actions?
The cases brought against Decusoft, Md7 and Tru Communication were the first actions the Federal Trade Commission took against false claims regarding the Privacy Shield. Earlier in the year, a number of European regulators and privacy advocates had expressed concern about the U.S. government’s commitment to the privacy framework. In July, Human Rights Watch and Amnesty International warned that U.S. surveillance laws and programs are so broad and poorly safeguarded that they render the Privacy Shield invalid.
The FTC announced its first three enforcement actions about one week before European officials and U.S. government officials met in Washington for the first annual joint review of the Privacy Shield Framework.
How were the U.S. companies penalized for the Privacy Shield charges initiated by the Federal Trade Commission?
Decusoft, Md7 and Tru Communication agreed to settle the charges brought by the FTC. By agreeing to proposed settlement orders, the companies did not have to admit any guilt. The orders ban the companies from misrepresenting their compliance with any privacy program sponsored by a government, self-regulatory or standard-setting organization. The orders also set out a number of reporting and notification requirements. If a final consent order is violated, a civil penalty of up to $40,654 could be imposed on the violating company.
Are privacy advocates satisfied with the efficacy of the framework?
Several civil liberties organizations, including Amnesty International, Human Rights Watch and the American Civil Liberties Union, have voiced concern that the Privacy Shield does not sufficiently protect Europeans’ data privacy. In a joint letter to the European Commission on July 26, 2017, Human Rights Watch and Amnesty International called for the framework to be re-evaluated, arguing that U.S. protection of personal data is not equivalent to that guaranteed within the European Union.
The groups called on Europe to encourage the U.S. government to adopt binding reforms to comply with the EU’s Charter of Fundamental Rights. The groups maintain that current protections fall short of EU standards, especially when U.S. foreign intelligence surveillance laws and programs are considered.
Is the Privacy Shield working?
The European Commission issued its first annual report on the Privacy Shield Framework on Oct. 18, 2017, a little over a month after the Federal Trade Commission publicized its enforcement actions against three U.S.-based companies.
According to the report, the United States has stepped up its procedures for handling data privacy complaints and enforcement. It also said that the Privacy Shield certification process is working well. Nonetheless, the commission called for greater compliance monitoring, recommending that the Department of Commerce conduct regular searches for companies that make false claims about their participation. It also recommended more cooperation between the Commerce Department, the FTC and the EU Data Protection Authorities.
EC Commissioner for Justice, Consumers and Gender Equality Věra Jourová stated in the report that the framework is “a living arrangement that both the EU and U.S. must actively monitor to ensure we keep guard over our high data protection standards.”
The report also recommended that the U.S. administration make a permanent appointment to the position of Privacy Shield ombudsperson as soon as possible.