2018 Top 10 Business Intelligence Trends

Introduction

The pace and evolution of business intelligence solutions means what’s working now may need refining tomorrow. From natural language processing to the rise in data insurance, we interviewed customers and Tableau staff to identify the 10 impactful trends you will be talking about in 2018. Whether you’re a data rockstar, an IT hero, or an executive building your BI empire, these trends emphasize strategic priorities that could help take your organization to the next level.


1. Don’t Fear AI: How Machine Learning Will Enhance the Analyst
Popular culture is fueling a dystopian view of what machine learning can do. But while research and technology continue to improve, machine learning is rapidly becoming a valuable supplement for the analyst. In fact, machine learning is the ultimate assistant to the analyst.
Imagine needing to quickly look at the impact of a price change on a given product. To do this, you would run a linear regression on your data. Before Excel, R, or Tableau, you had to do this all manually, and the process took hours. Thanks to machine learning, you can now see the product’s consumption in a matter of minutes, if not seconds. As an analyst, you don’t need to do that heavy lifting, and you can move on to the next question: were the higher consumption months due to an extrinsic factor such as a holiday? Was there a new release? Was there news coverage influencing product purchase or awareness? What you’re not thinking about is how you wish you could have spent more time perfecting your regression model.
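To make that before-and-after concrete, here is a minimal sketch of the kind of calculation an analyst once did by hand: an ordinary least squares fit of units consumed against price. The numbers are hypothetical, invented purely for illustration:

```python
# Hypothetical monthly observations: (price charged, units consumed)
data = [(9.99, 1200), (9.99, 1150), (8.49, 1580), (8.49, 1620),
        (7.99, 1890), (7.99, 1845), (8.99, 1410), (8.99, 1380)]

n = len(data)
mean_p = sum(p for p, _ in data) / n
mean_u = sum(u for _, u in data) / n

# Ordinary least squares: slope = cov(price, units) / var(price)
slope = sum((p - mean_p) * (u - mean_u) for p, u in data) / \
        sum((p - mean_p) ** 2 for p, _ in data)
intercept = mean_u - slope * mean_p

print(f"Each $1 price increase is associated with {abs(slope):.0f} fewer units")
```

Today a tool runs this fit (and far richer models) instantly, leaving the analyst free to ask the follow-up questions about holidays, releases, and news coverage.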
Machine learning’s potential to aid an analyst is undeniable, but it’s critical to recognize that it should be embraced when there are clearly defined outcomes. “Machine learning is not great when your data is subjective,” says Andrew Vigneault, Staff Product Manager with Tableau. For example, when surveying customers about product satisfaction, ML cannot always pick up on qualitative words.
Additionally, the analyst needs to understand the success metrics for the data to make sense of it in a way that is actionable. In other words, feeding inputs into a machine doesn’t by itself make the outputs meaningful. Only a human can judge whether the right amount of context has been applied, which means machine learning cannot be done in isolation, without an understanding of the model and the inputs and outputs being made.

“Machine Learning helps you look under lots and lots of rocks when you need assistance getting an answer.” — RYAN ATALLAH, STAFF SOFTWARE ENGINEER

IDC forecasts revenues from AI and machine learning systems to total $46 billion by 2020.
In 2020, AI will become a positive net job motivator, creating 2.3 million jobs while eliminating only 1.8 million jobs. (Gartner)
While there might be concern over being replaced, machine learning will actually supercharge analysts and make them more efficient, more precise, and more impactful to the business. Instead of fearing machine learning technology, embrace the opportunities it presents.


2. The Human Impact of Liberal Arts in the Analytics Industry
As the analytics industry continues to seek skilled data workers, and organizations look to elevate their analytics teams, we may have had a plethora of talent at our fingertips all along. We are familiar with how art and storytelling have helped influence the data analytics industry. That doesn’t come as a surprise. What does come as a surprise is how the technical aspects of creating an analytical dashboard, previously reserved for IT and power users, are being taken over by users who understand the art of storytelling, a skill set primarily coming from the liberal arts. Furthermore, organizations are placing a higher value on hiring workers who can use data and insights to affect change and drive transformation through art and persuasion, not only on the analytics themselves.
As technology platforms become easier to use, the focus on tech specialties decreases. Everyone can play with data without needing to have the deep technical skills once required. This is where people with broader skills, including the liberal arts, come into the fold and drive impact where industries and organizations have a data worker shortage. As more organizations focus on data analytics as a business priority, these liberal arts data stewards will help companies realize that empowering their workforce is a competitive advantage.
Not only do we see a broad-based appeal to help hire a new generation of data workers, we’ve also observed several instances where technology-based companies were led or heavily influenced by founders with a liberal arts education. This includes founders and executives from Slack, LinkedIn, PayPal, Pinterest, and several other high-performing technology companies.

“It takes a certain amount of skill to build a dashboard and to do analysis, but there’s something that you can’t teach—and that’s the way you tell a story with the data.” — JENNY RICHARDS, DATA ARTIST, TABLEAU

Liberal arts grads are joining the tech workforce 10% more rapidly than technical grads. (LinkedIn)
One third of all Fortune 500 CEOs have liberal arts degrees. (Fast Company)
One powerful example of bringing the liberal arts into a technology company comes from Scott Hartley’s recent book, “The Fuzzy and the Techie.” Nissan hired PhD anthropologist Melissa Cefkin to lead the company’s research into human-machine interaction, and specifically the interaction between self-driving cars and humans. The technology behind self-driving vehicles has come a long way, but it still faces hurdles in mixed human-machine environments. At a four-way stop, for example, humans typically analyze the situation on a case-by-case basis, which is nearly impossible to teach a machine. To address this, Cefkin was tasked with leveraging her anthropology background to identify the patterns that humans follow, use them to better teach self-driving cars, and in turn communicate those patterns back to the human riding in the car.
As analytics evolves to be more art and less science, the focus has shifted from simply delivering the data to crafting data-driven stories that inevitably lead to decisions. Organizations are embracing data at a much larger scale than ever before and the natural progression means more of an emphasis on storytelling and shaping data. The golden age of data storytelling is upon us and somewhere within your organization is a data storyteller waiting to uncover your next major insight.

3. The Promise of Natural Language Processing
2018 will see natural language processing (NLP) grow in prevalence, sophistication, and ubiquity. As developers and engineers continue to refine their understanding of NLP, its integration into new areas will also grow. The rising popularity of Amazon Alexa, Google Home, and Microsoft Cortana has nurtured people’s expectations that they can speak to their software and it will understand what to do. For example, state the command, “Alexa, play ‘Yellow Submarine’,” and the Beatles’ hit plays in your kitchen while you make dinner. This same concept is also being applied to data, making it easier for everyone to ask questions of and analyze the data they have at hand.
Gartner predicts that by 2020, 50 percent of analytical queries will be generated via search, NLP, or voice. This means it will suddenly be much easier for the CEO on the go to ask a mobile device for “total sales by customers who purchased staples in New York,” then filter to “orders in the last 30 days,” and then group by “project owner’s department.” Or your child’s school principal could ask, “What was the average score of students this year,” then filter to “students in 8th grade,” and group by “teacher’s subject.” NLP will empower people to ask more nuanced questions of data and receive relevant answers that lead to better everyday insights and decisions.
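Under the hood, an NLP layer typically translates a spoken question like the CEO’s into a structured query: a filter for each phrase and a final grouping. The sketch below shows roughly what that translated query might look like; the order records, field names, and amounts are all hypothetical, and the hard part (parsing the language itself) is omitted:

```python
from datetime import date, timedelta

# Hypothetical order records the spoken question would be run against
orders = [
    {"product": "staples", "region": "New York", "owner_dept": "Sales",
     "date": date(2018, 1, 10), "amount": 120.0},
    {"product": "staples", "region": "New York", "owner_dept": "Marketing",
     "date": date(2018, 1, 20), "amount": 80.0},
    {"product": "paper", "region": "Boston", "owner_dept": "Sales",
     "date": date(2018, 1, 5), "amount": 50.0},
]

today = date(2018, 2, 1)

# "Total sales by customers who purchased staples in New York" ...
recent = [o for o in orders
          if o["product"] == "staples" and o["region"] == "New York"
          # ... "orders in the last 30 days" ...
          and (today - o["date"]) <= timedelta(days=30)]

# ... "group by project owner's department"
totals = {}
for o in recent:
    totals[o["owner_dept"]] = totals.get(o["owner_dept"], 0.0) + o["amount"]
print(totals)
```

The value of NLP is that the CEO never sees any of this; the filters and grouping are inferred from the phrasing of the question.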

“[NLP] can open analysts’ eyes a little bit and gives them some self-assurance and some confidence in what they’re able to do.”  — BRIAN ELROD, DATA ANALYTICS LEAD, MORTGAGE INVESTORS GROUP

By 2019, 75% of workers whose daily tasks involve the use of enterprise applications will have access to intelligent personal assistants to augment their skills and expertise. (IDC)
By 2021, more than 50% of enterprises will be spending more per annum on bots and chatbot creation than traditional mobile app development. (Gartner)
Simultaneously, developers and engineers will make great strides in learning and understanding how people use NLP. They will examine how people ask questions, ranging from instant gratification (“which product had the most sales?”) to exploration (“I don’t know what my data can tell me—how’s my department doing?”). As Ryan Atallah, Staff Software Engineer for Tableau, notes, “This behavior is very much tied to the context in which the question is being asked.” If the end user is on their mobile, they are more likely to ask a question that generates instant gratification, whereas, if they are sitting at a desk looking at a dashboard, they’re probably looking to explore and examine a deeper question.
The biggest analytics gains will come from understanding the diverse workflows that NLP can augment. As Vidya Setlur, Staff Software Engineer with Tableau, puts it, “Ambiguity is a hard problem,” so understanding workflows becomes more important than the input of a specific question. When there are multiple ways of asking the same question of the data (e.g., “Which sales rep had the most sales this quarter?” or “Who had the most sales this quarter?”), the end user doesn’t want to think about the “right” way to ask it; they just want the answer.
Consequently, the opportunity will arise not from placing NLP in every situation, but making it available in the right workflows so it becomes second nature to the person using it.

4. The Debate for Multi-Cloud Rages On
If your organization is exploring and evaluating a multi-cloud strategy in 2018, you’re not alone.
“There’s a stampede of organizations moving their data to the cloud and moving their core applications,” said Chief Product Officer Francois Ajenstat. “And whether it’s a ‘lift and shift’ or a re-platforming, we see customers adopting the cloud at a much faster rate than ever.”
According to a recent Gartner study, “a multi-cloud strategy will become the common strategy for 70 percent of enterprises by 2019, up from less than 10 percent today.” Customers are growing sensitive about being locked into a single legacy software solution that doesn’t match their future needs. Meanwhile, switching and migration have become relatively easier thanks to similar APIs and the use of open standards like Linux, Postgres, MySQL, and others.
It’s likely your organization is also evaluating how data centers are designed and run. Your IT department is evaluating hosting environments based on risk, complexity, speed, and cost, all factors that increase the difficulty of finding a single solution for your organization’s needs.
Evaluating and implementing a multi-cloud environment can help determine who provides the best performance and support for your situation. According to the Boston Herald, GE re-aligned its cloud hosting strategy to leverage both Microsoft Azure and Amazon Web Services, with the intention to understand the best performing hosting environment and see which contract provides the lowest cost to pass to their customers.

“This multi-cloud or hybrid cloud strategy is becoming increasingly important to help reduce risk and provide more choice and flexibility for customers.” — FRANCOIS AJENSTAT, CHIEF PRODUCT OFFICER, TABLEAU

70% of enterprises will be implementing a multi-cloud strategy by 2019. (Gartner)
74% of Tech Chief Financial Officers say cloud computing will have the most measurable impact on their business in 2017. (Forbes)
But the multi-cloud trend doesn’t come without a healthy awareness of the merits and challenges of moving to this type of environment. While flexibility is a plus, a multi-cloud environment increases overhead costs by splitting your organization’s workloads across multiple providers. It also forces an internal developer team to learn multiple platforms and to put additional governance processes in place, depending on the different environments they have to support.
Additionally, a multi-cloud strategy could potentially diminish the buying power of a company or organization. If a company is splitting what they buy across multiple providers, it will hurt their volume discounts. This creates a model where a company is buying less at a worse price.
Surveys and stats, such as the Gartner data point above, indicate multi-cloud adoption is on the rise, but they don’t indicate how much of each platform is actually adopted. In many multi-cloud cases, organizations use one provider for most of their needs and the others for very little. Many of these use cases amount to implementing a second cloud hosting environment as a backup in case the main hosting environment underperforms or fails.
While multi-cloud adoption will continue to rise in 2018, organizations will have to maneuver through the nuance of assessing their strategy against how much of each cloud platform is actually adopted, internal usage, and workload demands and implementation costs.

5. Rise of the Chief Data Officer
Data and analytics are becoming core to every organization. That is undebatable. As organizations evolve, they’re prioritizing a new level of strategic focus and accountability regarding their analytics.
Historically, most business intelligence efforts were assigned to the Chief Information Officer (CIO), who oversaw standardizing, consolidating, and governing data assets across the organization, which needed consistent reporting. This put BI initiatives (data governance, building analytical models, etc.) in competition with other strategic initiatives (such as IT architecture, system security, or network strategy) under the purview of the CIO—and often inhibited the success and impact of BI.
In some cases, a gap has formed between the CIO and the business due to the tension between speed to insight and the security and governance of data. So to derive actionable insights from data through analytics investments, organizations are increasingly realizing the need for accountability in the C-Suite to create a culture of analytics. For a growing number of organizations, the answer is appointing a Chief Data Officer (CDO) or Chief Analytics Officer (CAO) to lead business process change, overcome cultural barriers, and communicate the value of analytics at all levels of the organization. This allows the CIO to focus more strategically on things such as data security.
The fact that CDOs and CAOs are being appointed and held accountable for business impact and improved outcomes also demonstrates the strategic value of data and analytics in modern organizations. There is now a proactive conversation at the C-level about how to deploy an analytics strategy. Instead of waiting for requests for a particular report, CDOs are asking, “How can we anticipate or quickly adapt to business requests?”

“My job is to bring tools and technologies and empower the team.” — PETER CREGGER, CHIEF DATA OFFICER, FNI

By 2019, 90 percent of large companies will have a CDO role in place. (Gartner)
By 2020, 50% of leading organizations will have a CDO with similar levels of strategy influence and authority as their CIO.
To best facilitate a highly effective team under this C-level position, organizations are dedicating more money and resources. According to Gartner, 80 percent of large enterprises will have a CDO office fully implemented by 2020. Currently, the average number of employees in the office is 38, but 66 percent of organizations surveyed expect that the allocated budget for the office will grow.
Josh Parenteau, Tableau’s Market Intelligence Director, notes that the role of the CDO is “outcome focused.” He states that “it’s not just about putting data into a data warehouse and hopefully someone uses it—they’re there to define what the use is and make sure that you’re getting value.” This outcome focus is critical, especially as it aligns with the top three objectives in Gartner’s 2016 CDO survey, which include greater customer intimacy, an increased competitive advantage, and an improvement in efficiency. These objectives are fueling companies like Wells Fargo, IBM, Aetna, and Ancestry to implement CDOs with the intent to take their data strategy to the next level, making the role of Chief Data Officer a business staple in 2018.

6. The Future of Data Governance is Crowdsourced
The modern business intelligence outfit has progressed from data and content lockdowns to the empowerment of business users everywhere to use trusted, governed data for insights. And as people are learning to use data in more situations, their input on better governance models has become a monumental force within organizations.
It’s an understatement to say that self-service analytics has disrupted the world of business intelligence. The paradigm has shifted: anyone with the capacity to create analytics can now ask and answer critical questions across the organization. The same disruption is happening with governance. As self-service analytics expands, a funnel of valuable perspectives and information begins to inspire new and innovative ways to implement governance.
The last responsibility a business user wants is the security of the data. Good governance policies let business users ask and answer questions while finding the data they need, when they need it.

“Governance is as much about using the wisdom of the crowd to get the right data to the right person as it is locking down the data from the wrong person.” — ELLIE FIELDS, SR. DIRECTOR OF DEVELOPMENT, TABLEAU

45% of data citizens say that less than half of their reports have good quality data. (Collibra)
61% of C/V Suite leaders say their own companies’ decision-making is only somewhat or rarely data driven. (PwC)
BI and analytics strategies will embrace the modern governance model: IT departments and data engineers will curate and prepare trusted data sources, and as self-service is mainstreamed, end users will have the freedom to explore data that is trusted and secure. Top-down processes that only address IT control will be discarded in favor of a collaborative development process combining the talents of IT and end users. Together, they will identify the data that is most important to govern and create rules and processes that maximize the business value of analytics without compromising security.

7. Vulnerability Leads to a Rise in Data Insurance
For many companies, data is a critical business asset. But how do you measure the value of that data? And what happens when that data is lost or stolen? As we have seen with recent high profile data breaches, a threat to a company’s data can be crippling and potentially cause irreparable damage to the brand.
According to a 2017 study by the Ponemon Institute, the average total cost of a data breach was estimated at $3.62 million.
But are companies doing everything they can to protect and insure their data? One industry rapidly growing in response to data breaches is the cybersecurity insurance market. This industry has seen 30 percent year-over-year growth, and it is set to reach $5.6 billion in annual gross written premium by 2020. (AON)
Cyber & privacy insurance covers a business’ liability for a data breach in which the customer’s personal information is exposed or stolen by a hacker.
However, even with the market’s growth and the continued threat of data breaches, only 15 percent of U.S. companies have an insurance policy that covers data breaches and cybersecurity. Furthermore, when you look at those 15 percent of U.S. companies covered, a majority come from large, established financial institutions.
The need for such policies among financial institutions is clear. But the trend will broaden to other verticals because nobody is immune to the threat of a data breach.
Doug Laney, Gartner Analyst, recently wrote a book titled, “Infonomics: How to Monetize, Manage, and Measure Information for Competitive Advantage.” He gives distinct models on how companies across all industries can review the value of their data, both in non-financial models and financial models.

“You have to decide where the pain point is. What is the real risk to your business?” — PETER CREGGER, CHIEF DATA OFFICER, FNI

The average total cost of a data breach was estimated at $3.62 million. (Ponemon)
Only 15% of US companies have an insurance policy specifically for their data. (Ponemon)
Non-financial models focus on the intrinsic value, the business value, and the performance value of the data. These values can measure a company’s uniqueness, accuracy, relevancy, internal efficiencies and overall impact on its usage.
Financial models focus on the cost value, the economic value, and the market value of the data. These values can measure the cost of acquiring data, administering the data internally, and the value of selling or licensing your data.
Data as a commodity means its value will only increase, ultimately driving new questions and conversations around how this raw material will continue to propel companies to greater heights and advantages. And like any product, what good is it if it can be pilfered without consequence?

8. Increased Prominence of the Data Engineer Role
Here is a certainty: you can’t create a dashboard without having all of your charts built out so you can understand the story you’re trying to communicate. Another principle you likely know: you can’t have a reliable data source without first understanding the type of data that goes into a system and how to get it out.
Data engineers will continue to be an integral part of an organization’s movement to use data to make better decisions about their business. Between 2013 and 2015, the number of data engineers more than doubled. And as of October 2017, there were over 2,500 open positions with “data engineer” in the title on LinkedIn, indicating the growing and continued demand for this specialty.

“Data engineers play a fundamental part in enabling self-service for the modern analytics platform.” — FRANCOIS AJENSTAT, CHIEF PRODUCT OFFICER, TABLEAU

So what is this role, and why is it so important? The data engineer is responsible for designing, building, and managing a business’s operational and analytics databases. In other words, they are responsible for extracting data from the foundational systems of the business in a way that can be used and leveraged to make insights and decisions. As data volumes and storage capacity grow, someone with deep technical knowledge of the different systems and architecture, and the ability to understand what the business wants or needs, becomes ever more crucial.
Yet the data engineer role requires a unique skill set. They need to understand the backend, what’s in the data, and how it can serve the business user. The data engineer also needs to develop technical solutions to make the data usable.
A 2016 Gartner study found respondent organizations were losing an average of $9.7M annually as a result of poor data quality.
Data scientists and analysts can spend as much as 80% of their time cleaning and preparing data. (TechRepublic)
In the words of Michael Ashe, Senior Recruiter for Tableau, “I’m no spring chicken. I’ve been in technical recruiting for over 17 years. And it’s no surprise that data and storage capacity has continued to grow—I’ve seen it happen in quantum leaps. The data will always need tweaking. Businesses need to plug into this role. They need to dive into specific data to make business decisions. The data engineer most definitely will continue to grow as a role.”

9. The Location of Things Will Drive IoT Innovation
It’s an understatement to say that the proliferation of the internet of things (IoT) has driven monumental growth in the number of connected devices we see in the world. All of these devices interact with each other and capture data that creates a more connected experience. In fact, Gartner predicts that by 2020 the number of IoT devices available to consumers will more than double, “with 20.4 billion IoT devices online.”
Even with this growth, the use cases and implementation of IoT data haven’t followed the same desirable path. Companies have concerns about security, and most lack the organizational skill sets and the internal technical infrastructure to integrate IoT data with their other applications and platforms.
One positive trend we are seeing is the usage and benefits of leveraging location-based data with IoT devices. This subcategory, termed the “location of things,” enables IoT devices to sense and communicate their geographic position. Knowing where an IoT device is located adds context and helps us better understand what is happening, and what we predict will happen, in a specific location.
For companies and organizations seeking to capture this data, different technologies are being used. For example, hospitals, stores, and hotels have begun to use Bluetooth Low Energy (BLE) technology for indoor location services, where GPS typically struggles to provide contextual location. The technology can be used to track specific assets and people, and even to interact with mobile devices like smartwatches, badges, or tags in order to provide personalized experiences.

“When most people think location or geospatial, they think of it as a dimension. It’s something I’m going to analyze… the new trend is that it is becoming an input into the analytical process.” — JOSH PARENTEAU, MARKET INTELLIGENCE DIRECTOR, TABLEAU

IoT endpoints will grow to 30 billion by 2020. (IDC)
Explosive growth of IoT is expected, exceeding $5 billion by year-end 2020. (Gartner)
As it relates to analyzing the data, location-based figures can be viewed as an input rather than just an output of results. If the data is available, analysts can incorporate this information into their analysis to better understand what is happening, where it is happening, and what they should expect to happen in a contextual area.

10. Universities Double Down on Data Science & Analytics Programs
North Carolina State University is home to the first Master of Science in Analytics program. The MSA is housed within its Institute for Advanced Analytics (IAA), a data hub with the mission to “produce the world’s finest analytics practitioners—individuals who have mastered complex methods and tools for large-scale data modeling [and] who have a passion for solving challenging problems…” As the first of its type, the NC State program has foreshadowed academia’s pronounced investment in data science and analytics curriculum.
Earlier this year, the University of California, San Diego launched a first for their institution—an undergraduate major and minor in data science. They didn’t stop there. The university also made plans, supercharged by an alumnus donation, to create a data science institute. Following suit, UC Berkeley, UC Davis, and UC Santa Cruz have all increased their data science and analytics options for students, with demand exceeding expectations. But why?


“I’m constantly surprised by what the students come up with, and blown away with how they’re able to just intuitively look at the data and play with the data and come up with some visualizations.” — ROBYN RASHKE, PROFESSOR, UNIVERSITY OF NEVADA – LAS VEGAS

According to a recent PwC study, 69 percent of employers will demand data science and analytics skills from job candidates by 2021. In 2017, Glassdoor also reported that “data science” was a “top job” for the second consecutive year. As demand from employers grows, the urgency to fill a funnel of highly skilled data fiends becomes more critical. But there’s a reality gap. The same PwC report cites that only 23 percent of college graduates will have the necessary skills to compete at the level employers demand. A recent MIT survey found that 40 percent of managers are having trouble hiring analytical talent.


10 Things Managers Need to Know About Digital Strategy

Marc Andreessen famously proclaimed that software is eating the world and that the best software companies will win in every sector. Software and digitalization strategies are impacting every aspect of business and changing the rules of the game on every front. In their purest form, we have the Digital Titans: Alphabet, Amazon, Apple, IBM, Tencent, Baidu, and Alibaba. These titans operate solely in the digital domain and have become a threat to incumbent firms everywhere. Incumbents have to respond by conceiving their own digital strategies.

“Incumbent firms need a digital strategy to respond to the threat posed by Digital Titans”

Take the case of Uber, an up-and-coming Digital Titan. Its founder, Travis Kalanick, came up with an alternative to the taxi service when he wondered whether the many residents with cars could make a quick buck by giving him a ride back to his hotel. His core idea emerged right then, and Uber was born. His plan was for Uber to work like a taxi service for passengers and a referral service for drivers. Since every ride seeker would have an Android, iOS, or Windows phone, an app could connect riders with drivers using their phone’s GPS capabilities.

This idea evolved to include user reviews and driver ratings, which took care of the quality of service, and predictive algorithms, which removed the question of when the ride would actually arrive. In addition, Uber processes all payments involved, charging the passenger’s credit card, taking a cut for itself (which ranges from 5% to 20%), and depositing the remaining money into the driver’s account, all in the background and completely cashless. This simple idea of matching excess capacity to demand is now valued at over $60 billion. Uber is the quintessential example of a company that has mastered a particular domain (ride sharing) without investing in physical assets. The age of data and algorithms has announced its arrival to the general manager. What lessons can managers take from these case studies to compete with Digital Titans like Alphabet?

A manager today has to understand the following 10 strategic aspects of digital transformation:

1.    Collect Data to Understand Your Customers

Digital Titans have shifted our attention from the supply chain aspects to the consumption side. Collecting behavioral data from customers is becoming commonplace. Uber gets information about the customer’s credit card, preferred destinations, and locations – even when the customer is not using the Uber app. Ford has the ability to collect information about the driver of the car and their preferences. This allows Ford and Uber to partner with others and create products and services that are of immense value to their existing customer base. For example, Uber has entered the business of transporting patients between hospitals or to their homes using their superior scheduling algorithm. The same service can be used to deliver groceries. It is because of the many possible experiences Uber can deliver that it is given such a high valuation by investors.

2.    Leverage the Virtual Product

With the arrival of IoT and sensors, it is now possible to collect information about a product and create a digital version of it. This digital avatar of the product can be manipulated easily to test changes and to create new services and features. Furthermore, companies can use simulation and statistical models to understand what the digital customer prefers and then decide whether to offer those products or services. When Boeing designs a new aircraft model, it creates a digital version and uses it to source parts and to ensure that integration with suppliers meets high tolerance standards.

3.    Redefine Product Upgrades

Just as digital products are constantly updated and modified, physical products like cars can be made to behave like digital ones. Unlike software, which is upgraded continuously, automobiles have traditionally been upgraded only every four to six years. When a Ford car can add apps to its interface, each app provides the driver with a different user experience. With software and apps, car companies can upgrade cars the way software companies upgrade software. Tesla, for example, is able to download software into the car and change its functioning overnight. The same principle can be applied to any networked product whose behavior is driven by software.
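An over-the-air (OTA) upgrade of a networked product, of the kind described above, might be sketched roughly as follows. The class and field names are hypothetical, and a real OTA pipeline adds update signing, rollback, and staged rollout:

```python
# Minimal over-the-air (OTA) update sketch (hypothetical names; a real
# pipeline would add cryptographic signing, rollback, and staged rollout).
class Vehicle:
    def __init__(self, firmware_version: str, features: set):
        self.firmware_version = firmware_version
        self.features = set(features)

    def apply_update(self, update: dict) -> None:
        """Install a downloaded update: bump the version, enable new features."""
        if update["version"] <= self.firmware_version:
            return  # already up to date; ignore stale updates
        self.firmware_version = update["version"]
        self.features |= set(update.get("new_features", ()))


car = Vehicle("2024.1", {"cruise_control"})
car.apply_update({"version": "2024.2", "new_features": {"lane_assist"}})
print(car.firmware_version, sorted(car.features))
# 2024.2 ['cruise_control', 'lane_assist']
```

The point is the business model, not the mechanics: once behavior lives in software, a product's feature set can change overnight without a recall or a dealer visit.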

4.    Use Analytics and Experimentation

Digital Titan Alphabet is known for its obsession with data and for using analytics to make product decisions. Every feature it releases is tested with its user base, and the test results dictate whether the feature is added to the company’s offering or discarded. In fact, entire product decisions are based on usage data. As another example, Ford ran 25 mobility experiments to decide which new services to add to its portfolio. These experiments were run on a global basis and resulted in several new products and services.
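The feature-testing discipline described above boils down to a simple A/B decision. The numbers and the lift threshold below are illustrative, not real usage data: expose the feature to a test group, compare its conversion rate against a control group, and keep the feature only if the lift clears the threshold.

```python
# Minimal A/B decision sketch (illustrative numbers, hypothetical threshold).
def conversion_rate(conversions: int, users: int) -> float:
    return conversions / users


def keep_feature(control, variant, min_lift: float = 0.02) -> bool:
    """Keep the feature only if the variant beats control by at least min_lift."""
    lift = conversion_rate(*variant) - conversion_rate(*control)
    return lift >= min_lift


# (conversions, users) for each group
control = (480, 10_000)   # 4.8% baseline conversion
variant = (720, 10_000)   # 7.2% with the new feature enabled

print(keep_feature(control, variant))  # True: the lift clears the threshold
```

A production experiment would add a statistical significance test (e.g., a two-proportion z-test) before shipping the winner, but the decision loop is the same: release, measure, keep or discard.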

5.    Treat Data as a Product

Digital is largely about data. Every company has information about its products, customers, and environment. The data it collects should be treated as an asset: it should have high quality, security, and access rights. Once these basics are in place, companies can provide interfaces (i.e., APIs) for third parties to use their data. These interfaces can also help companies infer the value of their data assets by providing controlled access for others to experiment. Facebook’s social graph, for example, is a data asset that Facebook uses to provide superior advertising services to its customer base.

“Companies should treat data as an asset”
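Exposing a data asset through an interface with access rights might look like the following sketch. The dataset names, keys, and the `read_dataset` function are all hypothetical: the company keeps the asset internal and grants third parties controlled, key-scoped access.

```python
# Minimal data-API sketch with per-key access rights (all names hypothetical).
DATASET = {
    "products": [{"sku": "A-1", "price": 19.99}],
    "customers": [{"id": 42, "segment": "retail"}],  # more sensitive data
}

# Which datasets each API key has been granted access to.
ACCESS_RIGHTS = {
    "partner-key-1": {"products"},
    "internal-key": {"products", "customers"},
}


def read_dataset(api_key: str, name: str):
    """Return a dataset only if the key is authorized to read it."""
    allowed = ACCESS_RIGHTS.get(api_key, set())
    if name not in allowed:
        raise PermissionError(f"key not authorized for dataset {name!r}")
    return DATASET[name]


print(read_dataset("partner-key-1", "products"))
# [{'sku': 'A-1', 'price': 19.99}]
```

In practice this gate would sit behind an HTTP API with authentication, quotas, and audit logging, but the principle is the one in the text: treat data as an asset and meter access to it.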

6.    Understand Algorithms as Codified Know-How

Autonomous vehicles, manufacturing robots, recommendation services, and digital assistants are driven by algorithms – encoded rules of operation guiding how customers interact with products and services. In the era of digital customer experiences, algorithms are strategic organizational assets accumulating organizational learning, perfected through experimentation. Digital Titans Amazon and Alphabet put great effort into improving and protecting their algorithms, as they harbor the essence of the digital experiences these companies offer to their customers. Furthermore, algorithms encapsulate decision biases and need robust governance systems. Managers in software-driven companies need to understand the algorithms hidden in their workflows and systematically analyze them to eliminate bias.

“Algorithms are strategic organizational assets accumulating organizational learning, perfected through experimentation”
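A first-pass audit of the decision bias mentioned above can be as simple as comparing an algorithm's outcome rates across groups. The data below is synthetic and the functions are illustrative; real algorithmic governance goes much further (fairness metrics, error-rate parity, human review):

```python
# Minimal bias-audit sketch: compare an algorithm's approval rates by group.
from collections import defaultdict


def approval_rates(decisions):
    """decisions: list of (group, approved) pairs -> approval rate per group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}


def disparity(decisions):
    """Gap between the best- and worst-treated groups."""
    rates = approval_rates(decisions)
    return max(rates.values()) - min(rates.values())


# Synthetic decision log: (group label, approved?)
log = [("A", True)] * 80 + [("A", False)] * 20 \
    + [("B", True)] * 60 + [("B", False)] * 40

print(approval_rates(log))        # {'A': 0.8, 'B': 0.6}
print(round(disparity(log), 3))   # 0.2 -> a gap worth investigating
```

A disparity this large does not prove the algorithm is unfair (the groups may differ in legitimate ways), but it is exactly the kind of signal a governance system should surface for systematic analysis.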

7.     Be a Digital Innovator

Digital Innovators use modern infrastructure (frequently provided by the Digital Titans) as interlocking building blocks to conceive new and interesting products and services. They use the same infrastructure to perform low-cost experimentation and improve the velocity of products to market. These digital natives know how to use their networks to find resources, mentors, and partners. A manager who trains to become a digital innovator is limited only by their imagination. A manager who hires digital innovators and trains employees to innovate with digital technologies prepares the organization for a digital transformation.

8.    Strive for Platform Business Models

A platform company has a product that performs a vital function of use to many in the ecosystem. It makes it easy for third parties to write applications that run on top of the platform. Finally, it has a vibrant ecosystem that allows developers to thrive and earn credentials. A classic example is Apple and its app marketplace. Developers learn Swift and write apps for third parties looking for applications that run on iOS. As more apps are written, the value of each app goes up along with that of the platform. In addition, developers specialized in the platform are sought after by companies. This virtuous cycle allows companies with a platform business model to dominate many markets.

9.    Think Ecosystems, Not Competitors

The rules of digital competition are different. Companies compete as part of a team in an ecosystem, and partnerships are critical to ensuring competitiveness and securing success. A software company like Workday develops software using tools provided by vendors like Microsoft and uses Amazon to host its software in the cloud. It also relies on hardware manufactured by Apple or Lenovo to display the reports its software generates. This is the general logic by which every company will operate going forward. The ecosystem model creates dependencies and unique positions for companies to leverage. Similarly, changes in dependencies cause shifts in the power structure within ecosystems.

10. Create a “Digital You”

Millennials joining the workforce today are expected to have 20 jobs during their careers. In addition, the skills required by each job may vary and they are expected to adapt. The future workforce needs an online presence and a strong reputation on sites like LinkedIn, Quora, and StackOverflow to present credentials to recruiters. Companies need to be ready to use these networks and accept the new forms of credentialing. In addition, recruiting today is more of a pull than a push strategy. Techniques like inbound marketing, pioneered by Hubspot, are key for both the workforce and companies looking for talent; creating a strong online presence is critical for all the agents in a digital ecosystem.

Top 25 Must-Read Cloud Computing Blogs

Whether you use AWS or Google Cloud Platform, a hybrid environment or private cloud, you can find the perfect blog in our list of the Top 25 Cloud Computing Blogs.

Cloud technology is evolving at rapid-fire speed, as evidenced by the investments behind the industry. Worldwide spending on cloud services is projected to grow from $70B in 2015 to an estimated $141B in 2019.

With all this skyrocketing growth, how do you stay up-to-date on the latest, most important cloud news? We scoured the web for the best cloud computing blogs out there, from the top industry experts to lesser-known but equally valuable voices.

Download our free, annual cloud migration survey report & discover the migration plans of 256 companies across the globe.

Whether you’re looking for practical tips to get the most out of the cloud, curious about a particular subcategory, or interested in joining a community of millions of cloud consumers, these top 25 blogs have you covered. (One even comes with laugh-out-loud cloud computing comics!) Read on to discover which blogs you should be following.

1. Infoworld — Cloud Computing

Infoworld’s outstanding cloud computing blog is written by David Linthicum, a consultant at Cloud Technology Partners and a sought-after industry expert and thought leader. His Infoworld blog is exclusively devoted to cloud computing and updated frequently. David’s recent blog ‘Featuritis’ Could Lead You to the Wrong Cloud recommends that enterprises concentrate on strategy and not features when making cloud service choices.

2. All Things Distributed

All Things Distributed is written by the world-famous Amazon CTO Werner Vogels. His blog is a must-read for anyone who uses AWS. He publishes sophisticated posts about specific AWS services and keeps his readers up-to-date on the latest AWS news. Recent blog posts include: Accelerating Data: Faster and More Scalable ElastiCache for Redis and New Ways to Discover and Use Alexa Skills.

3. Cloud Tech

CloudTech is a leading blog and news site that is dedicated to cloud computing strategy and technology. With authors including IBM’s Sebastian Krause, Cloudonomics author Joe Weinman, and Ian Moyse from the Cloud Industry Forum, CloudTech has hundreds of blogs about numerous cloud-related topics and reaches over 320,000 cloud computing professionals. A recent post How the Financial Services Industry Is Slowly Waking Up to Cloud Computing by Rahul Singh of HCL Technologies provides an interesting analysis of how banks can overcome the barriers to cloud migration.

4. THINKstrategies

The innovative THINKstrategies blog is written by Jeff Kaplan, who is its Managing Director. A leading expert in cloud computing, Jeff is a frequent guest blogger and keynote speaker on SaaS, managed services, and IOT. In a recent post, Deconstructing the Software Business, Jeff analyzes the effect of cloud computing and SaaS on major industry players.

5. Diversity Limited

The Diversity Limited blog is written by Ben Kepes, who is a technology evangelist, investor, commentator, and business advisor. The blog provides great analysis of all the latest cloud-related tech news and commentary on newly released business software. Despite being a one-man show, the blog includes hundreds of articles and is sometimes updated a few times a day. Recent posts include: Oracle Delivers Document Signing — a Big Win for HelloSign and The Worst-Kept Secret Ever: Facebook Launches Workplace, SADA a Launch Partner.

6. Compare the Cloud

Compare the Cloud is one of the most active and extensive cloud blogs available. Its posts come from numerous writers across the cloud industry, and the blog itself reaches more than 12 million cloud technology consumers. One of the blog’s latest posts, Cloud Computing: A Revolutionary Concept or Overhyped Trend?, discusses the origins of cloud computing and was written by Dan Radak, an expert in web hosting security.

7. Cloud Chronicles

Cloud Chronicles, which appears on IDG’s Network World blog section, is written by Brandon Butler, who is a Senior Editor. Cloud Chronicles discusses major technological developments, deployments, and innovations related to the cloud. In an especially interesting post, Which Is Cheaper: Containers or Virtual Machines?, Brandon discusses how application containers fit into the enterprise technology landscape and compares them to virtual machines.

8. Cloud Pundit

Cloud Pundit is the blog of Lydia Leong, who covers cloud computing for Gartner as a Distinguished Analyst. This in-depth blog often discusses the findings and opinions that Lydia generates from her professional analyses, and is updated approximately once a month. Oracle’s Next-Gen Cloud IaaS Offering is one of Lydia’s latest posts and discusses Oracle’s highly anticipated Infrastructure as a Service.

9. Cloudscaling

The highly insightful Cloudscaling blog is written by Randy Bias, who is the VP Technology at Dell EMC and a Director at the OpenStack Foundation. The Cloudscaling blog publishes a new post approximately once per month, and deals with various aspects of cloud computing, from technical details to industry events. In a fascinating recent entry, The History of Pets vs. Cattle and How to Use the Analogy Properly, Randy explains how he became the instigator of the pets vs. cattle cloud meme and aims to set the record straight about its history and proper usage.

10. Cloud Musings

Kevin Jackson is a renowned thought leader, speaker, and consultant for cloud-related technologies and business strategies. Kevin’s blog discusses issues related to cloud computing and cyber security, and includes news on product launches, acquisitions, and conferences. In a recent post, #KnowYourData: The Key to Business, Kevin writes about the Trusted Analytics Platform (TAP), an open source project that Intel developed to make it easier and less expensive for developers to deploy custom analytics solutions in the cloud.

11. Rickscloud

Rickscloud is one of the Internet’s most popular sources for cloud industry information. It is written by Rick Blaisdell, who is the CTO of Motus LLC, which deals in cloud computing integration, information systems, and IT services. He updates his blog every few days and covers numerous cloud related topics such as adoption, best practices, security, and trends. Recent posts include: From Cloud Computing to Fog Computing and Running Wild – ERP Systems in the Cloud!

12. AWS Blog – Jeff Barr

There are actually numerous Amazon Web Services blogs according to region and technology, but the main blog is run by Jeff Barr, who is the Chief Evangelist for AWS. Jeff is an amazingly prolific writer, and his blog has hundreds of articles. Jeff’s posts include everything from industry news and events to technical descriptions on how to make the most of AWS. Whether you are an AWS novice or ninja, you should definitely follow Jeff’s blog. Recent posts include: New AWS Quick Starts for Atlassian JIRA Software and Bitbucket Data Center and IPv6 Support Update – CloudFront, WAF, and S3 Transfer Acceleration.

13. CloudTweaks

CloudTweaks is one of the most informative cloud blogs on the Internet. In addition to numerous contributors and a lengthy list of posts, CloudTweaks has various other forms of content, such as statistics, infographics, event announcements, and – you guessed it – cloud comics! If you don’t click on any other links in this post, I strongly recommend you at least check out CloudTweaks’s technology comics, which are both funny and insightful.

14. Cloudcast

The Cloudcast blog is quite unique in the cloud blogosphere because it consists of audio blogs instead of written posts. The podcasts are created by Brian Gracely, a technology expert at Red Hat OpenShift, a container application platform, and Aaron Delp, the Director of Solutions at NetApp SolidFire, which creates flash arrays for data centers. One of Cloudcast’s latest podcasts, Multi-Cloud Serverless Platform, addresses the need for multi-cloud platforms.

15. Microsoft Azure Blog

The Microsoft Azure blog has posts by numerous Azure staffers who are part of the company’s integrated cloud services initiative. This is a highly extensive blog, and contains over 2,500 posts covering product news and features, as well as industry events. Latest posts include Cloud Innovations Empowering IT for Business Transformation and New Security, Performance and ISV Solutions Build on Azure HDInsight’s Leadership to Make Hadoop Enterprise-Ready for the Cloud.

16. Google Cloud Platform Blog

Google Cloud Platform’s blog contains hundreds of articles written by Google cloud experts, and actually dates back to 2008. This vast blog discusses the products, customers, and technical features of Google’s cloud solution, with articles ranging from product blurbs to extremely detailed technical explanations. A recent post, Evaluating Cloud SQL Second Generation for Your Mobile Game, describes how Google’s managed database service can be applied to the special needs of game development.

17. Cloud Cruiser

Cloud Cruiser is a great blog, especially for those who are interested in the financial side of hybrid IT. Sometimes corporate blogs are thin on content and heavy on promotion, but Cloud Cruiser’s blog contains truly valuable insights on how to get the most out of your hybrid cloud ecosystem. Recent posts include Cloud Smart in 6 Steps – Don’t Get Out(cloud)smarted and 5 Tips to Minimize Risk and Optimize AWS Reserved Instances.

18. Thoughts on Cloud

IBM’s Thoughts on Cloud blog provides “insights, news, and analysis for the cloud community.” Each blog post is conveniently labeled according to subcategory, for example, Big Data, Hybrid, and Infrastructure. The blog itself contains approximately 1,700 posts, authored by various cloud technology experts at IBM. Recent posts include: Why Businesses Shouldn’t Settle on a Storage Solution and 3 Ways to Avoid Failure in Application Deployment.

19. Cloud Source

The Cloud Source blog is authored by Christian Verstraete, a Chief Technologist for Cloud at Hewlett Packard Enterprise. The Cloud Source blog contains dozens of insightful posts that look at cloud computing challenges, practical approaches, and realistic solutions. In Batch to the Future – Part 1, Christian discusses the potential of cloud applications to go beyond the current conception of them as merely a way to run micro services.

20. Hyperscale Cloud

Hyperscale Cloud is Ericsson’s blog for the cloud industry. The blog covers numerous cloud computing topics and involves more than 50 contributors, and so is a comprehensive source of frequently-updated information. Two especially insightful posts recently published on this blog are Containers and Removing Barriers to their Adoption and Cloud Lock-In May Not Be What You Think.

21. Paul Miller’s Blog

Paul Miller is a Senior Analyst Serving CIOs at Forrester. His research focus is on the opportunity for digital transformation enabled by cloud-based approaches. Paul’s blog covers various areas of cloud computing, including cloud brokers, cloud storage, machine learning, and open source. His blog comprises synopses of valuable reports and briefs that he has written for Forrester. If you are a CIO or just want to get high-level, well-researched cloud analysis, you should definitely follow Paul Miller. Recent posts include Bespoke Vertical Clouds Become Less Important as Public Clouds Do More and Streaming Data From the Internet of Things Will Be the Big Data World’s Bigger Second Act.

22. Hurwitz & Associates Blog

The blog of the Hurwitz & Associates technology consulting firm is written by a few senior experts at the company, including CEO Judith Hurwitz and Vice President Jean Bozman. The blog is updated a few times each month and focuses on cloud computing, but also contains articles on big data and other topics. Judith recently wrote about IBM’s journey to the cloud in Turning Its Strategy on Its Head: Why IBM Is Leading With Cloud Services while Jean published a piece about data gravity in Data Gravity: Move the Apps to the Data – Not the Other Way ‘Round.

23. Talkin’ Cloud

Some of you may have heard of the Talkin’ Cloud blog as a result of its annual list of top 100 cloud service providers: Talkin’ Cloud 100. What you may not know is that Talkin’ Cloud is also a great resource for news, blogs, slide shares, webcasts, and special reports from across the Internet. If you like what you see when you check out this blog, it’s worth signing up for its weekly e-newsletter. Recent posts include 7 Must-Knows About Disaster Recovery and the Cloud and The 50 Most Influential Women in IoT.

24. BriefingsDirect

If you are looking for in-depth analysis, interviews, and case studies related to enterprise IT transformation, this is the blog (and podcast) for you. In his well-respected blog, Dana Gardner delves into the meatiest topics of cloud computing with hands-on experts from across the industry. In a recent online discussion Dana explored How Propelling Instant Results to the Excel Edge Democratizes Advanced Analytics with HTI Labs CEO Darren Harris and CTO Jonathan Glass. Other recent posts worth checking out include How Always-Available Data Forms the Digital Lifeblood for a University Medical Center and Feedback Loops: The Confluence of DevOps and Big Data.

25. CloudEndure

Last but not least, we’d like to include our own CloudEndure blog. If you’re interested in cloud-based disaster recovery or cloud migration, this is a great resource for best practices, latest trends, events, books, and more. Every quarter, CloudEndure publishes a comprehensive downtime report of the 100K most-visited websites as well as a list of the Top 9 Outages That Made Headlines. A recent post, 5 Experts Predict Cloud Computing Trends for 2017, garnered over 700 shares on LinkedIn – definitely worth checking out!

Moving to the cloud? Don’t make these 3 big mistakes

Other enterprises have made big mistakes in their cloud deployments. Here they are, so you can avoid making the same cloud-migration mistakes.

Most of the success with your cloud deployments comes from avoiding the errors that have cratered several cloud projects. I don’t want those of you reading this blog to repeat the same mistakes.

Here are three common mistakes in cloud deployments that you can, and should, avoid making.

Cloud mistake No. 1: Chasing the new shiny objects

No matter if it’s serverless or containers, enterprises love what is new and hip. Although both serverless cloud computing and containers like Docker have a great deal of value, I often see them used in the wrong places for the wrong use cases.

Any new technology that’s drawing the attention of the tech press—such as machine learning, deep learning, containers, internet of things, and serverless computing—needs to have a good business case which connects to a good use case. Without both a good business case and good use case, you could be forcing square pegs into round holes—no matter how cool it is.

Cloud mistake No. 2: Not considering devops

Moving to cloud? Then you should move to a devops way of life, and to devops-enabling technology, as soon as you can. Why? Because it will make your cloud computing migration more agile and faster.

The fact of the matter is that many enterprises moving to cloud reduce the value they get from cloud because their existing application development processes are still traditional waterfall. Although the waterfall approach works, it does not provide the ability to continuously improve application workloads or speed the production and deployment of new applications.

Cloud mistake No. 3: Hiring for budget, not for talent

No matter if you’re using a consulting service or hiring directly, if you’re going cheap on cloud computing talent, you’ll get resources that will likely do more harm than good. Saving $1 million in salary can cost you $100 million in avoidable screw-ups. I see this every day.

Phishing Scams Even Fool Tech Nerds—Here’s How to Avoid Them

YOU KNOW NOT to click on links in sketchy emails. Everybody knows that. And yet, people fall for these phishing attacks all the time. Case in point: The FBI suspects a phishing email is how the Russian hackers who were indicted this week got into Yahoo. Ditto for the breach of the Democratic National Committee, and the Sony Pictures hack. In fact, there’s currently a Gmail phishing scam going around that even super savvy techies are falling for.


VIETNAM ICT SECTOR BRIEFING 2016

In the International Telecommunication Union’s Global ICT Development Index (IDI) 2015, Vietnam was ranked 102nd(1) out of 167 countries. This placed the country at the 6th(2) position in ASEAN and the 17th(3) in Asia-Pacific.

The Global Information Technology Report 2015 by the World Economic Forum showed that Vietnam’s Network Readiness Index (NRI) ranked 85th(4) among 143 countries. Vietnam ranked 2nd(5) worldwide in terms of network affordability, which consists of three variables: mobile cellular tariffs, fixed broadband Internet tariffs, and the Internet and telephony sectors competition index. The index assesses the cost of accessing ICT, either via mobile telephony or fixed broadband internet, as well as the level of competition in the internet and telephony sectors.

According to Cushman & Wakefield’s Business Process Outsourcing and Shared Service Location Index 2015, one sector that has witnessed remarkable levels of growth is the IT software industry. Vietnam is now home to over 1,000 software companies(6) that employ over 80,000 people(7). This makes Vietnam one of the world’s largest software exporters and the second-largest software outsourcing destination for Japan(8). Ho Chi Minh City and Hanoi rank 18th(9) and 19th(10), respectively, in the Tholons 2016 Top 100 Outsourcing Destinations.


Report: Southeast Asia’s internet economy to grow to $200B by 2025

There isn’t a lot of data on the potential of technology in Southeast Asia, which is why many in the industry have been excitedly flipping through a new report released today by Singaporean sovereign wealth fund Temasek and Google. The biggest takeaway: Southeast Asia’s internet economy could surge to be worth a massive $200 billion annually within ten years.


How the Trans-Pacific Partnership benefits Vietnam’s economy

Vietnam’s membership in the Trans-Pacific Partnership (TPP) will yield economic benefits, especially to the country’s manufacturing sector. Vietnam’s textiles and apparel industry will enjoy expanded access to the US and Japan markets through reduced tariff duties as a result of TPP once it has been enacted, accelerating foreign direct investment into the country.
However, as highlighted in a white paper by Solidiance, an Asia-focused management consulting firm, strategic development of supporting industries (raw materials & machinery) and accompanying infrastructure (ports, construction & logistics) will be needed to fully absorb TPP’s benefits for the economy.
Drivers behind Vietnam’s benefits from TPP
Vietnam’s manufacturing environment is well-positioned to benefit from TPP’s passage due to three primary factors:
1. Large trade volumes with the US and Japan
2. Competitive manufacturing environment
3. Tariff cuts of key export and import products
As TPP signatory countries account for around 40 percent of Vietnam’s total exports, the TPP’s passage will not only accelerate Vietnam’s exports to TPP member countries, but also increase the country’s total exports by an additional USD 68 billion by 2025.

In addition to Vietnam’s competitive manufacturing environment, export-oriented manufacturers will be drawn to Vietnam as a result of TPP. This will further enhance the country’s attractiveness, especially in the textile & apparel supporting industries, where manufacturing facilities had already been set up in Vietnam prior to the signing of TPP in anticipation of the agreement. Under TPP’s yarn-forward regulations, for Vietnam to take full advantage of reduced tariffs, textile & apparel inputs need to be sourced from a TPP member country.

Potential impacts of TPP on Vietnam’s manufacturing
Once TPP goes into effect, current tariff rates for textile, garments, and apparel exports from Vietnam to the US (7.9 percent on average for textiles and 11.4 percent for clothing) will be gradually reduced to zero, allowing for expanded market access to the US and Japan for Vietnam-based companies.
Because of this expanded market access, TPP will attract additional Foreign Direct Investment (FDI) to Vietnam and drive further investments in an increasingly competitive business environment, with FDI expected to reach around US$20 billion by 2020, since more standardized business and policy environments are among TPP’s requirements.
When large domestic and foreign investments pour into Vietnam’s economy following major trade agreements, manufacturing facilities tend to scale-up to take advantage of economies of scale. This has the potential to benefit Vietnam’s manufacturing sector and lead to an increase in production scale and industrial deepening, which ultimately drives productivity growth.
Moreover, rising FDI will fuel the development of upstream suppliers and manufacturers in supporting industries following TPP’s implementation. To illustrate, in recent years, large electronics manufacturers have expanded their production base in Vietnam, creating potential market opportunities for local parts and component suppliers.

As the agreement is being implemented, key export manufacturing industries, like textile & garment, footwear and fishery, among others, will enjoy rapid outsized growth.
In 2015, disbursed FDI reached a record high of $14 billion, at least in part attributable to anticipation of TPP. At present, more than $1 billion of investment in garment & textile supporting industries has already been injected into Vietnam’s growing economy, with investors from China, Taiwan, Japan, South Korea, and India setting up specialized industrial zones for garment & textile material production, yarn factories, packaging facilities, and more.
Vietnam is one of Asia’s fastest growing economies, with promising business opportunities to follow. While Vietnam’s manufacturing industry has already made impressive strides, TPP’s impact will further boost Vietnam’s manufacturing growth. Leading industry players need to take note of this momentum and define key strategies to capitalize on Vietnam’s burgeoning manufacturing sector.
Source: http://www.thanhniennews.com/commentaries/how-the-transpacific-partnership-benefits-vietnams-economy-60650.html

 

Vietnam: Pushing the tempo (H1/2015)

The year began on a positive note for Vietnam. Economic activity was strong in Q1, setting the stage for acceleration in annual GDP growth. The economy is benefitting from a recovery in domestic demand and foreign direct investment (FDI) inflows that continue to support investments and exports. Besides, low global oil prices have pushed consumer price pressures lower. With inflation below the central bank’s target, monetary policy is likely to remain accommodative. The economy will also benefit from reforms aimed at improving the health and efficiency of banks and state-owned enterprises. Policymakers are also trying to deepen economic engagement with key trading partners, a move that is likely to reap dividends in the medium to long term.

Vietnam’s economy has been quick off the blocks in 2015, growing 6.0% year-over-year in Q1 2015.

This is the fastest pace of growth for the first quarter in the last five years. Usually GDP growth in Vietnam accelerates through the year. Consequently, strong growth in Q1 2015 augurs well for the coming quarters. In Q1, industry and construction had the largest impact on GDP growth, contributing 2.8 percentage points. Services and agriculture contributed 2.4 and 0.3 percentage points, respectively.³