
Come on… Big Data! Big Data!

The Enterprise Data Architecture, Part 1

 

With the explosive growth of the Internet and, more recently, the Internet of Things (the rise of the machines…!!!), there has been a corresponding explosion in the amount of data collected from all those connected devices and software.  Along with the vast volume of data being generated, however, has come an even greater challenge in dealing with it.  As early as 2001, IT analysts began reporting on the potential issues that would arise from these mass amounts of data.  Most notably, an article by Gartner analyst Doug Laney summarized the various issues attached to Big Data (as it’s commonly called) into what’s known as the 3Vs:

  • Data Volume – The depth/breadth of data available
  • Data Velocity – The speed at which new data is created
  • Data Variety – The increasing variety of formats of the data

Now, 16 years after that article was written, we still face many of the same challenges.  Even though technology has improved to handle larger amounts of data processing and analytics, the amount of source data has continued growing at exponential rates.  Each year, Cisco releases a Visual Networking Index that shows the previous year’s data as well as predictions for the next few years.  In its 2016 report (Cisco, 2016), the prediction was that by 2020 there would be over 8 billion handheld/personal mobile-ready devices as well as over 3 billion M2M devices (other connected devices such as GPS, asset tracking systems, device sensors, medical applications, etc.) in use, consuming a combined 30 million terabytes of mobile data traffic per month.  And that is just mobile data.

   

So what are the real challenges being faced due to this exponential growth of data?  Here are some facts to consider, as posted by Waterford Technologies:

  • According to estimates, the volume of business data worldwide, across all companies, doubles every 1.2 years.
  • Poor data can cost businesses 20%–35% of their operating revenue.
  • Bad data or poor data quality costs US businesses $600 billion annually.
  • According to a global survey from Avanade, the influx of data is putting a strain on IT infrastructure, with 55% of respondents reporting a slowdown of IT systems and 47% citing data security problems.
  • In that same survey, by a small but noticeable margin, executives at small companies (fewer than 1,000 employees) were nearly 10% more likely to view data as a strategic differentiator than their counterparts at large enterprises.
  • Three-quarters (76%) of decision-makers surveyed anticipate significant impacts on storage systems as a result of the “Big Data” phenomenon.
  • A quarter of decision-makers surveyed predicted that data volumes in their companies would rise by more than 60% by the end of 2014, with the average respondent anticipating growth of no less than 42%.
  • 40% projected growth in global data generated per year vs. 5% growth in global IT spending.
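Those growth figures compound quickly.  Here is a rough back-of-the-envelope sketch (the 1.2-year doubling period and the 40% vs. 5% growth rates come from the list above; the starting value of 100 is an arbitrary placeholder):

```python
def projected_volume(initial, annual_growth, years):
    """Project a data volume forward under compound annual growth."""
    return initial * (1 + annual_growth) ** years

# Doubling every 1.2 years implies an annual growth rate of 2**(1/1.2) - 1,
# roughly 78% per year.
doubling_rate = 2 ** (1 / 1.2) - 1

# Starting from 100 units of data, five years at that rate:
print(projected_volume(100, doubling_rate, 5))   # ~1796

# The 40% data-growth vs. 5% IT-spending gap after five years:
print(projected_volume(100, 0.40, 5))            # ~538
print(projected_volume(100, 0.05, 5))            # ~128
```

The gap between those last two numbers is the heart of the problem: data is outpacing the budget available to manage it.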

Datamation put together this list of Big Data challenges, of which I want to highlight a few specifically.

  1. Dealing with Data Growth – As already mentioned above, the amount of data is growing year over year, so a solution that works today may not work well tomorrow.  Investigating and investing in technologies that can grow together with the data is critical.
  2. Generating Insights in a Timely Manner – Generating and collecting mass amounts of data is a challenge in its own right.  But more importantly, what do we do with all that data?  If the data being generated is not being analyzed and used to benefit the organization, then the effort is wasted.  New tools to analyze data are released every year, and these need to be evaluated to see if there are organizational benefits to be gained.
  3. Validating Data – Just as important as processing and analyzing the data is verifying its integrity.  This is especially important in the quickly expanding field of medical records and health data.
  4. Securing Big Data – Finally, the security of the data is a rapidly growing concern.  As seen in recent breaches such as the Equifax attack, the sophistication of hacking and phishing is growing at a rate to match the volume of big data itself.  If the sources of data cannot provide adequate security measures, the integrity of the data itself can be called into question, among many other issues.
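In practice, the validation challenge often comes down to mechanical integrity checks at the point of ingestion.  A minimal sketch in Python (the field names and the plausible-range rule here are hypothetical, not drawn from any actual health-records standard):

```python
def validate_record(record):
    """Return a list of integrity problems found in a single data record."""
    problems = []
    # Required fields must be present and non-empty.
    for field in ("patient_id", "timestamp", "value"):
        if not record.get(field):
            problems.append(f"missing field: {field}")
    # Range check: a heart-rate reading outside plausible bounds is suspect.
    value = record.get("value")
    if isinstance(value, (int, float)) and not (20 <= value <= 250):
        problems.append(f"value out of range: {value}")
    return problems

good = {"patient_id": "p1", "timestamp": "2017-09-01T10:00:00", "value": 72}
bad = {"patient_id": "p1", "value": 9000}
```

Checks this simple catch a surprising share of bad data before it ever reaches the analytics stage, which is far cheaper than discovering it afterward.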

Big Data is here to stay, and it’s only going to get bigger.  Is your company ready for it?

 

References:

Cisco (2016). Cisco Visual Networking Index. Retrieved from http://www.cisco.com/c/en/us/solutions/service-provider/visual-networking-index-vni/index.html

 


Let’s get SaaSsy…!

The Enterprise Application Architecture, Part 1

SaaS.  PaaS.  IaaS.  The Cloud.

What is this gibberish?  Is it some foreign or mystical language?  What does it mean…?

First of all, let’s clear the air and make one thing absolutely clear:

In its simplest form, cloud computing or “The Cloud” is just a fancy term for distributed computing.  Some aspect of the system, whether it’s the storage, data, application, server, etc., is accessed remotely, typically via the Internet.  Back in the 1990s, this same concept existed, but it required dedicated connections, either via leased lines or VPN services over existing communication channels.  The cost of those services was substantial enough that only larger organizations could make them practical for everyday use.  As technology became faster and cheaper, distributed computing services gradually became more available, and with the rapid growth and adoption of the Internet starting in the mid-90s, more computing services were made available to individual consumers.  Once service providers realized the potential revenue from this technology, it wasn’t long before entire businesses were built around the provision and sale of “cloud” computing services.  Amazon’s 2006 debut of Elastic Compute Cloud (EC2) pushed the technology further; today there are many large-scale providers of cloud services, and “the cloud” is a common term both within the IT industry and among consumers.

Building on that basic framework of cloud computing, there are various levels of service that can be provided (Wikipedia – Cloud Computing):

  • SaaS – Software as a Service: The capability provided to the consumer is to use the provider’s applications running on a cloud infrastructure. The applications are accessible from various client devices through either a thin client interface, such as a web browser (e.g., web-based email), or a program interface. The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, storage, or even individual application capabilities, with the possible exception of limited user-specific application configuration settings.
  • PaaS – Platform as a Service: The capability provided to the consumer is to deploy onto the cloud infrastructure consumer-created or acquired applications created using programming languages, libraries, services, and tools supported by the provider. The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, or storage, but has control over the deployed applications and possibly configuration settings for the application-hosting environment.
  • IaaS – Infrastructure as a Service: The capability provided to the consumer is to provision processing, storage, networks, and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, and deployed applications; and possibly limited control of select networking components (e.g., host firewalls).
(Image Source: Wikipedia – Cloud Computing)

 

So SaaS specifically is the distribution of applications/software.  Typically, a SaaS application is accessed and delivered via a web browser.  Prime examples of SaaS applications that many of us use today are Dropbox and Google Drive: a service provides access to files that are stored remotely on a server somewhere.  We are not directly accessing that remote server; instead, it’s the application (whether a desktop application or a web page) that lets us access our files.  Google Drive is part of a larger application group that also includes email, calendar, chat, word processing, spreadsheets, presentations and forms.  Other large companies such as Microsoft and Amazon, as well as vendors such as Workday, Salesforce and Concur, have all quickly joined the SaaS marketplace with solutions of their own.
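Under the hood, accessing a SaaS application from code rather than a browser usually reduces to authenticated HTTPS calls against the vendor’s program interface.  A minimal sketch (the `/v1/files` endpoint, the bearer token, and the JSON response are all invented for illustration; real vendors document their own APIs):

```python
import json
import urllib.request

def build_request(base_url, token):
    """Build an authenticated request to a hypothetical SaaS file-listing API."""
    return urllib.request.Request(
        f"{base_url}/v1/files",
        headers={"Authorization": f"Bearer {token}"},
    )

def list_files(base_url, token):
    """Call the hypothetical API and decode its JSON response (needs network access)."""
    with urllib.request.urlopen(build_request(base_url, token)) as resp:
        return json.load(resp)
```

The point is that the consumer only ever touches the application interface; the servers, storage and operating systems behind the endpoint remain the provider’s concern.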

So what are the benefits of pursuing a SaaS strategy?  Here is a short list as compiled by IBM:

  1. Lower Cost – First, SaaS applications are subscription based, with typically no upfront licensing fees or initial infrastructure costs.  This usually translates to lower overall costs compared to implementing a similar traditional application.
  2. Reduced Time to Live – Secondly, there is a drastic difference in time-to-live.  Since no hardware or backend processes need to be installed and configured ahead of time, the time for the application to go live is shortened.  In addition, the simple fact that the application is already installed and running on the remote servers reduces the time as well.  In most cases, configuration and setup is a simple process that takes at most a couple of hours.
  3. Painless Upgrades – Since the actual application resides on a remote server maintained by the vendor, the end user doesn’t need to take any action to update the software.  Updates are managed remotely on the back end and are typically seamless to the end user.
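The cost argument in point 1 is easy to sanity-check with a simple total-cost-of-ownership comparison.  All the dollar figures below are invented placeholders; the shape of the comparison is the point, not the numbers:

```python
def tco_saas(monthly_fee, months):
    """Total cost of a subscription service: no upfront spend."""
    return monthly_fee * months

def tco_on_premise(license_fee, hardware, annual_maintenance, months):
    """Total cost of a traditional deployment: upfront spend plus maintenance."""
    return license_fee + hardware + annual_maintenance * (months / 12)

# Hypothetical: a $500/month subscription vs. a $20k license, $10k of hardware,
# and $4k/year in maintenance.
for months in (12, 36, 60):
    print(months, tco_saas(500, months), tco_on_premise(20_000, 10_000, 4_000, months))
```

Note that over a long enough horizon the subscription can eventually overtake the one-time purchase, which is exactly why the decision has to be weighed against each company’s circumstances.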

However, it’s not all positive.  Two of the biggest issues that companies have with SaaS services are security and outages.  How secure is my data and business info?  What happens if hackers get into the data center where it is hosted?  What happens when the hosting location has an outage?  How will I be able to use the application during that time?  Will I be compensated for the outage?

Ultimately, the pros and cons have to be weighed against the specific circumstances that you envision for the SaaS services.  Each company will have different levels of needs and justifications that will determine whether it is the right solution for them or not.

 

 


Back in the Saddle again…

Digital Disruption, Part 1

This semester (Fall 2017), I will once again be blogging about my reading topics throughout the course.  The class focus this time is specifically Enterprise Architecture: Information Technology Architecture.

As I detailed roughly a year ago, Enterprise Architecture (EA) itself can simplistically be considered the intersection of business, strategy and technology.  But if we look deeper into the overall concept of EA, we find that it can easily be broken down into smaller domains or layers.  As detailed in this diagram created by Niles Hewlett from the USDA Enterprise Architect team back in 2006 (courtesy of Wikipedia), the 4 layers of EA are distinct, yet dependent on each other.

Throughout this course, we will dive deeper into each of the layers, exploring how they interact with each other and, more importantly, how different technologies and trends impact the overall architecture and each layer individually as well.

The Fast Pace of Technology Adoption

As we all know and frequently experience, technology is changing at an ever-increasing pace.  Today’s hot trends can quickly become tomorrow’s old news.  In considering this challenge, we need to be cognizant that these are two separate issues that compound each other.  First, advances in technology itself are accelerating.  As author Ray Kurzweil expounds in his book, The Singularity is Near, “Evolution (of technology) results in a better next-generation product.  That product is thereby a more effective and capable method, and is used in developing the next stage of evolutionary progress. It’s a positive feedback loop” (Kurzweil, 2006).  In other words, we are using better & faster tools to design and create better & faster tools.  Each improved end product shortens the next stage of development, which in turn shortens the development phase after that.

Secondly, adoption of those technology advances is accelerating as well.  From end-user consumption of modern technology to the use of technology in business, the differences we see compared to even 10 years ago are astounding.  As noted by Rita Gunther McGrath, “It took decades for the telephone to reach 50% of households, beginning before 1900.  It took five years or less for cellphones to accomplish the same penetration in 1990” (McGrath, 2013).  The chart below was assembled by The Economist from data provided in Kurzweil’s book.  It shows not only the fast adoption of cellphones and the Internet, but also other “new” technologies that have had a large impact on society over the past 150 years or so.
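Adoption curves like the ones in that chart are commonly modeled as logistic (S-shaped) functions: the steeper the growth parameter, the fewer years from introduction to 50% penetration.  A small illustrative sketch (the growth rates and midpoints below are invented, not fitted to the actual telephone or cellphone data):

```python
import math

def adoption(t, growth_rate, midpoint):
    """Logistic adoption curve: fraction of households adopting by year t."""
    return 1 / (1 + math.exp(-growth_rate * (t - midpoint)))

# A slow, telephone-like curve crosses 50% at its midpoint, decades in;
# a steep, cellphone-like curve crosses 50% within a few years.
slow_at_30 = adoption(30, growth_rate=0.15, midpoint=30)  # exactly 0.5
fast_at_5 = adoption(5, growth_rate=0.9, midpoint=5)      # exactly 0.5
fast_at_10 = adoption(10, growth_rate=0.9, midpoint=5)    # near saturation
```

The shrinking midpoint from one technology generation to the next is exactly the acceleration McGrath describes.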

As society rapidly adopts technology, this quickly filters into the business world as well.  An organization faces an even bigger challenge: integrating the new technology into its business practices and strategy without bringing about major disruptions to its operations.  More about that next….

 

References:

Kurzweil, Ray. (2006). The Singularity is Near: When Humans Transcend Biology. New York: Penguin Books.

McGrath, Rita Gunther. (2013). The Pace of Technology Adoption is Speeding Up. Harvard Business Review.  Retrieved August 28, 2017 from  https://hbr.org/2013/11/the-pace-of-technology-adoption-is-speeding-up

 

 
