Recap of the 2015 NCTA “State of Technology” Conference

Recently, I was able to attend the North Carolina Technology Association (NCTA) “State of Technology” conference. This conference brings together some of the leading companies and technologists for a look at the future of technology and North Carolina’s role in that future. The conference consisted of keynote speakers, panel discussions and breakout sessions that examined some of the leading trends of the day. During my time at the conference, I was able to sit in on the opening keynote by IBM’s SVP of Solutions Portfolio & Research and two breakout sessions on “Big Data” and “Cloud Technologies.”

NCTA Keynote: John Kelly of IBM

John Kelly, SVP of Solutions Portfolio & Research for IBM, delivered the keynote speech, a reprise of a talk he gave five years ago. He opened with a look back at the predictions he’d made in 2010: real-time data analysis, affordable personal genome mapping, and IBM Watson’s ability to compete on Jeopardy. I thought this was a creative way to lend legitimacy to tech predictions, which are usually made and then forgotten.

Kelly then turned to his predictions for the next five years. One data-related prediction stood out to me: data generation will continue to grow exponentially, and in five years people will be producing tens of zettabytes of information. (A zettabyte is one trillion gigabytes.) By that point, the ratio of unstructured to structured data will be enormous. Interpreting structured data usefully is already a big data challenge, and I believe tools to make use of this volume of unstructured data will be a key goal for the industry.
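As a quick sanity check on the units above, the "one trillion gigabytes" parenthetical falls straight out of the SI prefixes; a minimal sketch (illustrative arithmetic only, not from Kelly's talk):

```python
# Illustrative unit arithmetic only -- a sanity check on the zettabyte figure.
GB = 10**9   # one gigabyte, in bytes (decimal/SI definition)
ZB = 10**21  # one zettabyte, in bytes

gigabytes_per_zettabyte = ZB // GB
print(gigabytes_per_zettabyte)  # 1000000000000, i.e. one trillion
```

So "tens of zettabytes" means tens of trillions of gigabytes of data produced, which gives a sense of the scale the unstructured-data tooling would have to handle.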

Kelly also imparted a potentially planet-shaping insight: functional, practical quantum computers could be available in a few years. Why planet-shaping? Quantum computing power grows exponentially compared to conventional processing, which means it could take only a short while to surpass all the computing power that currently exists on the planet. This level of power should help resolve our big data issues, may allow companies to process and react to information in near real time, and will open doors to applications nobody has yet dreamt of.
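The "grows exponentially" claim is usually framed in terms of state space: an n-qubit register spans 2^n basis states, so capacity doubles with every qubit added. A minimal sketch of that framing (my illustration, not from the keynote):

```python
# Illustrative only: the usual sense in which quantum capacity "grows
# exponentially" is that an n-qubit register spans 2**n basis states.
def state_space(n_qubits: int) -> int:
    """Number of classical basis states an n-qubit register can represent."""
    return 2 ** n_qubits

for n in (10, 50, 300):
    print(n, state_space(n))
# By 300 qubits the state space already exceeds ~10**80, a common estimate
# of the number of atoms in the observable universe.
```

That doubling-per-qubit curve is why even a modest quantum machine could, for certain problems, leapfrog the aggregate of today's classical hardware.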

“Big Data” Breakout Session

In the first breakout session of the day, panelists from LPL Financial, MCNC, Railinc, and Next Glass fielded questions on “Big Data.” There were several valuable discussions, covering roles in the life cycle of big data, data ownership, and the lack of educational paths for big data science. To me, the biggest takeaway of the session came from Mark Johnson, CTO of MCNC. Johnson described a conundrum he sees regarding data: the value of data may not be apparent when it’s collected. This doesn’t mean, “there’s so much data we don’t know if it will be useful.” Rather, the usefulness of data may not even exist at the time the data is collected. He referenced the Navy’s policy of recording weather data, every day, on every ship, and noted that if all that data could be centralized and digitized, weather companies would have detailed global weather patterns from the last 100 years to use in modeling. I feel safe in saying the Navy didn’t plan for this when it created that policy a century ago. Keep that in mind the next time you’re deciding what data to discard or simply not collect.

“Cloud Technology” Breakout Session

The second breakout session, “Cloud,” included experts from SAS, the City of Asheville, WorkDay, and Piedmont Natural Gas. Topics included migration to the cloud, changing skill sets for IT, the benefits of API development, and the inevitable move to the cloud. One audience member asked the panel a question pertinent to many businesses these days: where do you start when trying to migrate? Scott Barnwell, Business & Public Technology Manager for the City of Asheville, provided a strong answer: Disaster Recovery (DR). Scott noted that in his research into setting up DR, using the cloud was 80% cheaper, was testable, and ultimately was the only feasible solution. DR is something that is only tested or invoked on occasion, so dedicating physical hardware to it just didn’t make sense. Scott later boiled the concept down to “the ROI tends to speak for itself.”


John Kelly and the Big Data session both planted seeds of how creating data-related solutions will be challenging but rewarding for early adopters. The critical nature of moving to the cloud resonated most with me as someone who architects business solutions in the cloud, but hopefully business owners can see that the move is approachable and attainable. The NCTA event ultimately served (what I believe to be) its purpose: exposing the business community to technology areas that are growing and can facilitate success.

*Photo credit: Griffin Hart Davis
