
Here is some food for thought if you’re considering an advanced analytics project (e.g. big data, AI, machine learning, deep learning).

In the coming weeks, we’ll have a blog post about the recent artificial intelligence day at the AWS Loft (NYC) and what Amazon is doing to democratise access to AI resources. We’ll also be announcing our involvement in a new deep learning Meetup group in Cork (Ireland) – so stay tuned!

According to Gartner:

“Through 2017, 60 percent of big data projects will fail to go beyond piloting and experimentation, and will be abandoned.”

This article from Network World also highlights very similar issues.

Not great odds in anyone’s book! But why is this the case, and what can be done to boost your analytics project’s chances of success?

Expertise

Certainly, a major factor that influences the outcomes of these projects is the level of expertise and skills applied to the chosen business problem. AI and big data projects are so much more than just deploying Hadoop and hoping for the best. Teams involved need the right mix of business skills (asking the right questions), IT skills (choosing the right infrastructure), and data science/quantitative skills (analysing the data in the right way).

Many organisations try to build these capabilities in-house, but that approach is slow and rarely delivers a quick win, which matters if you want to avoid Gartner’s dreaded “trough of disillusionment”:

[Image: Gartner’s Hype Cycle]

Building the skills and tools internally isn’t for everyone, and it’s often not the best way to get started.

Organisations that lack these skills, or that don’t have enough people with them, can engage outside firms to improve their chances of a successful outcome. Choosing the right development partner has the additional (and oft-overlooked) benefit of building in-house capability through knowledge transfer, exposure to industry best practices, and technical cross-pollination.

Choose the right deployment model

Big data and AI infrastructure projects can be major capital expenditures. Once deployed, businesses want to see a fast and healthy return on those investments. That is not an environment conducive to experimentation, learning, failing fast, or being agile and flexible, all traits of a DevOps-friendly practice.

An approach that is far nimbler and offers a shorter time to value is to use cloud-based data warehousing, data lake, and AI solutions. These offerings are more cost-effective (if architected correctly) and provide far greater flexibility and scalability. You also transfer the burden of managing the infrastructure to the cloud provider, leaving you free to focus on solving your business problems rather than worrying about change windows, patching schedules, and maintenance agreements.
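To make that concrete, here is a minimal sketch of what “no infrastructure to manage” can look like with a serverless query service such as Amazon Athena, using the standard boto3 client. The region, database, table, and S3 bucket names below are hypothetical placeholders, and the snippet assumes an AWS account with data already catalogued in S3 and the appropriate IAM permissions in place.

import time

import boto3

# Athena is serverless: there is no cluster to provision, patch, or maintain.
# Region, database, table, and bucket names are hypothetical placeholders.
athena = boto3.client("athena", region_name="eu-west-1")

response = athena.start_query_execution(
    QueryString="SELECT channel, COUNT(*) AS orders FROM sales GROUP BY channel",
    QueryExecutionContext={"Database": "analytics_demo"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query finishes. This is fine for a sketch; production code
# would add a timeout and exponential back-off.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])

Nothing in this sketch provisions or patches a server: the query runs against data in place, and you pay per query rather than for idle capacity.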

Plus, you’ll be making your financial bean-counters happy: because you are no longer capitalising IT assets and depreciating them on your balance sheet, your cloud costs shift to operational expenses and are reflected on your income statement in the period they occur. A large on-premises cluster, say a €500,000 capital purchase, ties up cash up front and is written off over several years, whereas a cloud bill scales with actual usage month to month.

If you have questions about an advanced analytics project, or thoughts to share, get in touch.

Make sure to follow us on LinkedIn to stay up to date on our latest career opportunities.

fourTheorem is a non-traditional software design and development business. We help organisations from start-up, through digital transformation and platform rescue, to the adoption of advanced AI and Serverless services deployed to cloud platforms. We focus on business outcomes and up-skilling, rock-solid strategy and planning, scalable design, and cost-efficient technology and DevOps adoption.