
January 06, 2017

7 Trends in Analytics for 2017


What’s new and exciting in the world of analytics for 2017? Here’s our list of 7 trends we expect this year, based on our work with organizations in the DC area:

1. Adopting a Team Approach to Data Science

We’ve been pairing a stats and machine learning expert with a data integration engineer (to enable ongoing automation and data quality) plus a data visualization master (to create an impressive end display). It’s far easier to recruit for depth in each skill, and you can stagger when each specialty is needed on the project to optimize resource use. It’s our preferred staffing approach – unless you enjoy the thrill of hunting the Data Science Unicorn!

2. Data Lake Builds

To better support data exploration and discovery. Because data remains in its raw form with low latency, some organization is needed to keep the Lake readily searchable rather than a ‘junk drawer’. This can be enabled with metadata tagging and simple schema structures. The key benefit of the Lake is reducing the time-consuming manual data gathering that data analysts and data scientists otherwise do across multiple, disparate source systems: go to the Lake, not the source spring for each system. Over time, the Lake can also streamline data onboarding for the downstream Data Warehouse.
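The metadata tagging mentioned above can be as simple as a searchable catalog of what lands in the Lake. A minimal sketch, using hypothetical names and an in-memory catalog purely for illustration – a real implementation would persist this in a catalog service or database:

```python
from datetime import date

# Hypothetical metadata catalog: each raw file landing in the Lake is
# tagged with its source system, subject area, and load date so analysts
# can search the Lake instead of querying each source system directly.
catalog = []

def register(path, source_system, subject, load_date):
    """Tag a raw file with searchable metadata at ingest time."""
    catalog.append({
        "path": path,
        "source_system": source_system,
        "subject": subject,
        "load_date": load_date,
    })

def find(subject):
    """Return the paths of all raw files tagged with a subject area."""
    return [entry["path"] for entry in catalog if entry["subject"] == subject]

register("raw/crm/accounts_2017-01-06.csv", "CRM", "customers", date(2017, 1, 6))
register("raw/erp/orders_2017-01-06.csv", "ERP", "orders", date(2017, 1, 6))

print(find("customers"))  # ['raw/crm/accounts_2017-01-06.csv']
```

Even this small amount of structure keeps the Lake searchable by subject area without forcing raw data into a schema up front.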

3. Enabling Data Scientist Sandboxes

As a secure and flexible environment for solo and shared analysis. Encourage collaboration on complex analyses by providing a spacious play area for individual analysis work and the ability to publish interim results into a shared area. Provide the preferred tools for your analytics squad and watch them take off!

4. Data Warehouse De-cluttering Initiatives

To scale back what is cleansed and transformed regularly, limiting it to only the data that needs to be high quality for ongoing reports and dashboards. Cleansing and processing only the data you know you will use reduces workload demand on the warehouse (and likely resolves some performance issues). As business needs change, ensure you have a speedy method to add new data to the warehouse by adopting Agile and DevOps (see below).
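One way to decide what stays in the regular cleansing pipeline is to work backwards from the reports actually in use. A minimal sketch with hypothetical report and dataset names:

```python
# Hypothetical example: only datasets that feed an active report or
# dashboard are cleansed and loaded to the warehouse; everything else
# stays raw in the Lake until someone actually needs it.
active_reports = {
    "sales_dashboard": ["orders", "customers"],
    "ops_scorecard": ["tickets"],
}

all_datasets = ["orders", "customers", "tickets", "clickstream", "legacy_gl"]

def datasets_to_cleanse(reports, datasets):
    """Keep only the datasets some active report depends on."""
    needed = {d for deps in reports.values() for d in deps}
    return [d for d in datasets if d in needed]

print(datasets_to_cleanse(active_reports, all_datasets))
# ['orders', 'customers', 'tickets']
```

Here `clickstream` and `legacy_gl` drop out of the nightly cleanse entirely – exactly the de-cluttering the trend describes.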

5. Continued Migration to the Cloud

The elastic scale and cost benefits are significant when working with data at volume. The data security options are plentiful and ever improving – financial, healthcare, and federal organizations are all working in the Cloud with confidence. If you’re unsure where to start, Amazon Web Services is a popular choice for ease of use and has a plethora of online training options to accelerate progress.

6. Increased Agile Adoption

For analytics solutions, using frameworks like Scrum or Kanban (or both) to gain frequent user feedback on a tangible delivery every few weeks via ceremonies like Reviews and Retrospectives. That feedback drives ongoing solution priority discussions and leads to beneficial course corrections in both team operations and functional outcomes. Remember, the focus is on providing end users something they can see and use with every iteration – if it stays with IT for months, you’re missing out on precious user inputs.

7. Adopting DevOps Automation Techniques

Such as Continuous Integration and Continuous Delivery, applied to the rollout of analytics solutions to advance speed of delivery and improve solution quality. Curious to learn more? Read 5 Key Aspects of DevOps.
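In an analytics context, Continuous Integration often means automated data quality checks that run before a new load is promoted. A minimal sketch of such a check, with hypothetical column names and sample rows – the kind of assertion a CI pipeline would run and fail the build on:

```python
# Hypothetical CI-style data quality gate: fail the build if required
# columns are missing or key fields are empty in a new data load.
def validate_rows(rows, required_columns):
    """Return a list of data quality errors; an empty list means pass."""
    errors = []
    for i, row in enumerate(rows):
        for col in required_columns:
            if col not in row or row[col] in (None, ""):
                errors.append(f"row {i}: missing or empty '{col}'")
    return errors

sample_load = [
    {"order_id": 1, "customer_id": "C-100", "amount": 25.0},
    {"order_id": 2, "customer_id": "", "amount": 10.0},
]

issues = validate_rows(sample_load, ["order_id", "customer_id", "amount"])
print(issues)  # ["row 1: missing or empty 'customer_id'"]
```

Wiring a check like this into every load is what turns delivery automation into a quality improvement, not just a speed improvement.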

We wish you success in all your 2017 technical endeavors.
