What’s new and exciting in the world of analytics for 2017? Here’s our list of 7 trends we expect this year, based on our work with organizations in the DC Area:
1. Adopting a Team-approach to Data Science
We’ve been pairing a stats and machine learning expert with a data integration engineer (to enable ongoing automation and data quality) plus a data visualization master (to create an impressive end display). It’s far easier to recruit for depth in each skill, and you can stagger when different skills are needed on the project to optimize resource use. A preferred staffing approach – unless you enjoy the thrill of the hunt for the Data Science Unicorn!
2. Establishing a Data Lake
To better support data exploration and discovery. Data remains in its raw form and has low latency; some organization is needed to make the Lake readily searchable and not a ‘junk drawer’. This can be enabled with metadata tagging and simple schema structures. The key benefit of the Lake is the reduction in time-consuming manual data gathering from multiple, disparate source systems by data analysts and data scientists. Go to the Lake, not to the source spring of each system. The Lake can also be used to streamline data onboarding for the downstream Data Warehouse over time.
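The metadata-tagging idea can be sketched in a few lines. Below is a minimal, illustrative catalog (the class names, paths, and tags are our own invention, not a specific product): each dataset stays in its raw form in the Lake, while a small tag record makes it searchable instead of a ‘junk drawer’.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    path: str                       # location of the raw file in the Lake
    source_system: str              # upstream system it was landed from
    tags: set = field(default_factory=set)

class LakeCatalog:
    """A toy metadata catalog: register raw datasets, then search by tag."""

    def __init__(self):
        self._entries = []

    def register(self, path, source_system, tags=()):
        self._entries.append(DatasetEntry(path, source_system, set(tags)))

    def find(self, tag):
        # Analysts query the catalog rather than each source system.
        return [e.path for e in self._entries if tag in e.tags]

catalog = LakeCatalog()
catalog.register("s3://lake/raw/crm/accounts.csv", "CRM", {"customer", "daily"})
catalog.register("s3://lake/raw/erp/orders.json", "ERP", {"orders", "daily"})

print(catalog.find("daily"))
```

In practice this role is usually filled by a managed catalog (for example, a Hive metastore or a cloud data catalog service), but the principle is the same: a thin, queryable index over otherwise raw files.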
3. Creating an Analytics Sandbox
As a secure and flexible environment for solo and shared analysis. Encourage collaboration on complex analyses by providing a spacious play area for individual analysis work, plus the ability to publish interim results into a shared area. Provide the preferred tools for your analytics squad and watch them take off!
4. Right-Sizing the Data Warehouse
To scale back what is cleansed and transformed regularly, limiting it to only the data that needs to be high quality for ongoing reports and dashboards. This reduces workload demand on the warehouse (and likely resolves some performance issues) by cleansing and processing only the data you know you are going to use. As business needs change, ensure you have a speedy method to add new data to the warehouse by adopting Agile and DevOps (see below).
5. Moving Analytics to the Cloud
The elastic scale and cost benefits are significant when working with data at volume. The data security options are plentiful and ever improving – financial, healthcare, and federal organizations are all working in the Cloud with confidence. If you’re unsure where to start, Amazon Web Services is a popular choice for ease of use and offers a plethora of online training options to accelerate progress.
6. Adopting Agile Methods
For analytics solutions, like Scrum or Kanban (or both), to realize the advantages of frequent user feedback on a tangible delivery every few weeks, via ceremonies like Reviews and Retrospectives. The feedback drives ongoing solution priority discussions and leads to beneficial course corrections for both team operations and functional outcomes. Remember, the focus is on giving the end user something they can see and use with every iteration – if it stays with IT for months, you’re missing out on precious user inputs.
7. Applying DevOps Practices
Such as Continuous Integration and Continuous Delivery to the rollout of analytics solutions, to advance speed of delivery and improve solution quality. Curious to learn more? Read 5 Key Aspects of DevOps.
We wish you success in all your 2017 technical endeavors.