Now more than ever, having the capacity to quickly access and evaluate data is essential for maintaining a competitive advantage. With reliable data gathering and robust infrastructure now standard across industries, business leaders must take the next step: maximizing the valuable insights gleaned from this abundance of data. Business intelligence (BI) tools are typically the access point for stakeholders to get their reports and dashboards; however, building out and maintaining a robust reporting ecosystem can require significant time and effort from analysts, engineers and BI teams.

Decision-makers have constantly evolving needs, but the data required to make informed choices isn’t always readily accessible. Compounding the issue, building ad-hoc reports demands considerable time and involves numerous back-and-forth exchanges with the BI team. This creates significant bottlenecks in time-to-insight. But what if business users could directly access and analyze their data without relying on technical expertise?

What Amazon Q in QuickSight Is

Amazon Q in QuickSight is a generative AI assistant embedded within Amazon QuickSight, AWS’s browser-based cloud business intelligence tool. It allows users to ask natural language questions about their data and receive prompt, accurate answers accompanied by instant visualizations. Developers can also leverage natural language questions to create visuals, executive summaries, forecasts and anomaly reports.

Amazon Q in QuickSight has the potential to significantly reduce time to insight and lighten the load on dashboarding teams. While BI analysts focus on building data sources and critical reports, business users can generate ad-hoc reports in seconds rather than hours. Amazon Q operates securely within your QuickSight account, requiring only clean data sources, well-structured metadata, and well-formulated questions.

What it Is Not

Amazon Q in QuickSight is not a replacement for QuickSight or any other BI tool, nor is it a general-purpose AI virtual assistant. Unlike general-purpose AI assistants, such as Amazon Q Business, which use data from across the organization to answer broad business questions, Amazon Q in QuickSight is designed for narrow, data-specific inquiries. It works best with business-specific questions about datasets that have been prepared for natural language queries, providing focused, data-driven answers tailored to the user’s needs.

Leveraging Amazon Q in QuickSight

For Chief Data Officers

Empower your teams to explore data independently with Amazon Q in QuickSight. This tool allows everyone in the organization to ask questions and gain insights without relying on data analysts for every query. By democratizing data access, you can foster a culture where business decisions are quicker and more informed. This increased autonomy also frees up your data teams to focus on complex analyses and strategic initiatives, driving greater value across the organization.

For Business Intelligence Managers

Streamline your team’s workload by routing frequently asked data questions through Amazon Q in QuickSight. This approach significantly reduces the time spent on repetitive data requests and ad-hoc reports, allowing your team to focus on higher-value tasks. For example, you could create a Q Topic dedicated to quality assurance on dimensional data or for validating database metrics without writing SQL. By implementing Amazon Q in QuickSight, you not only boost your team’s productivity but enhance the efficiency of teams you work with as well.

For Analysts

Use Amazon Q in QuickSight to validate hypotheses and gather quick insights during exploratory analysis. Save time on running queries by asking data questions right in the Amazon Q in QuickSight interface. The tool also facilitates quickly switching between datasets, as all topics are available and can be accessed in any part of QuickSight. Amazon Q in QuickSight also supports building dashboard visuals and writing calculations directly in the Q interface, further accelerating report development.

Creating a Topic

Using Amazon Q in QuickSight is simple by design. Once enabled, the tool appears as an icon within QuickSight; users click it, ask a question about their data and receive an answer.

To successfully implement Amazon Q in QuickSight, data topics must first be set up, which requires thoughtful planning.

Start Asking the Right Questions

Before even choosing the right dataset to answer questions, it’s important to establish a clear business case and determine what questions stakeholders need answered. This process is iterative and adaptable.

Start by engaging with your key stakeholders to understand their pain points and goals. What insights are driving business decisions? Which reports are they using frequently? Which are they rarely using? How much time is being spent on ad-hoc reporting? The insights gained from this process are invaluable and ensure the stage is set for Amazon Q in QuickSight implementation.

Building Topics

Topics for Amazon Q in QuickSight are organized sets of data designed to give context for the AI, allowing users to ask questions and receive answers in seconds. When an end user interacts with Q, they will do so by selecting a topic within the QuickSight interface and asking questions related to that topic.

To create a topic, first select the datasets the topic will use, then refine the metadata to fit the chosen business case and make it user-friendly.

Identify Jargon

During this process, note down any business-specific language used by stakeholders. When creating a topic, you can give data fields synonyms. This helps Amazon Q in QuickSight understand questions and correctly identify the relevant field to use in its analysis. If a stakeholder asks a question using business jargon known to them, they should still get an answer they expect. Organizations that maintain a glossary/dictionary of terms for their business can use this document to further speed up adoption of Amazon Q in QuickSight by their stakeholders.

Topics: Broad or Narrow?

Should your topic be about sales across the organization, or only one product? Will it contain sales data, inventory data, or both? These are critical decisions to make when setting up your topics in Amazon Q in QuickSight.

Topics encompassing a broad range of target data can answer diverse questions and provide a comprehensive view of the business. They are useful for high-level reporting and for users who require a general overview.

Topics that are narrower in focus might include sales data, customer accounts, key financials or other limited scope data. These focused topics offer more targeted insights, making it easier for end users to get relevant answers from Amazon Q in QuickSight quickly.

Your exact topic setup can vary widely depending on the business need defined at the outset, the data available, and the audience for the topic. One helpful QuickSight feature here is Row Level Security, available natively within topics. When Row Level Security is enforced on a QuickSight dataset and a topic is built using that dataset, users will only receive insights about data they have access to, with no additional setup required.
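As a sketch of how such a restriction is shaped, the snippet below builds a row-level security rules file of the kind QuickSight accepts as a permissions dataset: a `UserName` and/or `GroupName` column plus columns matching fields in the data, where an empty value grants access to all rows. The group and region values are hypothetical.

```python
import csv
import io

# Sketch: build a QuickSight row-level security (RLS) rules file.
# A permissions dataset pairs a UserName and/or GroupName column with
# columns matching fields in the data; an empty value means unrestricted.
# The group and region names below are hypothetical.
rules = [
    {"GroupName": "northeast-sales", "Region": "Northeast"},
    {"GroupName": "national-leadership", "Region": ""},  # empty = all rows
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["GroupName", "Region"])
writer.writeheader()
writer.writerows(rules)
rls_csv = buffer.getvalue()
print(rls_csv)
```

Uploading a file like this as a permissions dataset and attaching it to the sales dataset would restrict each group to its own region’s rows before any topic built on that dataset answers a question.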

Choosing the Right Data

Once you have a clear understanding of common stakeholder data questions, pain points, and how they are talking about their data, you will then naturally find the data sources containing the information best equipped to answer them. The data you choose will be the bedrock of your Amazon Q in QuickSight solution.

Setting Up Topics

QuickSight will perform an initial ‘cleaning’ of your data on its own, assigning common synonyms to field names (e.g., a field called ‘Cost of Goods Sold’ will automatically be given the synonym ‘COGS’). You will want to double-check this for accuracy, as it can make mistakes.

Once data is selected, QuickSight will prompt you to ‘Start Review’ of the topic data. This should almost always be your first step. Proper data cleaning and preparation are crucial to ensuring that Q in QuickSight provides accurate, actionable insights that align with your business goals.

Field Selection

Review each field to ensure that only relevant and high-quality data is included. Consider which fields are adding value to the analysis, and which are adding noise or aren’t useful for the analysis. Include and exclude fields as needed.

Synonyms

Add synonyms to fields to accommodate different terminologies used by your business users. This will make the topic (and results) more user-friendly and intuitive.

Data Roles and Formatting

Define roles (such as dates, measures, dimensions, etc.) and apply consistent formatting. You can manually set fields to be formatted as Currency, which will maintain the clarity and overall quality of the insights.

Data Aggregations

Configure default aggregations for measures to produce meaningful summarizations. You can set default aggregations (e.g., a ‘Sales’ field can default to SUM or AVERAGE) and disallow other aggregations that either don’t make sense or are potentially sensitive.
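The metadata steps above (field selection, synonyms, data roles, default aggregations) can also be scripted. Below is a hedged sketch of a topic definition in the shape the QuickSight CreateTopic API (for example, boto3’s `quicksight.create_topic`) accepts; the dataset ARN, account ID and field names are hypothetical, and the request is built but not sent.

```python
# Sketch of a topic definition as the QuickSight CreateTopic API expects it
# (e.g., boto3's quicksight.create_topic). The dataset ARN, account ID and
# field names are hypothetical; the request is constructed but not sent.
topic_details = {
    "Name": "Retail Sales",
    "Description": "Ad-hoc Q&A over daily retail sales",
    "DataSets": [
        {
            "DatasetArn": "arn:aws:quicksight:us-east-1:123456789012:dataset/sales",
            "Columns": [
                {
                    "ColumnName": "cogs",
                    "ColumnFriendlyName": "Cost of Goods Sold",
                    "ColumnSynonyms": ["COGS", "cost of sales"],   # stakeholder jargon
                    "ColumnDataRole": "MEASURE",
                    "Aggregation": "SUM",                          # default aggregation
                    "NotAllowedAggregations": ["MIN", "MAX"],      # disallowed
                },
                {
                    "ColumnName": "region",
                    "ColumnSynonyms": ["territory", "market"],
                    "ColumnDataRole": "DIMENSION",
                },
            ],
        }
    ],
}

# To actually create the topic (requires AWS credentials):
# import boto3
# boto3.client("quicksight").create_topic(
#     AwsAccountId="123456789012", TopicId="retail-sales", Topic=topic_details
# )
measure = topic_details["DataSets"][0]["Columns"][0]
print(measure["Aggregation"])
```

Defining topics as code like this makes the synonym and aggregation choices reviewable and repeatable across environments, rather than living only in console clicks.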

Testing, Deployment and Beyond

Amazon Q in QuickSight encourages continued evaluation and performance monitoring after a topic is deployed to end users. For each topic, you can see how many questions have been asked and what they are, and view user feedback on its responses. Each time a user asks a question, they have the option to give the response a thumbs up or thumbs down and leave long-form feedback.

End users should be highly encouraged to interact with the results they get from Amazon Q. Their input is extremely valuable and can be used to determine what is working well with the topic and what could use improvement. Additionally, you can see how many questions have been asked in total, what portion of those questions were answerable, and what portion weren’t. Use these metrics to assess engagement with the solution and calibrate or update the topics accordingly.

Listening to Feedback

From the outset, deploying Amazon Q in QuickSight works best as a collaborative process. Integrating it into your BI ecosystem is best done by working closely with the target end users at all stages of development, from ideation to deployment. Their expectations and usage of the tool will be the biggest variable impacting the success of the project, and their continued engagement after deployment will ensure issues are identified, fixed and maintained.

Negative feedback from users should always be addressed and corrected quickly. If end users find many of their questions unanswerable or are consistently getting unexpected results from the topic, it can lead to frustration and abandonment of the tool. Remember, the value of Q in QuickSight is being able to get the information you need quickly and easily. If users are using synonyms that the development team had not considered, if their queries are too complex, or irrelevant to the topic, they will have trouble getting the results they are looking for.

A great way to start users out with Amazon Q in QuickSight is Suggested Questions. These are questions shown to users by default when they open a topic in the Q in QuickSight interface within the QuickSight console. Topic developers can manually validate these questions and re-validate them periodically as the data refreshes; end users are shown the last validation time.

Setting Expectations

Given the rise in popularity of other AI-powered tools, it’s important to set user expectations about what Amazon Q in QuickSight is and isn’t useful for. It is designed for quick, high-level insights while also giving a user the ability to dig deep to answer specific questions about their data. Questions posed to a Topic should always be pertinent to the data in the topic.

Amazon Q in QuickSight is not a replacement for mission-critical dashboards, nor is it a general purpose, world-wise AI virtual assistant. Situations involving intricate calculations or carefully formatted reports are better carried out in a QuickSight Analysis. Additionally, while Amazon Q in QuickSight streamlines the querying process, it still requires well-prepared and clean data to function effectively.

Final Thoughts

Amazon Q in QuickSight is an AI solution completely integrated into a full-fledged BI tool. It is secure and enables end users to perform their own ad-hoc reporting at their convenience. By empowering users to query data directly and intuitively, Amazon Q in QuickSight reduces the load placed on technical teams, accelerates decision-making, and fosters a more data-driven culture within the organization. These qualities make Amazon Q in QuickSight a valuable asset for any organization looking to enhance its reporting capabilities and drive positive business outcomes.

The AWS Summit New York 2024 was an exhilarating showcase of cloud innovation, AI advancements, and industry best practices. Hosted at the Jacob K. Javits Convention Center, this year’s action-packed Summit brought together thousands of professionals, technology enthusiasts, and AWS experts to explore how cutting-edge AWS technologies can revolutionize industries and empower businesses.

At this year’s Summit, over 170 sessions were offered covering a wide range of topics and technical depth, from level 100 (foundational) through level 200 (intermediate) and level 300 (advanced) to level 400 (expert). Within these sessions, AWS experts, builders, customers, and partners shared their insights on topics such as generative AI, analytics, machine learning, industry-specific solutions, and many more. Attendees could customize their agenda ahead of time, choose from lecture-style presentations and peer-led discussions, and explore the Expo to learn about the numerous advancements in AWS technologies and deepen their understanding of best practices. Dr. Matt Wood, VP for AI Products, AWS, hosted the keynote session, unveiling the latest launches and technical innovations from AWS and demonstrating products and real-world success stories from AWS customers.

Below is a detailed look at some of my key takeaways and trends that summarize this year’s Summit:

1. Amazon Bedrock

Stemming from the heavy emphasis on generative AI and its capabilities, one of the most exciting announcements from the Summit was the introduction of new capabilities in Amazon Bedrock. Amazon Bedrock is AWS’s relatively new service designed to simplify the creation of AI applications. The service provides access to pre-trained foundation models from leading AI providers and enables businesses to build, deploy, and scale AI-driven solutions without deep expertise and extensive effort. In addition, the many key features of Amazon Bedrock allow users and businesses to build innovative AI solutions effectively and efficiently while ensuring scalability and compliance. The fundamental idea of this service is to revolutionize how companies develop and deploy generative AI applications, making it easier to integrate cutting-edge technology into existing workflows while significantly reducing computational costs.

At this year’s Summit, additional features of Amazon Bedrock were introduced to enhance company knowledge bases with new Amazon Bedrock connectors for Confluence, Salesforce, SharePoint, and web domains. In doing so, companies can empower RAG models with contextual data for more accurate and relevant responses. 

Lastly, Guardrails for Amazon Bedrock and the Guardrails API were introduced to do the following:

  • Bring a consistent level of AI safety across all applications
  • Block undesirable topics in generative AI applications
  • Filter harmful content based on responsible AI policies
  • Redact sensitive information (PII) to protect privacy
  • Block inappropriate content with a custom word filter
  • Detect hallucinations in model responses using contextual grounding checks

Businesses and customers can apply these safeguards to generative AI applications even when the models are hosted outside of AWS infrastructure. It is estimated that custom Guardrails can reduce harmful content by up to 85%.
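As a sketch of how those safeguards are expressed, the configuration below follows the shape accepted by the Bedrock CreateGuardrail API (for example, boto3’s `bedrock.create_guardrail`); the denied topic, thresholds and messages are illustrative assumptions, and the request is built but not sent.

```python
# Sketch of a Guardrails configuration in the shape the Bedrock CreateGuardrail
# API accepts (e.g., boto3's bedrock.create_guardrail). All values below are
# illustrative assumptions; the request is constructed but not sent.
guardrail_config = {
    "name": "support-assistant-guardrail",
    "topicPolicyConfig": {  # block undesirable topics
        "topicsConfig": [
            {
                "name": "investment-advice",
                "definition": "Recommendations about specific financial products.",
                "type": "DENY",
            }
        ]
    },
    "contentPolicyConfig": {  # filter harmful content
        "filtersConfig": [
            {"type": "HATE", "inputStrength": "HIGH", "outputStrength": "HIGH"}
        ]
    },
    "sensitiveInformationPolicyConfig": {  # redact PII
        "piiEntitiesConfig": [{"type": "EMAIL", "action": "ANONYMIZE"}]
    },
    "contextualGroundingPolicyConfig": {  # flag ungrounded (hallucinated) answers
        "filtersConfig": [{"type": "GROUNDING", "threshold": 0.75}]
    },
    "blockedInputMessaging": "Sorry, I can't help with that request.",
    "blockedOutputsMessaging": "Sorry, I can't provide that response.",
}

# boto3.client("bedrock").create_guardrail(**guardrail_config)  # needs AWS credentials
denied = guardrail_config["topicPolicyConfig"]["topicsConfig"][0]["name"]
print(denied)
```

Each top-level policy block maps onto one of the bullets above, which is what lets a single guardrail enforce a consistent safety posture across applications.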

2. Fannie Mae’s Data Science Platform

One of the first sessions that I attended was Fannie Mae’s presentation on their data science platform. The focus was on how Fannie Mae overcame traditional data management challenges through innovative solutions. Data scientists at Fannie Mae were responsible for exploring internal and external datasets, including sensitive data, to develop and train models, create reports and new datasets, deploy models, and share insights. Before adopting AI, Fannie Mae’s data scientists struggled with data access (mostly personally identifiable information), governance, and operationalization. In addition, underwriting analysts spent significant time extracting structured data from unstructured documents; on average, each analyst spent 5 hours on every document, with over 8,000 underwriting documents per year. The challenge of inefficient manual document analysis was also resolved through AI.

By leveraging Large Language Models (LLMs) and ontologies, Fannie Mae developed a knowledge extraction system that significantly reduced manual effort. Tools like Amazon Bedrock, Claude 3 Sonnet, Amazon Neptune, LangChain, and Amazon OpenSearch Service played a crucial role in this transformation. The use of AI has generated a potential savings of over 32,000 hours annually and improvements in accuracy, compliance, and scalability of underwriting analysis for Fannie Mae.

Such efficiency and savings generated by the use of LLMs and ontologies is simply fascinating. This is a great reflection on how companies of all sectors can utilize the diverse capabilities of AI and customizable machine learning models to generate value.

3. IBM WatsonX & AWS: Scale Gen AI Impact with Trusted Data

Generative AI was a major theme at the Summit, and IBM WatsonX and AWS highlighted their collaborative efforts to expand the impact of this technology. The WatsonX suite offers tools like Watsonx.ai for model development, Watsonx.data for scaling AI workloads, and Watsonx.governance for ensuring responsible AI practices. This partnership brings a shift towards more open, targeted, and cost-effective generative AI solutions, while offering superior price-performance at less than 60% of the traditional costs.

4. Advancing AI and Cloud Solutions

Another key topic of the Summit was Innovating with Generative AI on AWS. This topic highlights how businesses can focus on performance, cost-efficiency, and ethical responsibilities in AI development. Many strategies were discussed for creating new customer experiences, boosting productivity, and optimizing business processes through generative AI.

Some of the key techniques included Retrieval Augmented Generation (RAG) for combining new and existing information, fine-tuning of AI models, and pre-training to enhance AI capabilities. The session emphasized the importance of accessible and high-quality data as the foundation for AI success, so that businesses can utilize generative AI to its maximum potential to drive innovation and create value. By using services designed to enable innovation and scale, businesses are able to measure and track value and ROI while optimizing for cost, latency, and accuracy needs. In addition, businesses can manage risk, maintain trust, and build with compliance and governance.
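To make the RAG technique concrete, here is a minimal, self-contained sketch: retrieve the documents most relevant to a question, then prepend them to the prompt sent to a model. Retrieval here is naive term overlap purely for illustration; production systems typically use vector embeddings and a managed service such as Amazon Bedrock.

```python
# Minimal sketch of Retrieval Augmented Generation (RAG): fetch the most
# relevant documents for a question, then prepend them to the model prompt.
# Term-overlap scoring is a toy stand-in for real embedding-based retrieval.
def retrieve(question: str, documents: list[str], k: int = 2) -> list[str]:
    terms = set(question.lower().split())
    ranked = sorted(
        documents,
        key=lambda d: len(terms & set(d.lower().split())),
        reverse=True,
    )
    return ranked[:k]

def build_prompt(question: str, documents: list[str]) -> str:
    context = "\n".join(retrieve(question, documents))
    return f"Use only this context to answer.\nContext:\n{context}\nQuestion: {question}"

docs = [
    "Q3 revenue grew 12% year over year.",
    "The cafeteria menu changes weekly.",
    "Q3 operating costs fell 3%.",
]
prompt = build_prompt("How did Q3 revenue change?", docs)
print(prompt)
```

The point of the pattern is visible in the assembled prompt: the model answers from retrieved, current business data rather than only from what it memorized during pre-training.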

5. Boosting Employee Productivity with AI Agents

Another highlight was the exploration of AI agents powered by Amazon Q. With Amazon Q, businesses can design AI agents that integrate seamlessly with tools like Slack, Microsoft Teams, and other AWS-supported data sources to enhance employee productivity. These agents can improve efficiency across teams and organizations by streamlining data interactions and automating repetitive tasks. A demo of connecting a Slack instance to Amazon Q and deploying it into the Slack workspace showed the simplicity of the whole process and how quickly Amazon Q can generate value for an organization.

6. Building a Strong Data Foundation for Generative AI

A central theme at the Summit was the importance of a solid data foundation for successful generative AI initiatives. AWS demonstrated how businesses can harness structured and unstructured data through various tools and services. Key components of this foundation include:

  • Data Storage: Managing structured and unstructured data using SQL, NoSQL, and graph databases
  • Data Analytics: Utilizing data lakes for search, streaming, and interactive analytics
  • Vector Embeddings: Tokenizing and storing data for semantic similarity searches
  • Data Integration: Combining data from different sources using tools like AWS Glue and Amazon DataZone
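The vector-embedding component above boils down to nearest-neighbor search by similarity. The toy sketch below uses 3-dimensional vectors as stand-ins for real embeddings (which typically have hundreds of dimensions) and cosine similarity to pick the stored text semantically closest to a query.

```python
import math

# Sketch of the semantic-similarity search that vector embeddings enable:
# each text is stored as a vector, and the nearest vector (by cosine
# similarity) answers the query. The 3-dimensional vectors are toy
# stand-ins for real embedding model output.
def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

store = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.2],
}
query_vector = [0.8, 0.2, 0.1]  # pretend embedding of "how do I get my money back?"

best = max(store, key=lambda key: cosine_similarity(store[key], query_vector))
print(best)
```

Because similarity is computed on meaning-bearing vectors rather than exact words, “money back” still lands on the refund entry, which is what makes embeddings useful grounding data for generative AI.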

7. Governance and Compliance in the Cloud

Governance and compliance were also significant topics, with AWS highlighting how organizations can manage data securely and efficiently. Enterprise customers look for democratized data tools with built-in governance to discover, understand, and access data across organizations, with the ability for multiple personas to collaborate on the same data problems. In addition, easy-to-use and easy-to-access analytics and BI tools are crucial for value creation. The Summit showcased services like AWS IAM, Amazon Cognito, AWS Lake Formation, and Amazon S3 for data management, access control, and auditing. These tools help ensure that cloud operations are compliant with regulations and best practices.

8. The Future of Generative AI

Lastly, the Summit concluded with a discussion on the future of generative AI. The evolution of AI agents such as Ninjatech.AI, multimodal models, and new regulations were among the topics discussed. The session also explored the balance between value and feasibility in AI projects: it is crucial to identify the value generated from productivity, experience, and revenue, while also focusing on innovation that is both effective and sustainable.

The AWS Summit New York 2024 highlighted the latest advancements in cloud technology and AI. Among the major announcements, the new Amazon Bedrock capabilities allow businesses to build, deploy, and scale AI-driven solutions without extensive expertise and effort, enabling them to focus more on performance, cost, and ethical responsibilities with gen AI.

The Summit offered valuable insights and tools for businesses looking to leverage cloud computing for innovation and efficiency. Many case studies were showcased to further support the adoption of generative AI across sectors, highlighting instances where generative AI can create value for all aspects of the business. The sense of urgency to adopt gen AI has doubled since last year, and the emphasis on building a solid data foundation for successful generative AI initiatives has never been greater. The many new innovations simplify the process for businesses to leverage data to create and differentiate generative AI applications and create new value for customers and the business. The phrase “Your data is the differentiator” should be remembered as businesses navigate the AI journey.

Overall, the AWS Summit provided a comprehensive look at how AWS is shaping the future of technology. With a strong emphasis on AI and machine learning advancements, security enhancements, and sustainability efforts, the future has never looked so bright for businesses, developers, and consumers. 

What causes advanced analytics and AI initiatives to fail? Some of the main reasons include not having the right compute infrastructure, not having a foundation of trusted data, choosing the wrong solution or technology for the task at hand and lacking staff with the right skill sets. Many organizations deploy minimum viable products (MVPs) but fail to successfully scale them across their business. The solution? Outsourcing elements of analytics and AI strategy to ensure success and gain true value.

64% of leaders surveyed said they lacked the in-house capabilities to support data-and-analytics initiatives. 

It’s essential to implement a data-driven culture across your organization if you’re looking to adopt advanced analytics. One of the keys to a data-driven culture is having staff with the correct skills that align with your initiatives. In our study, 64 out of 100 leaders identified a lack of staff with the right skills as a barrier to adopting advanced analytics within their organization. Even for organizations that do have the correct skill sets, retaining that talent is also a barrier they face. This is where outsourcing comes in.

Borrowing the right talent for only as long as you need it can be an efficient path forward.

Outsourcing parts of your analytics journey means you’re going directly to the experts in the field. Instead of spending time and money searching for the right person both technically and culturally, outsourcing allows you to “borrow” that talent. The company you choose to outsource to has already vetted their employees and done the heavy lifting for you. With outsourcing, you can trust that your organization is working with professionals with the skill sets you need.

Aside from securing professionals with the correct skill sets, there are plenty of other benefits to outsourcing your organization’s analytics needs. Professionals with the skill sets necessary for advanced analytics and AI initiatives can be very expensive. Outsourcing provides a cost-effective option to achieve the same goal. Rather than paying the full-time salary and benefits of a data science or analytics professional, an organization can test the value of these kinds of ventures on a project to project basis and then evaluate the need for a long-term investment.

Freeing full-time employees to make the most of their institutional knowledge.

Another benefit of outsourcing analytics is the increased productivity and focus of your organization’s full-time employees. By outsourcing your organization’s analytics, your full-time employees will naturally have more bandwidth to focus on other high priority tasks and initiatives. Rather than spending their time on what the outsourcing company is now working on, the full-time employees can dedicate their time to work on things that may require institutional knowledge or other tasks that are not suited for a third party. It’s a win-win situation for your organization – your analytics needs are being handled and your full-time staff is more focused and still productive.

There are many areas of analytics that an organization can outsource. These include, but are not limited to, viability assessments; prioritization of use cases; ongoing monitoring, performance, maintenance and governance of a solution; and implementation and deployment of an MVP or use case. In the words of Brian Platt, Ironside’s Practice Director of Data Science, “A partner with advanced analytics and data science capabilities can rapidly address AI challenges with skills and experience that are hard to develop in-house.”

Mid-tier organizations need the right talent and tools to successfully realize the value of their data and analytics framework in the cloud. The Corinium report shows that many companies are increasingly prepared to work with cloud consulting partners to access the skills and capabilities they require. 

Figure: areas that mid-market leaders consider outsourcing.

Overall, more and more data leaders are turning to outsourcing to help fill the gaps and expedite their organization’s analytics journey. Outsourcing services can help your organization reach analytics goals in many different areas, not just AI and Advanced Analytics. 

Organizations rely on outsourcing in key areas like these:

  • Developing a data and analytics cloud roadmap
  • Assessing advanced analytics use cases (figure shows 68% would consider outsourcing)
  • Implementation and deployment of an MVP or use case (figure shows 43% outsource)
  • Developing and maintaining data pipelines
  • Documenting and assessing your BI and overall analytics environment(s)
  • Migrating your reporting environment from one technology to another
  • Overall management and monitoring of analytics or AI platform (figure shows 42% are already outsourcing)

When your company plugs into the right skill sets and processes, there’s nothing between you and a successful data-and-analytics transformation.

Take a look at the full whitepaper to learn more: Data Leadership: Top Cloud Analytics Mistakes – and How to Avoid Them

Contact Ironside Group today to accelerate your Advanced Analytics and AI Strategies.

Midmarket companies use Advanced Analytics and AI to automate processes, glean strategic insights and make predictions at scale, such as:

  • Marketing – What is the next best offer for this client? 
  • Customer churn – Will this customer churn soon?
  • Predictive maintenance – When will this machine or vehicle fail?
  • Insurance – Will this person file a large claim?
  • Healthcare – Will this person develop diabetes?

Companies can wait until their competitors or new entrants leverage AI in their industry, or they can start the process now.  There’s no doubt that the coming years will see AI applied to ever more processes in the organization.  The urgency is to start reaping the benefits before widespread adoption occurs in your industry.

The good news is that midmarket companies are still in the early stages of large-scale deployment of AI projects.  A recent survey by Corinium Intelligence (Data Leadership: Top Cloud Analytics Mistakes – and How to Avoid Them) found that only 4% of respondents say their advanced analytics models and self-service tooling are fully scaled and integrated with business processes across the organization.

However, midmarket companies are actively scaling and experimenting with AI and Advanced Analytics in their business processes.  The survey found that 53% are creating MVPs (Minimum Viable Products) and 36% are in the process of scaling advanced analytics and AI, well on their way to deployment.

AI adoption will transform business models over 2-5 years. The time to start is now.

What challenges do midmarket companies face as they define, build and deploy Advanced Analytics and AI technologies in their companies?

The Corinium Intelligence survey asked mid-market companies about the biggest mistakes they saw or experienced in deploying Advanced Analytics and AI.  This survey of 100 data and analytics leaders from the financial services, insurance, telecoms, retail, and manufacturing sectors highlights the challenges enterprises face at each step of the data modernization journey – from designing the right data architecture to incorporating AI in business processes for competitive advantage.

59% of respondents cited inadequate data and compute infrastructure as the leading impediment.  Choosing the right technologies, hiring the right skill sets and proactively investing in change management are the next three sources of mistakes on the path to utilizing AI and Advanced Analytics.

Choosing the wrong analytics or AI technology solutions can result in setbacks later on. It’s important to carefully consider the various analytics and AI solutions that are available and choose the one that best meets the needs of the organization.

Successful analytics and AI projects require a range of technical and domain-specific skills. 54% of survey respondents said it was important that the necessary skills and expertise are available within the organization, or that they can be acquired through training, hiring and partnering.  In fact, many mid-tier companies bring in external expertise to implement AI and advanced analytics.

Almost half of the respondents identified failure to invest in change management as another risk. Analytics and AI projects can involve significant changes to processes. It’s important to proactively identify cultural and organizational success factors.  This includes getting executive buy-in, aligning analytics and AI strategy with business goals and communicating the value of analytics and AI projects to the rest of the organization, in order to build support and ensure successful adoption.

The stakes are high. The mistakes leaders cited led to significant or total disruption of Advanced Analytics and AI strategies. These challenges can delay realizing business benefits, erode advantages over competitors, or hamper defenses against new entrants that use Advanced Analytics and AI.

What are the options when building a world class advanced analytics and AI capability in my organization?

Three paths that companies follow include:

1. Build the capability in-house

2. Buy third-party solutions

3. Partner with cloud consultants to accelerate customized advanced analytics/AI solutions

In summary, building Advanced Analytics/AI in-house offers greater control and the ability to tailor solutions to specific business needs, but it can be costly and time-consuming. Buying third-party solutions is quicker and less expensive, but it offers less control and limited ability to tailor solutions. Partnering with a cloud consultant can be a good middle ground as it provides a combination of in-house and third-party expertise and the ability to tailor solutions to specific business needs, but it is more expensive than buying pre-built solutions. Whichever path you choose, the benefits of advanced analytics and AI are well within your reach.

Take a look at the full whitepaper to learn more: Data Leadership: Top Cloud Analytics Mistakes – and How to Avoid Them

Contact Ironside Group today to accelerate your Advanced Analytics and AI Strategies.

Success in the Cloud | Part 1 of 5

Cloud-based data analytics allows organizations to derive value from their data in ways that traditional on-premises solutions cannot. Organizations globally have already figured this out, and are now on journeys to realize the potential of analytics in the cloud. Most are about halfway through their cloud journey. What remains for most is getting past the “lift and shift” mentality that characterizes some cloud migrations, and educating higher-ups on the benefits cloud infrastructure can yield — ultimately driving buy-in.

While cloud migrations are well underway, few companies have completed the journey.

There are very few mid-market organizations utilizing cloud analytics to their potential. In a survey of 100 mid-market organizations, 66% are about halfway through their cloud transformation journeys; the number reporting fully modernized cloud-based data ecosystems was only 6%.



Lack of stakeholder alignment is a primary obstacle to successful transformation.

How can the 66% close the gap? Ultimately, it is about understanding and being able to communicate the benefits of a cloud transformation. What inhibits cloud adoption is not a lack of value; it is misunderstanding.

The pandemic provided a live test case for the usefulness of cloud-enabled technology. Suddenly teams were separated by geography, unable to communicate in person. For the marketplace, this meant being able to access data remotely was no longer a nice-to-have, it was a legitimate competitive advantage that allowed prepared businesses to continue operating.

Beyond closing distance, organizations are discovering cloud enablement to be a useful lever of data governance. By its nature, data stored in the cloud is highly available and easily accessible. If a company has workers in Maryland, Alaska, and Hawaii, then with the right preparation, access to sensitive data can still be distributed, controlled, and monitored as easily as it is with on-premises infrastructure. Moreover, having one central source of truth can remedy the problems of siloed-off logic, “rogue spreadsheets,” and nebulous business logic.

Today these tools remain incredibly valuable. While the pandemic has abated some, the reality of remote teams remains, leaving cloud-enabled tools to bridge physical divides.

Common roadblocks mid-market companies encounter. 

Many organizations experience the same roadblocks on the road to fully realizing the value of the cloud. Among them are missed opportunities to revise data practices during implementation, reliance on outdated data management habits, and a non-uniform distribution of value-add within the organization itself. 

Evaluation of business process

Moving on-premises data to the cloud lends itself neatly to an evaluation of business processes. What is working well? What is causing problems? And what could cause problems down the road? A lift-and-shift approach to cloud migration squarely misses the valuable insights those answers can provide.

Lift and Shift

This approach misses a lot of opportunities. A lift-and-shift cloud migration (also called rehosting) is a 1:1 replication of on-premises data storage models, schemas, and methodologies on a cloud platform. When finishing a lift-and-shift, an organization will have the exact same business logic, databases, and workflows they had before, now on the cloud. This is complete with all the same advantages, bugs, and flaws as the prior system.

Data Management Practices

Outdated data management practices, migrating old and unused data, and a lack of education can hamstring a cloud migration. A lift-and-shift approach without evaluation of internal processes can lead to unneeded costs for the organization and sap the long-term sustainability of the new system. For example, is it worth moving a 6 GB Excel sheet last accessed in 2003 to the cloud? Or can that file be removed? These are the questions organizational introspection before migration can help to answer.
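As a small illustration of this pre-migration housekeeping, a script can flag files that have sat untouched for years as candidates for archiving rather than migration. This is only a sketch: the five-year cutoff is an assumption, and last-access timestamps are not reliable on every file system (for example, volumes mounted with `noatime`).

```python
# Sketch: list files untouched for years as candidates to archive instead of
# migrate. The 5-year cutoff is an illustrative assumption, and access times
# may not be maintained on all file systems (e.g. noatime mounts).
import os
import time
from pathlib import Path

def stale_files(root, years=5):
    """Return files under `root` whose last access is older than `years` years."""
    cutoff = time.time() - years * 365 * 24 * 3600
    return [
        p for p in Path(root).rglob("*")
        if p.is_file() and os.path.getatime(p) < cutoff
    ]
```

Running something like this against a candidate file share before migration turns "should we move this?" from a guess into a reviewable list.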

Non-Uniform Value Add Distribution

A key aspect of planning a successful cloud migration involves managing expectations. Making an organizational shift to cloud data storage won’t necessarily lead to uniform distribution of value-add across an entire organization. A data science team will find more value in a cloud-hosted data warehouse than an inside sales team. However, linking teams of various business functions to the same data warehouse facilitates synergy between them, creating value for all.

Executive Buy-In

The features of a cloud migration are important, but the biggest obstacle to cloud adoption continues to be executive buy-in. Analytics leaders across industries consistently cite a lack of buy-in from leadership, specifically securing the budget for a cloud transformation, as their biggest roadblock.

Communication about expectations, costs, and most importantly aligning cloud migrations with organizational goals (in both the short term and long term) is the most important tool for any analytics leader seeking to make the jump to the cloud.

The importance of cleaning house before you make the move.

Moving from on-premises to cloud-based data storage is not at all like flipping a switch. Instead, a cloud migration should be treated as an investment in existing business logic. There are limitations to hosting data on-premises that can’t be overcome due to the nature of the technology, and the way to maximize the existing business logic is by moving it to a new, more available platform.

A cloud migration is something that should only be done once. While a lift-and-shift will technically work, the organization misses the opportunity to improve processes and communication and to do some much-needed “housekeeping” of its data. This is hard work during the transformation, but it pays off quickly, with long-term rewards that compound over time.

Take a look at the full whitepaper to learn more: Data Leadership: Top Cloud Analytics Mistakes – and How to Avoid Them

Ironside is an Enterprise Data and Analytics firm and Advanced AWS Partner specializing in building innovative solutions leveraging AWS native analytics services. In a recent project, we worked with Homer Learning to build and launch a solution leveraging Amazon QuickSight to help their marketing department gain greater visibility into the attribution and conversion of digital marketing spend.

As a provider of digital education products to children via mobile and web, Homer has found that recent changes to data privacy terms and conditions by the major ecosystem vendors (Apple and Google) have made tracking usage of its products very challenging. To grow the business, Homer needed to understand which digital advertising and marketing efforts were converting new customers and driving user consumption.

Partnering with Homer’s data and analytics team, Ironside implemented Amazon QuickSight dashboards and reports sourced from Homer’s data lake of advertising spend and user product usage information. The solution required close coordination with business users in the marketing department and Homer’s analytics technical leadership to determine the effectiveness of advertising spend for both new user acquisition and user retention.


Exhibit A: Homer Learning Marketing Attribution Amazon QuickSight Dashboard and Reporting

Ironside’s Practice Lead for Business Intelligence, Scott Misage, shared, “The Homer Learning solution is interesting as it brings the headlines in the newspaper to customer engagement with the requirements, with Homer leveraging AWS to house their data analytics platform, Amazon QuickSight.”

Understanding the data elements from their variety of advertising and product platforms is essential for Homer’s marketing decision makers and is what Amazon QuickSight delivers. Ironside worked closely with business users to understand how they were looking to consume the data and align that to traditional and advanced features within Amazon QuickSight. Jin Chung, Sr. Architect, Analytics Platform at Homer shared, “The Ironside team worked closely with our business stakeholders to understand how they have interacted with the data previously and put forward solutions that could enhance that experience with some of the new features in Amazon QuickSight.” 

The Homer Amazon QuickSight environment is integrated with many other AWS analytics and management services that provide data processing and security capabilities. A key component of the platform is the aggregation of third-party data delivered to Homer via Amazon S3 and blended in Databricks Delta Lake. Ironside created a secure and functional solution that integrated QuickSight with the Delta Lake through Amazon Athena.
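As a rough sketch of the wiring described above (not Homer’s actual configuration), registering an Athena-backed data source in QuickSight via the AWS SDK looks roughly like this. The account ID, data source ID, and workgroup name below are all hypothetical placeholders.

```python
# Sketch: assembling the request for quicksight.create_data_source so that
# QuickSight can query Delta Lake tables exposed through Amazon Athena.
# All identifiers here (account ID, data source ID, workgroup) are hypothetical.

def athena_data_source_request(account_id, data_source_id, workgroup):
    """Build create_data_source parameters for an Athena-type data source."""
    return {
        "AwsAccountId": account_id,
        "DataSourceId": data_source_id,
        "Name": data_source_id,
        "Type": "ATHENA",
        "DataSourceParameters": {
            "AthenaParameters": {"WorkGroup": workgroup}
        },
    }

request = athena_data_source_request("123456789012", "delta-lake-source", "primary")
# In a live environment, this request would be passed to the SDK, e.g.:
#   import boto3
#   boto3.client("quicksight").create_data_source(**request)
```

Keeping the request assembly separate from the SDK call makes the configuration easy to review and test before it touches a live account.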

About Ironside

Ironside helps companies translate business goals and challenges into technology solutions that enable insightful analysis, data-driven decision making and continued success. We help you structure, integrate and augment your data, while transforming your analytic environment and improving governance.

About Homer

The journey of parenthood begins without a map. As parents, we want the best for our kids. We want them to grow up to be confident lifelong learners who are ready to take on the world. At HOMER, our purpose is to give kids the best start to their learning journey during the window of opportunity—before the age of 6—where 85% of brain development takes place. We guide and champion children through this pivotal time as they build their skills and deepen their love of learning, and we partner with parents to provide the support that all kids need.


Your data needs are different from those of any other client we’ve worked with. Plus, they’re ever-changing. 

That’s why we’re fluid in our approach to creating your framework and why we ensure fluidity in the framework itself. 


Whether your current investment in assessments, governance, and technology is heavy or light, we can meet you where you are, optimize what you have, and help you move confidently forward. 

These steps are all necessary, but don’t happen in a strict sequence. Each of them is an iterative process — taking small steps, looking at the results, then choosing the next improvement. You need to start with assessment and governance — unless you already have some progress in those areas. 

Analytics are constantly evolving, and the Modern Analytics Framework is designed to evolve more readily as users discover new insights, new data, and new value for existing data. There will be constant re-assessment of the desired future state, modifications to your data governance goals and policies, design of data zones, and implementation of analytics and automated data delivery. Making these changes small and manageable is a key goal of the Modern Analytics Framework.

Can we ask you a few questions?

The better we understand your current state, the better we can speak to your specific needs. 

If you’d like to gain some insight into how your organization can move most effectively toward a Modern Analytics Framework, please schedule a time with Geoff Speare, our practice director.

Geoff’s Calendar
GSpeare@IronsideGroup.com
O 781-652-5758  |  484-553-1814

Get our comprehensive guide.

Learn about our proven, streamlined approach to taking your current analytics framework from where it is to where it needs to be, for less cost and in less time than you might imagine.

Download the eBook now

Check out the rest of the series.


In the same way that Software as a Service eliminates the need to install applications on your local network, Data as a Service lets you avoid storing and processing data on your network. Instead, you can leverage the power of cloud-based platforms that can handle high-speed data processing at scale. Combine that with the ready availability of low-cost cloud storage, and it’s easy to appreciate why so many organizations are turning to Data as a Service. 


One key component of a modern analytics framework.

In Ironside’s Modern Analytics Framework, Data as a Service is one of three key components.


How can Data as a Service serve your organization?

We know your time is valuable. So, let us speak to your specific needs. 

Schedule a time with Geoff Speare, our practice director:

Geoff’s Calendar
GSpeare@IronsideGroup.com
O 781-652-5758  |  484-553-1814

Get our comprehensive guide.

Learn about our proven, streamlined approach to taking your current analytics framework from where it is to where it needs to be, for less cost and in less time than you might imagine.

Download the eBook now

Check out the rest of the series.

If you rely solely on a data warehouse as your repository, you have to put all your data in the warehouse, regardless of how valuable it is. Updating a data warehouse is costly, and it takes significant time and effort, which usually leads to long delays between requests being made and fulfilled. Analytics users may then turn to other, less efficient means to get their work done.

If you rely solely on a data lake, you have the opposite problem: all the data is there, but it can be very hard to find and transform it into a format useful for analytics. The data lake drastically reduces the cost to ingest data, but does not address issues such as data quality, alignment with related data, and transformation into more valuable formats. High value data may reside here but not get used.

When you have a system of repositories with different levels of structure and analysis, and a value-based approach for assigning data to those repositories, you can invest more refinement and analytics resources in higher-value data.

Striking the right balance between refinement and analytics is key. Performing analytics on unrefined data is a more costly, time-consuming process. When you can identify value upfront, you can invest in refining your high-value data, making analytics a faster, more cost-efficient process. 

Our value-based approach can help deliver higher ROI from all your data.


This value-based approach also helps your modern analytics framework better meet the needs of your knowledge workers. For example, analysts can jump into complex analysis, rightly assuming that high-value data is always up to date. In addition, automated value delivery automatically distributes high-value data in ways users can act on. 

Let’s invest in a conversation.

We want to hear about your current framework and your changing needs. 

Schedule a time with Geoff Speare, our practice director:

Geoff’s Calendar
GSpeare@IronsideGroup.com
O 781-652-5758  |  484-553-1814

Get our comprehensive guide.

Learn about our proven, streamlined approach to taking your current analytics framework from where it is to where it needs to be, for less cost and in less time than you might imagine.

Download the eBook now

Check out the rest of the series.