Business intelligence is a necessity for any organization that wants to make informed decisions based on the data it collects. Unfortunately, different portions of a business often build their reports on different assumptions, leading to conflicting views and poor choices. Looker is a modern tool for building and sharing reports that makes it easy to get everyone on the same page. In this episode Daniel Mintz explains how the product is architected, the features that make it easy for any business user to access and explore their reports, and how you can use it in your organization today.
- Hello and welcome to the Data Engineering Podcast, the show about modern data management
- When you’re ready to build your next pipeline you’ll need somewhere to deploy it, so check out Linode. With private networking, shared block storage, node balancers, and a 40Gbit network, all controlled by a brand new API you’ve got everything you need to run a bullet-proof data platform. Go to dataengineeringpodcast.com/linode to get a $20 credit and launch a new server in under a minute.
- Go to dataengineeringpodcast.com to subscribe to the show, sign up for the mailing list, read the show notes, and get in touch.
- Join the community in the new Zulip chat workspace at dataengineeringpodcast.com/chat
- Your host is Tobias Macey and today I’m interviewing Daniel Mintz about Looker, a modern data platform that can serve the data needs of an entire company
- How did you get involved in the area of data management?
- Can you start by describing what Looker is and the problem that it is aiming to solve?
- How do you define business intelligence?
- How is Looker unique from other approaches to business intelligence in the enterprise?
- How does it compare to open source platforms for BI?
- Can you describe the technical infrastructure that supports Looker?
- Given that you are connecting to the customer’s data store, how do you ensure sufficient security?
- For someone who is using Looker, what does their workflow look like?
- How does that change for different user roles (e.g. data engineer vs. sales manager)?
- What are the scaling factors for Looker, both in terms of the volume of data being reported on and the number of concurrent users?
- What are the most challenging aspects of building a business intelligence tool and company in the modern data ecosystem?
- What are the portions of the Looker architecture that you would do differently if you were to start over today?
- What are some of the most interesting or unusual uses of Looker that you have seen?
- What is in store for the future of Looker?
- From your perspective, what is the biggest gap in the tooling or technology for data management today?
- Business Intelligence
- Data Warehouse
- ETL (Extract, Transform, Load)
- ELT (Extract, Load, Transform)
- Data Curation Episode
- DRY (Don’t Repeat Yourself)
- Looker Action Hub
- Netscape Navigator
- Dynamic Pricing
- Survival Analysis
- BigQuery ML
- Snowflake Data Sharehouse