Saturday, November 11, 2023

Looker and GitLab

Looker is a category-defining enterprise analytics tool from Google Cloud, and Git is the world's most widely used version control system. Looker provides an IDE to develop content in the LookML language, while GitHub and GitLab are web-based platforms that use Git to version that code and also provide advanced collaboration and DevOps features.


Recently I ran into an issue where I was able to merge code from my personal GitLab branch into the master branch (dev environment) without any trouble. But when I tried to merge my changes into the test branch for UAT, I got an error that said "The source branch is 2 commits behind the target branch". I could not understand it, as the test branch is protected and no one can modify it directly.


After learning a bit about Git and how merging and squashing work, I realised the cause. The first time I merged my changes from master (dev) to the test branch (UAT), it went fine. While creating that merge request I checked the "Squash all commits" option in GitLab, which combined all the individual commits from master into a single new commit with its own hash. The second time I created a merge request, I did not select the squash option, so GitLab compared master and test and expected commits with the same hashes to exist in both places; the squashed commit on test does not exist on master, so the branches had diverged.

Ideally I would not need all the individual commits of the master branch in the test branch, so as to keep test relatively clean, but I would want all the commits to exist in the master (dev) branch. So, when creating the merge request from my personal branch to master, I need to make sure I do not check the squash option. But when I create the merge request from master to test, I need to select the squash option so that all the commits from master go into test as a single commit with a new hash value. The challenge is that this works for the first merge into test, but on the next merge GitLab will complain, as the commit hashes on master and test no longer match.
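To see why, here is a minimal sketch of what the squash option effectively does, using plain git commands rather than the GitLab UI (branch names are illustrative, and this only approximates what GitLab does on the server):

    # master has new commits that test does not have yet
    git checkout test
    git merge --squash master        # stage all of master's changes as one change set
    git commit -m "Release to UAT"   # one brand new commit, with a brand new hash

    # test now has the same file contents as master, but none of master's
    # original commit hashes. The next non-squash merge request compares
    # histories by hash, so GitLab sees commits on test that master does not
    # have and reports that the source branch is behind the target branch.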



The solution I got working was to revert the earlier successful merge into test and then create a new merge request. This time I did not select "Squash all commits", and although it still showed the message "source is 2 commits behind the target" (those two commits being the first squashed merge with its new hash and the revert of that merge with its own hash), the merge went through. This approach causes the test branch to contain all the commits from the master branch, but it works just fine.
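In plain git terms, the fix looks roughly like this, whether done from the GitLab merge request page or the command line (the hashes and the parent number are placeholders):

    # on the test branch, undo the earlier squashed release
    git checkout test
    git revert <hash-of-squashed-commit>    # or: git revert -m 1 <merge-commit-hash>
    git push

    # then raise a fresh merge request from master to test WITHOUT the
    # "Squash all commits" option, so that master's original commits
    # (and their hashes) land in test unchanged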

#Looker #LookML #Git #Gitlab #Merge




Monday, October 2, 2023

TCS at the Forefront of Transforming Analytics on Google Cloud





Google and TCS have always been at the forefront of connecting the dots between the industry's changing needs and the latest technological advancements that can address them.

One such ongoing effort is the Google Looker campaign, where we engage closely with customers to understand their data transformation journey and match their data exploration requirements on Google Cloud.




The most critical guiding principle of a cloud-based data warehouse transformation journey is democratising insight delivery across business applications, decision makers and the organisational hierarchy. TCS Business Intelligence Modernisation on Google Cloud addresses all facets of this in a comprehensive way.




The solution takes into account the transition from the legacy BI products that customers might be using: it starts by analyzing the technical debt they have generated and identifies the opportunities the business can unlock using the capabilities available on the cloud. It helps transform the analytical landscape not just by letting users create reports and visualizations, but by embedding intelligence into business workflows and applications in a governed, secure and scalable manner.





Monday, May 22, 2023

Google Looker - Its time has come.

 

How have Business Intelligence tools evolved over the years?




Typically, advancements in hardware and software pave the way for a new breed of BI tools. When hardware was limited in terms of network bandwidth, most insight delivery happened through publishing content on a website as a one-way channel. As bandwidth improved, we saw tools in a client-server architecture, where data was packaged into multi-dimensional cubes and client tools were used for analysis. Later, as network bandwidth kept increasing and the cost of computer hardware (RAM and CPU) declined while performance rose, server-client-web tools emerged, where development was done in client-server mode and delivery happened through a web application served from the server. Then came a disruption in computer hardware, with high-performance CPUs and RAM available in portable devices, and in data storage, where columnar storage became a rage for its phenomenal performance. This paved the way for very high-performance visualisation tools that could sit on GBs of data on a laptop and let you create really complex and appealing visualisations.

Through this evolution, the product companies that gained traction typically focussed on either data storage (databases), data processing (ETL) or data analysis (DW and analytics), and each enjoyed a kind of monopoly in its niche. Customers usually had to use different proprietary products to get best-in-class features, but also had to live with limitations around integration, automation, scaling, expensive licensing costs and compatibility issues. Then came the biggest disruption of our times: cloud computing.



The cloud disruption, driven by the drastic reduction in the capex cost of applications and infrastructure, the ease of managing them and a bouquet of tools for a variety of business needs, caused a massive shift in customer priorities and drove adoption at an unimaginable speed.

The need for a BI platform with comprehensive features.

With the rise of visualisation tools that allow a business user to perform very fancy analysis on GBs of data on their laptop, several critical aspects took a hit: data governance, data leakage, consistency at the organisation level, and control over organisation-level KPI definitions. These issues called for a cloud-based BI platform that allows the organisation to control its KPIs and maintain consistency, establish a governance framework to ascertain accountability, and still give business users the flexibility to perform their analysis using organisation data.

Enter Google Looker BI Platform.


Google recently added the enterprise-grade Looker platform to its offerings, and it is complemented by Looker Studio (formerly Google Data Studio), which is a more user-centric analytical tool.

What is Looker?

Looker is a cloud-native BI platform that offers a range of analytical, integration, API-driven and governance capabilities. These enable visual data exploration, data-driven workflows, custom data applications and analytical use cases, addressing requirements from the organisation level down to a single business user. Below are the key features of Looker.

LookML - The magic sauce of Looker, a modelling language that writes optimised SQL for more than 45 database dialects. It also enables the business to centrally define business logic in a version-controlled environment. LookML code organised into views, models and explores helps structure that logic in an intuitive way.
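For a flavour of the language, below is a minimal LookML sketch; the table, field and explore names are made up for illustration, and a real project would also declare a connection and includes in the model file.

    # orders.view.lkml
    view: orders {
      sql_table_name: demo_schema.orders ;;

      dimension: order_id {
        primary_key: yes
        type: number
        sql: ${TABLE}.order_id ;;
      }

      dimension: status {
        type: string
        sql: ${TABLE}.status ;;
      }

      measure: total_revenue {
        type: sum
        sql: ${TABLE}.amount ;;
      }
    }

    # declared in a model file
    explore: orders {}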

Report Development: Looker offers the Explore UI for performing analysis and saving a Look that answers a business question. Users can create multiple Looks and then combine them into a dashboard. Users can also connect Looker Studio (formerly Google Data Studio) to a Looker Explore using the Looker connector and build reports in the familiar Looker Studio interface.

Embedded: When reports are developed in Looker, a user normally needs to log in to the Looker UI to browse them. Looker's embed feature provides a way to integrate reports inside existing web applications, so users get a seamless experience and consume insights within the applications they already use regularly.

Looker APIs: APIs are first-class citizens in the Looker ecosystem. They enable the development of completely customised data applications and allow organisations to productize their data by packaging it into a custom data application and monetizing it through a subscription model.
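As a rough sketch of what API-driven access looks like (the instance URL, credentials and look id below are placeholders, and exact endpoints and authentication details vary by Looker version, so treat this as illustrative and verify against the API documentation):

    # authenticate with an API client id/secret to obtain an access token
    curl -s -X POST "https://mycompany.cloud.looker.com/api/4.0/login" \
         -d "client_id=XXXX" -d "client_secret=YYYY"

    # use the returned access_token to run a saved Look and fetch its data as JSON
    curl -s "https://mycompany.cloud.looker.com/api/4.0/looks/42/run/json" \
         -H "Authorization: token <access_token>"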

Looker Marketplace: The Looker Marketplace offers a range of model blocks, charts, applications and actions that enable fast-tracked development, accelerated AI/ML adoption and a connected experience between Looker and workforce tools. It also provides an ecosystem where different teams can create and publish useful components such as charts, data models and custom applications, which can easily be used in Looker to build advanced and intuitive solutions for business needs.

What is Looker Studio?

Looker Studio is the renamed version of the good old Data Studio. It has a very intuitive and user-friendly interface that can be used to create highly customised and interactive reports and dashboards. As it is more user-centric and can connect to multiple data sources, including local files, users can use the Looker connector to reach the Explores defined in Looker and build reports and dashboards on top of Looker's governed data layer, while also blending in un-governed/un-modelled data.


The unique features of the Looker architecture set it apart from the current breed of BI tools: it allows the creation of an organisation-centric business data layer while still giving individual business users the flexibility to leverage that central layer and build their own complex analysis on top of it. That combination is really the need of the hour.

#Looker #LookerStudio #googlecloud #DataAndAnalytics #GoogleLooker