A little diversion from #NoConsentNoTracking to talk about #GA4 and #LookerStudio.
There was already a good level of amazement and frustration with how #Google is handling the forced rollout of GA4.
Now the discussions about LookerStudio's native GA4 connector consuming API calls and burning through quotas like kids grabbing candy on Halloween after a two-year pandemic... this one will have lasting effects and hurt for a long time.
When you see your most loyal users and long-time advocates publicly expressing their frustration, any decent solution provider would listen and try to do damage control.
But this is Google.
It's worth noting that #DataStudio and #GoogleAnalytics had quotas too, but I don't remember ever seeing large-scale frustration over them. The GA4 quotas were announced some time ago but not enforced until recently.
In case you didn't know: when editing a dashboard, every change triggers a considerable number of GA4 API calls, which count against the hourly (1,250) and daily (25,000) token quotas. As James Standen said in Measure Slack, “Every refresh of every element consumes some tokens, once you have used up 1250 tokens, then you have to wait until the top of the hour (3:00, 4:00 etc,) and you get another 1250 tokens. Editing is particularly bad since every dim or metric you add causes a refresh. If you go over 25,000 tokens a day (from all tools not just LS) then it is locked out and refreshes at midnight PST”
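To see how quickly editing eats the quota, here's a back-of-the-envelope sketch. The 1,250/hour limit comes from the quote above; the per-request token cost is an assumed average, not a published figure, and the function name is my own.

```python
# Rough sketch of GA4 Data API token burn while editing a dashboard.
# The hourly quota is from the discussion above; tokens_per_request
# is an illustrative assumption, since actual cost varies per query.

HOURLY_QUOTA = 1_250

def edits_until_lockout(elements: int, tokens_per_request: int = 10) -> int:
    """How many edit actions fit in one hourly window, assuming every
    edit refreshes every chart element and each refresh costs an
    average of `tokens_per_request` tokens (assumed value)."""
    tokens_per_edit = elements * tokens_per_request
    return HOURLY_QUOTA // tokens_per_edit

# A 10-chart dashboard at ~10 tokens per refresh: each edit burns
# ~100 tokens, so roughly a dozen edits exhaust the hourly quota.
print(edits_until_lockout(elements=10))  # 12
```

Under those assumptions, a dozen drag-and-drop tweaks on a modest dashboard can lock you out for the rest of the hour.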
In other words, anything that depends on the GA4 API could stop working at any time during the day: any dashboard or automation leveraging GA4 data!
There are workarounds: Use the #BigQuery connector (this doesn't reduce the number of queries LookerStudio issues, and it may add BigQuery costs). Use Supermetrics or Analytics Canvas by nModal Solutions Inc., which also act as decent ETL tools to prep the data before you use it. Or consider alternatives to GA4 and LookerStudio...
#digitalanalytics