
Dashboards in Azure Portal

Hi,

Today I will be reviewing the Azure Portal Dashboard. This feature has seen a lot of improvements lately, and it has grown to include a variety of options and use cases:
 
  1. Monitoring, with full-screen charts.
  2. Shortcuts to your most used apps.
  3. Sharing dashboards between users.
  4. Multiple dashboards, e.g. a dashboard for databases, for storage, for VMs, or per application or resource group.
I find these new features very useful and very easy to adopt.
Here are some screenshots showing them in use.
 
Image 1 - shows the options we have for dashboards:

We can add a new dashboard, edit an existing one, share it with other users, clone it, or delete it.
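 
For those who prefer scripting these operations, the same actions are exposed through the Az.Portal PowerShell module. A minimal sketch, assuming the module is installed; the resource group, dashboard name, and exported dashboard.json file below are placeholders:

# List the dashboards visible in a resource group
Get-AzPortalDashboard -ResourceGroupName "dashboards-rg"

# Create (or "clone") a dashboard from a JSON definition exported from the portal
New-AzPortalDashboard -ResourceGroupName "dashboards-rg" `
                      -Name "db-overview" `
                      -DashboardPath ".\dashboard.json"

# Delete a dashboard we no longer need
Remove-AzPortalDashboard -ResourceGroupName "dashboards-rg" -Name "db-overview"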
 
 
Image 2 - shows what happens when clicking the arrow next to the dashboard name: we see the list of my dashboards and the dashboards that were shared with me.
 
 
 
Image 3 - shows the screen after clicking on "Share": the dashboard is published to a subscription and placed in a resource group of your choice.
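 
Behind the scenes, a shared dashboard becomes a regular Azure resource of type Microsoft.Portal/dashboards, so access for other users is controlled with standard Azure RBAC. A minimal sketch (the user, dashboard, and resource group names are assumptions for illustration):

# Find the shared dashboard resource
$dash = Get-AzPortalDashboard -ResourceGroupName "dashboards-rg" -Name "db-overview"

# Grant a colleague read access so the dashboard appears in their shared list
New-AzRoleAssignment -SignInName "colleague@contoso.com" `
                     -RoleDefinitionName "Reader" `
                     -Scope $dash.Id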
 
 
 
 
Image 4 - shows a custom dashboard I created for a 10-shard database system, so that the DTU and storage metrics of all shards, plus the management DB, are visible in one place.
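 
The tiles on that dashboard are just pinned metric charts, so the same numbers can also be pulled in a script to sanity-check all shards at once. A rough sketch, assuming the shards follow a naming convention like shard01..shard10 on one logical server (all names and IDs here are made up):

# Resource ID pattern of the shard databases (placeholder subscription/server names)
$base = "/subscriptions/<sub-id>/resourceGroups/db-rg/providers/Microsoft.Sql/servers/myserver/databases"

1..10 | ForEach-Object {
    $dbId = "$base/shard{0:D2}" -f $_
    # Pull the DTU and storage percentage metrics at 5-minute grain
    Get-AzMetric -ResourceId $dbId `
                 -MetricName "dtu_consumption_percent", "storage_percent" `
                 -TimeGrain 00:05:00
}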
 
  
 
So enjoy your journey in the new dashboards world.
 
Thanks
Pini 
 
