
Deadlocks in SQL Azure V12

Hi All
 
Today I want to explain how to find and troubleshoot deadlocks in SQL Azure V12.
 
In V2 we could run this query to find our deadlocks:
 
SELECT * FROM sys.event_log WHERE event_type = 'deadlock'
 
This query returns XML, which we could save as an .xdl file and open to see the deadlock graph.
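 
For example, a minimal sketch of pulling out the graph (assuming, as I recall, that the deadlock XML lives in the additional_data column of sys.event_log):
 
SELECT database_name,
       start_time,
       CAST(additional_data AS XML) AS deadlock_xml -- open this value and save it as a .xdl file
FROM sys.event_log
WHERE event_type = 'deadlock'
ORDER BY start_time DESC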
 
Thomas LaRock (@SQLRockstar) covered this approach in two posts on his blog.
 
Now in V12 this feature is no longer supported, so how do you get the deadlock data?
 
Microsoft gave us this query (run it against the master database):
 
 
SELECT TOP 100 *,
    CAST(event_data AS XML).value('(/event/@timestamp)[1]', 'datetime2') AS [timestamp],
    CAST(event_data AS XML).value('(/event/data[@name="error"]/value)[1]', 'INT') AS error,
    CAST(event_data AS XML).value('(/event/data[@name="state"]/value)[1]', 'INT') AS state,
    CAST(event_data AS XML).value('(/event/data[@name="is_success"]/value)[1]', 'BIT') AS is_success,
    CAST(event_data AS XML).value('(/event/data[@name="database_name"]/value)[1]', 'sysname') AS database_name,
    CAST(event_data AS XML) AS event_data_xml
FROM sys.fn_xe_telemetry_blob_target_read_file('el', null, null, null)
WHERE object_name = 'database_xml_deadlock_report'
ORDER BY CAST(event_data AS XML).value('(/event/@timestamp)[1]', 'datetime2') DESC

 

 
The result includes the raw deadlock XML, which we can query directly or save as an .xdl file for the graphical deadlock view.
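 
For example, a minimal sketch of extracting just the deadlock graph for saving as .xdl (assuming the event exposes it in an xml_report field, as the boxed-product database_xml_deadlock_report event does):
 
SELECT CAST(event_data AS XML).query('(/event/data[@name="xml_report"]/value/deadlock)[1]') AS deadlock_graph -- open the result and save it as a .xdl file
FROM sys.fn_xe_telemetry_blob_target_read_file('el', null, null, null)
WHERE object_name = 'database_xml_deadlock_report'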
 
Note that the events are written with some delay, so a fresh deadlock may not show up immediately.
 
This is a new, undocumented function that (as I understand it) reads the event log from blob storage, so this is actually our first look at Extended Events in SQL Azure.
 
Enjoy, and thanks to Geri Resef, who helped me with this.
 
Pini
 
 
