#DataSessions Season 01

It’s a wrap! That’s 12 episodes, 24 sessions and 24 speakers in a year! The pandemic presented huge problems for everyone and changed our way of life for the whole year. That includes community meetups. With the risks presented by COVID-19, social gatherings were essentially brought down to zero and in-person meetups had to stop. But as Vincent said in Collateral, “Now we gotta make the best of it, improvise, adapt to the environment, Darwin, shit happens, I Ching, whatever man, we gotta roll with it.”

That we did. We switched the monthly meetups of the Singapore SQLPASS group to virtual. At first we were only cautiously optimistic that members would still attend, but they did. Going online also presented us with a new opportunity: it allowed folks outside of Singapore to attend, and gave us access to quality speakers from across the globe.

Check out the wealth of sessions and speakers we had last year. Let us know which one is your favorite. While you’re there, click the like button and subscribe too! We can’t wait to do it all again next year! 

#DataSessions: SQL Encryption 101

This past month I organized #DataSessions for the Singapore SQLPASS group. Singapore SQLPASS is the Data Platform user group community in Singapore. For June, we held free live webinars every Tuesday and live-streamed them on Facebook and YouTube. I delivered a couple of sessions about SQL encryption, covering the available security features related to encryption in SQL Server: Transparent Data Encryption, Cell-Level Encryption, Always Encrypted, and secure enclaves. Check out the sessions below:

SQL Encryption 101: Transparent Data Encryption & Cell-Level Encryption

SQL Encryption 101: Always Encrypted (Secure Enclave)

If you want to watch more similar content, you can check out all the episodes here.

Weekly Roundup – Singapore User Group Community Events May 2020

We sure hope you are keeping safe and healthy during these days. In case you missed it or would like to revisit it, here’s what happened this week in our community. With the current situation, the events were delivered live virtually and have been uploaded to YouTube. Refer to the links below:

May 12

SG SQLPASS Meetup May 2020
Click here to check out the whole meetup, or use the links below for the individual sessions.
Power BI Paginated Reports by Peter Myers – Self-Study Kit | Presentation
Automated SQL Code Testing by Paul Ou Yang
Click here to subscribe to the SG SQLPASS Channel.

May 14

Dev Speak
This is a quarterly live stream for Developers & Tech Enthusiasts in APAC. There were nine speakers from different countries. You can find the first episode here or click the link below to jump to Ben’s session.
BlueTrace Protocol – Ben Ishyama-Levy

Coming Up:
May 19-20

Microsoft Build 2020 – Build will be digital and free this year. If you haven’t registered yet, click here.

Click here to subscribe to the Microsoft User Group Singapore Channel. We’ll be posting all the Singapore user group events to the channel. You can also find the individual channels of the Singapore MVPs and specific user groups in the channel section. Keep on learning while staying safe!

Ten Tips When Working from Home

I have been working from home for 10 years now, ever since I was fortunate enough to join Pythian in 2010. Back then, it was not a popular work setup here in Asia, especially in the fast-paced work culture of Singapore where long hours and late nights are the norm. Working remotely definitely has its own perks and set of challenges; that could be a whole article in itself, so we’re not going to dig deep into it in this post. Right now a lot of countries have initiated a lockdown of some sort due to the COVID-19 situation. Companies and industries that can afford to let their employees work from home have started to do so, to help flatten the curve and keep their employees and their loved ones healthy. A lot of people have been asking me for tips on how to work from home effectively, so here’s my list of things to note when working from home.

  1. Have a dedicated workspace – Having a home office would be the best, but there are countries where having that or even a study room is not common. Having a workspace is important to be able to work effectively at home. Find your room or your corner to work in. That way, all the stuff you need for work will be in one place and within reach. Let your family know that it will be your workspace. This bit is especially important for those who have kids at home to look after at the same time. It helps you concentrate on your work. Whether or not your “workspace” has a door, let your family know when you are working. This way they will know where and when they are not allowed to be, and won’t accidentally join your video conference call.
  2. Get a good table (and chair) – This is important especially if you are going to work from home for quite a while. The right table will be the difference in how long you can sit/stand there working without back pain later on. Some of my colleagues even invest in a good convertible sitting/standing desk to keep themselves healthy and moving. Working on the sofa or on the bed sounds nice, but believe me, you won’t last a couple of hours without having to reposition yourself because something is aching.
  3. Set your schedule – It is enticing to work at your own pace when at home. But when you do that, you are bound to procrastinate and you’ll end the day unproductive. Set your work hours and stick to them if you want to be productive.
  4. Instant messaging – One of the common questions I get right now is “How do you monitor your team’s work?” Communication is the key here. Have a tool to communicate instantly with your team apart from phone calls and emails. Have your team sign in and stay online on their instant messaging, be it Slack, Teams, Skype or what have you. You don’t need to micro-manage and check on them every hour; communication and trust are the keys. Check in once in a while, reach out and keep in touch with your team while you are working.
  5. Get good headphones (and mic) – You will be in web meetings a lot when you are working from home. A set of comfortable headphones is a must for long meetings. The same goes for having a decent microphone. You may get by with your built-in mic and speakers for a few meetings, but when you are working from home and online meetings pile up, it won’t do. Also, mute when you are not talking. Forgetting this in an office is forgivable, but when you are at home always remember to mute and unmute yourself. You don’t need someone from across the globe to hear your dog barking.
  6. Change out of your pajamas – It is tempting, and I am guilty of this sometimes, to go straight to work when you have just woken up, especially when you left something to do the day before. But when I start my day like this, laziness often creeps in. Since you are still in your PJs, you are not in the right headspace to start and go on working. Start your day as if you are going to your normal office to get that extra dose of energy. Also, you never know when you will forget to turn off that camera when joining a conference call.
  7. Take breaks – Make sure to take breaks and avoid having your lunch or tea break in front of your desk. Stand up, move around. Avoid sitting down and working for eight hours straight.
  8. Backup internet connection – This applies even if you are in a place with high-speed internet like Singapore, and it is a necessity when you are in a place with an unreliable connection. Here in Singapore, I have a 1 Gbps fibre connection, but there are still times when I have had to use my data connection as a backup. When you are working from home, staying connected becomes all the more critical, so having a fallback when your main connection doesn’t work is important.
  9. Keep distractions at a minimum – Notice I didn’t say avoid. When you are working from home, distraction is unavoidable: kids, pets, your game console, Netflix, social media, and friends and family dropping by. There are plenty of things to pull your focus away from work. Keep them at a minimum. See numbers 1 and 3 above. Let the people at your home know your schedule so they’ll know when they can disturb you.
  10. Exercise – When you are working from home, the usual walk to the office or the rush of the commute is gone. If that was your only source of physical activity before, getting some exercise is now much more critical to keeping yourself healthy.


Working from home is not for everyone and it definitely takes some time to get used to. The most important thing is to do what you need to keep yourself focused and healthy. I’ll leave you with an old picture of the various workspaces of folks at Pythian.


A post shared by Pythian (@loveyourdata) on Instagram: “An awesome perk of the #pythianlife? Working from home! Check out some of our Pythian remote setups #workfromhome”



Webinars | Singapore Data Camp 2020


March 14, 2020 – For the first time, we held Singapore Data Camp 2020 online. We were scheduled to have a free community event for the Singapore Data Platform Community at the Microsoft Singapore office: a whole Saturday full of sessions about data platform technologies. We originally had two to three tracks of hour-long sessions to be delivered by local and international speakers. But then, as we are all aware, COVID-19 happened and the responsible thing to do was to cancel any gatherings and events of this scale. We didn’t want to leave behind the more than 200 people who registered for the event, so we had to pivot and convert the conference to a virtual event.
The speakers were all up for it and the attendees were even more eager for the event. Now that it was online, our ASEAN friends and neighbors could comfortably attend from wherever they are based. We even had attendees from the US. The sad part is that since we wouldn’t be able to manage multiple tracks effectively, we had to reduce the number of sessions. My fellow organizers had to scramble to figure out Teams and how we’d handle it.
All in all, the event was a success. A big thanks to our generous and patient speakers who adapted with us to the challenges of a virtual event. Also, props to our community, who were able to keep up with the changes and stay with the event for the whole day. I didn’t think the attendance or the reach would hit the numbers we had, but people came and stayed. Going virtual even enabled us to record the sessions and share them with the members. I think we’ll likely hold the monthly chapter meetups virtually for the time being. Silver lining: members can go back and consume the content at their own pace.

Here are the sessions from the Singapore Data Camp 2020:

Webinar | Azure SQL Database: Serverless & Hyperscale

I wrote a blog post on this subject last year when Azure SQL Database Serverless was first introduced. This year I was excited to present it at Microsoft Ignite: The Tour Singapore before it was canceled because of the growing crisis that would become the COVID-19 pandemic. However, I was fortunate enough to be able to present it at SQL Saturday 922 – Wellington before more travel restrictions were put in place for various countries. I was also set to talk about it at Philippine Data Day 2020, an event I helped organize, on 7th March at Microsoft Philippines. But by then the COVID-19 situation had grown at an alarming rate, and I decided to cancel my travels to help mitigate the effects and spread of the virus.

The event went ahead as scheduled and was well attended with more than 130 in-person attendees. It was also live-streamed to 1.7K viewers. I hosted and presented virtually.

If you would like to read more about Azure SQL Database Serverless and Azure SQL Database Hyperscale, you can check out my blog post or you can watch my session below.

Fix: The Selected Subscriber does not satisfy the minimum version compatibility


You are trying to add a new subscriber to a SQL Server Publication using the SQL Server Management Studio and received an error similar to the one below:

“The selected Subscriber does not satisfy the minimum version compatibility level of the selected publication.”


You are receiving this error because the SQL Server versions of your Publisher and Subscriber instances are not the same. Replication backward compatibility requires subscribers to be within two versions above or below the publisher. For the complete information about replication backward compatibility, please click here to find out more. The wizard checks this compatibility and may incorrectly throw an error if the versions are different.


Before proceeding, kindly make sure that the versions of your publisher and subscriber fall within the compatibility matrix in the Microsoft documentation. If they don’t fall within the matrix, the pair is unsupported; you may still be able to add them, but any replication issues would not be covered by support. The version check can be overridden if you add the subscriber using T-SQL:

DECLARE @publication AS sysname;
DECLARE @subscriber AS sysname;
DECLARE @subscriptionDB AS sysname;

SET @publication = N'PublicationName';
SET @subscriber = N'SubscriberInstanceName';
SET @subscriptionDB = N'SubscriberDatabase';

-- Add a push subscription to a transactional publication.
USE [PublisherDatabase];
EXEC sp_addsubscription
    @publication = @publication,
    @subscriber = @subscriber,
    @destination_db = @subscriptionDB,
    @subscription_type = N'push';

-- Add an agent job to synchronize the push subscription.
EXEC sp_addpushsubscription_agent
    @publication = @publication,
    @subscriber = @subscriber,
    @subscriber_db = @subscriptionDB;

For further guidance on the sp_addsubscription command, check out the documentation here.


Azure SQL Database Serverless and Hyperscale

Last May, the Build 2019 conference happened in Seattle. A number of new features and capabilities were announced and shown. This week, local communities in the Asia Pacific held their own version of Build, bringing the best highlights from the conference to the local user communities. The event involved live webinars and in-person events. I delivered the updates for SQL here in Singapore, and here are the notable highlights:

Azure SQL Database Serverless

Serverless is a new pricing tier for single Azure SQL Databases under the vCore model on Gen5 compute. It is well suited for intermittent workloads that can afford a warm-up period for their resources. Checking out this feature reminds me a lot of databases with the Auto_Close parameter on. Compared to the other pricing tiers it is priced a bit higher, but it is billed per second instead of per hour and you are only charged for the compute that you actually use, which still gives you the best value for your money. You can set a minimum and maximum value for your vCores, 0.5 to 1 and 1 to 4 respectively, then set an auto-pause delay; the current lowest value is six hours.
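As a quick sketch of what picking this tier looks like, here is a hedged T-SQL example; the database name is a placeholder, and the service objective name (GP_S_Gen5_1, i.e. General Purpose, serverless, Gen5, 1 max vCore) reflects the naming at the time of writing:

```sql
-- Run in the master database of your Azure SQL logical server.
-- GP_S_Gen5_1 = General Purpose tier, Serverless compute, Gen5, 1 max vCore.
CREATE DATABASE MyServerlessDb
(
    EDITION = 'GeneralPurpose',
    SERVICE_OBJECTIVE = 'GP_S_Gen5_1'
);
```

Note that the minimum vCores and the auto-pause delay are configured through the portal, PowerShell, or the Azure CLI rather than in this statement.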

Now what will happen is your database will start with the minimum vCores that you set, and it will scale up as the workload comes in and requires it. If you have a new application and you are not certain how many resources it will require, this is a good place to test run it and figure out how much you really need, since the scaling is automatic. When there is no activity on your database for the auto-pause delay you have set (activity can be any connection or any configuration change), your database will be paused. During that time the memory is reclaimed and the CPU is released, and from then on you are not charged for compute until new activity happens on your database. Something to highlight: since the database is paused and the resources are released, the first connection to your database while it is in the paused state will fail and throw an error, but succeeding connections will be successful. The first connection is the event that signals the database to resume. So you’ll need to either issue a resume first before attempting a connection, or add connection retry logic to your application, which is actually a best practice for Azure SQL Database anyway. During my session someone asked if this will change in the future, i.e. resume the database instead of throwing an error. That I’m not sure of.

After resuming, it will start with the minimum resources that you have set and then slowly scale up as your workload goes up. I can see a lot of scenarios that will benefit from this; dev and test environments come to mind, as well as all those third-party applications I’ve seen before with the auto_close parameter set to true. For more information, you can check it out here.

Serverless vs. Provisioned Compute
For serverless, you are only billed per second for the CPU that you actually used.


Azure SQL Database Hyperscale

At the other end of the spectrum is something for the VLDBs out there: Hyperscale. The pain with any large database is doing maintenance and operational tasks on it, including adding compute and storage and any backup and restore. This actually blew my mind while checking it out. Hyperscale more or less re-architects how SQL Server works with its storage and logs to provide high performance and easier storage operations for databases more than 4 TB in size. The engine is decoupled from the storage so that scaling of both compute and storage is faster, and backup and restore operations are less painful. The promise is that a backup or restore operation will take almost the same time whether your database is 4 TB or 100 TB. In the demo, a 50 TB database was restored to a new DB in less than eight minutes, which is impressive. Hyperscale gives users a new pricing tier that can scale out beyond the current 4 TB limit of the other offerings, to 100 TB and more. In theory you can go beyond 100 TB if you have more than that.
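To show what moving to this tier looks like, here is a hedged T-SQL sketch; the database name is a placeholder, and HS_Gen5_4 (Hyperscale, Gen5 hardware, 4 vCores) is the kind of service objective name in use at the time of writing:

```sql
-- Run in the master database of your Azure SQL logical server.
-- HS_Gen5_4 = Hyperscale tier, Gen5 hardware, 4 vCores.
CREATE DATABASE MyHyperscaleDb
(
    EDITION = 'Hyperscale',
    SERVICE_OBJECTIVE = 'HS_Gen5_4'
);
```

An existing database can be moved to Hyperscale the same way with ALTER DATABASE, but note that (at the time of writing) the move is one-way.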

The architecture puts your engine in one compute node with a lot of local SSD to use for your Resilient Buffer Pool Extension (RBPEX), where your most recently accessed data will live, much like the buffer cache in traditional SQL Server. This guarantees less than 0.5 ms of data access latency. Your data files are then distributed in 128 GB chunks to what it calls Page Servers with remote SSD; this is akin to the SAN storage architecture that we have on-prem, where the storage for SQL Server sits in remote SAN boxes. Those remote SSDs are also used as RBPEX to hold each entire 128 GB chunk of data, guaranteeing less than 2.5 ms of data access latency. All the while, the actual data files reside on Azure Standard storage attached to the Page Servers. This is also how it can do backup and restore operations very fast: they are done using file snapshots of the data files, which are isolated from the compute.

It also changes the way SQL Server performs its log operations. You can discover and read more about it here. Given its architecture, it is not likely to come to the boxed SQL Server product as-is; it is, after all, developed for the cloud. I’m wondering how this will affect the next iteration of SQL Server.

Hyperscale Architecture


Worth a Mention:

Build 2019 had a lot of new features announced, so it takes time to digest everything. But buried in there are the following new SQL-related features that are worth mentioning:

1. Azure SQL Database Edge – a small database engine with time-series and data streaming with ML capabilities for your edge devices.

2. CosmosDB now offers APIs for Spark and etcd.

3. CosmosDB now provides Notebook experience for all APIs

4. You can now choose collation for Azure SQL Database Managed Instance

5. You can now choose timezone for Managed Instance instead of being stuck with UTC

6. Managed Instance now has a public endpoint that you can grant access to, for either Azure services or the internet.

For the entire list of Updates from Build you can visit this page.

Five Ways to Migrate Your On-Premise SQL Database to Azure

There are currently three deployment options for SQL Server databases in the cloud; well, four if you count Azure SQL Databases in an Elastic Pool. But Azure SQL DBs in an Elastic Pool are essentially just Azure SQL Databases grouped into a pool so you can maximize what you’re paying for and share the resources. Anyway, I digress. Right now, you can put your SQL Server database in a SQL Server on an Azure VM, in a Managed Instance, or in an Azure SQL Database. There are different considerations for choosing one over the other, but that’s a whole different blog post. In this post, we are going to focus on how you can move your on-premise SQL Server database to Azure. Here’s a neat table comparing the possible ways for the three:

Method | SQL Server in Azure VM | Managed Instance | Azure SQL Database (Singleton/Elastic Pool)
Backup & Restore | Yes | Yes | No
Detach/Attach | Yes | No | No
Mirroring/Availability Groups | Yes | No | No
Log Shipping | Yes | No | No
Replication | Yes | Yes | Yes
DMA/DMS | Yes | Yes | Yes
Bulk Load | Yes | Yes | Yes

Almost every strategy you have for migrating on-premise SQL Server will work for SQL Server in an Azure VM. This is the most common way of migrating to the cloud, and it is easy since it is essentially just moving your old server to a new server. Managed Instance also allows you to restore native SQL Server backups from an on-premise SQL Server, so it is fairly compatible and the move is more or less smooth. The challenge comes when moving your existing SQL Server databases to Azure SQL Database. Due to security reasons and infrastructure differences, you cannot restore a SQL Server backup to Azure SQL Database. This often leaves a lot of people scratching their heads on how to move their existing databases there. Here are five ways to migrate your on-premise SQL database to Azure:

  1. Data Migration Assistant/ Data Migration Service

    This is the service/tool that Azure specifically created for this task. The tool can also perform an assessment to identify feature parity and database compatibility issues in moving your on-premise SQL Server to any of the available deployment options, including Azure SQL Database. It will present you with a report identifying all of those, as well as possible fixes for some of them for the different database compatibility levels available for your chosen target. For the migration, DMA/DMS has offline and online options. Offline allows you to migrate either just the schema, or the schema and data in one shot. You would likely use this method for testing/POC, for small databases, or for applications that can afford a longer downtime while you move the data. The online option allows the on-premise database and the Azure SQL Database to continuously synchronize after you have moved the initial schema/data. This way you can give the Azure SQL Database ample time to get caught up, then decide when to cut over and start pointing your application at it. Microsoft’s docs have detailed step-by-step guides for both offline and online migration to Azure SQL Database.

  2. Replication

    You can add an Azure SQL Database as a subscriber to a transactional replication. This lets you migrate the data and objects from your on-premise SQL Server to Azure SQL Database. Since you cannot restore backups, you can only initialize this replication from a snapshot. Similar to the online migration option, the subscriber keeps synchronizing until you are ready to cut over. One thing to note: you can only replicate tables with a primary key, so if you have a number of tables without one, you’ll need to get creative or pair this with other techniques to move those tables.


  3. DACPAC

    A DACPAC is a single file containing the definitions of your database schema: tables, views, stored procedures, functions, and other database objects. It is sort of like a schema-only backup. Since this only migrates your schema, you’ll need to pair it with other techniques to bring your data to Azure SQL DB. You’d likely use this for POC/test or for new applications that do not require any data to be migrated. You can create a DACPAC file and deploy one using SSDT, sqlpackage.exe, or SSMS.


  4. BACPAC

    A BACPAC is similar to a DACPAC but encapsulates both schema and data, so you can bring your entire database to Azure SQL Database. This single file stores the information in JSON format. This strategy is applicable to small databases. You can create a BACPAC file and import one using SSDT, sqlpackage.exe, or SSMS.

  5. Generate SQL Scripts

    SQL Server Management Studio has a Generate Scripts UI: right-click a database -> Tasks -> Generate Scripts. If you’ve been working with SQL Server, you’ve probably used this before. What you may not have noticed is that when you click Advanced, you can actually control what the script provides. Specifically, you can change the Engine Type from Stand-Alone to Azure SQL Database, so the script produced is compatible with it. You can also specify, under Types of Data to Script, whether to output schema only or schema and data. So, for what it’s worth, it is your T-SQL version of a DACPAC or BACPAC. The advantage of this is that you can add to or modify the resulting T-SQL script as you need. Similar to BACPAC, if you are scripting the data, it is applicable to small databases but not to bigger ones; if you review the resulting script, it actually writes your data as individual INSERT commands, one per row.

Azure provides a very helpful tool for planning your migration: the Azure Database Migration Guide. You can change the source and destination and it will provide you with a guide to the pre-migration, migration, and post-migration tasks that you need to be aware of, as well as links to the resources and tools for each step. If you are planning a migration, be sure to check the page beforehand to assist in your planning.


Fix: SQL Backup to URL Error – The remote server returned an error: (400) Bad Request


You are trying to take a native SQL Backup to URL and you received the error below:

Msg 3271, Level 16, State 1, Line 1
A nonrecoverable I/O error occurred on file “https://MyStorageAccount.file.core.windows.net/MyContainer/Adventureworks_full_20190420.bak:”
Backup to URL received an exception from the remote endpoint. Exception Message: The remote server returned an error: (400) Bad Request..
Msg 3013, Level 16, State 1, Line 1
BACKUP DATABASE is terminating abnormally.



You are trying to save a backup file to non-blob storage, or you are not using the blob storage endpoint. Check your URL and make sure you are pointing to the correct one: SQL Server backups can only be saved to blob storage. If you are using a general-purpose storage account, you will have multiple endpoints for different usages, and you should be using the blob URL for SQL backups. You can see this in the picture below: the one in the green box is the blob storage endpoint. Using any of the other endpoints would result in a Bad Request error. The same is true in reverse; each endpoint should only be used with its corresponding file types.



Review the URL path you are trying to back up to and make sure you are using the correct endpoint for blob storage.
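For reference, here is a hedged sketch of a backup that targets the blob endpoint; the storage account, container, SAS token, and database names are all placeholders:

```sql
-- The credential name must exactly match the container URL.
CREATE CREDENTIAL [https://mystorageaccount.blob.core.windows.net/backups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token, without the leading question mark>';

-- Note .blob.core.windows.net, not .file.core.windows.net.
BACKUP DATABASE AdventureWorks
TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/AdventureWorks_full.bak';
```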

Singapore SQL PASS, PHISSUG & Azure Global Boot Camp

It’s going to be a busy week. I got the opportunity to speak about SQL Server and Azure this week for three different events.

I’ll be talking about Migrating SQL Server to Azure on Thursday, April 25, 2019 for the Singapore SQL Pass April 2019 Meetup. Register here if you want to attend. The first session is about Live Data Stream for Power BI so both sessions are very interesting.

Then on Friday night, April 26, 2019, I’ll be talking about the Deployment Options for SQL Server and when to use them for the Philippine SQL Server User Group at Microsoft Philippines. With the advent of cloud, we have a number of ways to use SQL Server. I’ll be comparing the current available options for hosting your SQL Server database and provide some guidance on how to decide which one to choose.  It’s gonna be an “all about cloud” night since the other session will be discussing how to move your ETL workloads into the cloud.  It’s been a while since PHISSUG had a meetup so it will be awesome. If you’re in Manila, you can register here to join the event.

Then lastly, the big one would be speaking at Azure Global Boot Camp 2019 at Microsoft Philippines. I’ll be talking about Migrating SQL Server to Azure and will be showing multiple ways of moving your on premise SQL database to Azure. It’s a whole day of Azure learning delivered by various experts and MVPs.

Now here’s hoping the “Demo Gods” will bless me and have a great session on all three.