
Tech News - Databases

233 Articles

Monitoring and Tracking SQL Server Deadlock process from Blog Posts - SQLServerCentral

Anonymous
08 Nov 2020
1 min read
Introduction: A deadlock is one of the issues that can happen on any SQL Server. In this article I will not explain what a deadlock is or how to solve it; the main purpose of this article is to share the scripts I am using for monitoring this kind of process, let … Continue reading Monitoring and Tracking SQL Server Deadlock process The post Monitoring and Tracking SQL Server Deadlock process appeared first on SQLServerCentral.
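The article's own scripts sit behind the "Continue reading" link above. For reference, deadlock reports can also be pulled from the built-in system_health Extended Events session; the query below is a minimal sketch of that general approach, not the article's script:

-- A minimal sketch (not the article's own script): read xml_deadlock_report
-- events captured by the default system_health Extended Events session.
SELECT CAST(xet.event_data AS XML) AS DeadlockReport
FROM sys.fn_xe_file_target_read_file('system_health*.xel', NULL, NULL, NULL) AS xet
WHERE xet.object_name = 'xml_deadlock_report';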

What’s Your Vision for PASS? from Blog Posts - SQLServerCentral

Anonymous
07 Nov 2020
2 min read
The PASS election slate was released for 2020. The candidates and their statements are:

- Hamish Watson (@thehybriddba)
- Joey D’Antoni (@jdanton) – Community Matters–Why I’m Running for the PASS Board of Directors
- Jose Rivera (@SQLConqueror)
- Lori Edwards (@loriedwards)
- Matt Gordon (@sqlatspeed) – Why I Am Running for PASS Board
- Robert Fonseca (@roberto_mct)
- Steph Locke (@TheStephLocke) – I’m Running for the PASS Board

Not everyone has a statement, though you can read their bios and applications on the PASS site. PASS is at a turning point as an organization, with the pandemic and the move of its main fundraiser to a virtual event. Across the last two years, I’ve been a member of PASS, a supporter, and a critic, with my own thoughts and ideas about the organization. However, I’m not sure where PASS should go. There is a lot of enthusiasm for the organization from some, and no shortage of criticism. I see both sides, but what I’m not sure about is what I want PASS to be or do in the future. I suspect the organization will change, but into what? I’d ask that each of the candidates outline a vision for what PASS should be. Not tactical specifics or complaints, but ideally, what do we want from the organization? What would represent the members and be a great community group? Hopefully we’ll see something before Wednesday, when voting starts. The post What’s Your Vision for PASS? appeared first on SQLServerCentral.

Daily Coping 6 Nov 2020 from Blog Posts - SQLServerCentral

Anonymous
06 Nov 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. Today’s tip is to get outside and observe the changes in nature around you. I was sick recently. Worn out, tired, and took a few days off. During that time, I was inside, didn’t go out to the store, and really didn’t work very much. I didn’t really go outside for a couple days, which is unusual for me. When I felt better, I told my wife I’d help her get a few things done. Walking outside, feeling the change in weather, which has been dramatic a few days this fall in Denver, and seeing animals was good for the soul. I even took a little video and enjoyed a moment with baby Phoebe. This makes most things better. The post Daily Coping 6 Nov 2020 appeared first on SQLServerCentral.

[Solved] SQL Backup Detected Corruption in the Database Log from Blog Posts - SQLServerCentral

Anonymous
06 Nov 2020
5 min read
Summary: In this article, we will discuss the ‘SQL Backup Detected Corruption in the Database Log’ error, describe the reason behind it, and walk through manual workarounds to resolve it. The article also explains an alternative solution that can be used to restore the database and its transaction log backup when the manual solutions fail.

When performing a transaction log backup for a SQL database, to restore the database after network maintenance or in the event of a crash, you may find the backup job failed with the following error:

Backup failed for Server xxx (Microsoft.SqlServer.SmoExtended)
System.Data.SqlClient.SqlError: BACKUP detected corruption in the database log. Check the errorlog for more information. (Microsoft.SqlServer.Smo)

The error message clearly indicates that the transaction log is damaged (corrupted). Checking the SQL errorlog for more details on the error shows:

2020-11-01 13:30:40.570 spid62 Backup detected log corruption in database TestDB. Context is Bad Middle Sector. LogFile: 2 ‘D:\Data\TestDB_log.ldf’ VLF SeqNo: x280d VLFBase: x10df10000 LogBlockOffset: x10efa1000 SectorStatus: 2 LogBlock.StartLsn.SeqNo: x280d LogBlock.StartLsn.
2020-11-01 13:30:40.650 Backup Error: 3041, Severity: 16, State: 1.
2020-11-01 13:30:40.650 Backup BACKUP failed to complete the command BACKUP DATABASE TestDB. Check the backup application log for detailed messages.

However, the FULL database backup completed successfully, and even running a DBCC CHECKDB integrity check didn’t find any errors.

What Could Have Caused the SQL Transaction Log Backup to Fail?

A transaction log (T-log) backup allows restoring a database to a certain point in time, before the failure occurred. It does so by backing up all the transaction logs created since the last log backup, including the corrupt portion of the T-log. This causes the backup to fail. However, a FULL database backup only has to back up the beginning of the last active part of the T-log at the time the backup is taken. Also, DBCC CHECKDB requires the same amount of log as the FULL database backup, at the time the db snapshot was generated. This is why the full backup executed successfully and no errors were reported by DBCC CHECKDB.

Manual Workarounds to Backup Detected Log Corruption in SQL Database

Following are the manual workarounds you can apply to resolve the SQL backup log corruption issue:

Workaround 1: Change the SQL Recovery Model from FULL to SIMPLE

To fix the ‘SQL Server backup detected corruption in the database log’ issue, try switching the database to the SIMPLE recovery model. Switching to the SIMPLE recovery model will ignore the corrupted portion of the T-log. Subsequently, change the recovery model back to FULL and execute the backups again. Here are the steps you need to perform to change the recovery model:

Step 1: Make sure there are no active users by stopping all user activity in the db.
Step 2: Change the db from the FULL to the SIMPLE recovery model. To do so, follow these steps:
  1. Open SQL Server Management Studio (SSMS) and connect to an instance of the SQL Server database engine.
  2. From Object Explorer, expand the server tree by clicking the server name.
  3. Next, depending on the db you are using, select a ‘user database’ or choose a ‘system database’ by expanding System Databases.
  4. Right-click the selected db, and then select Properties.
  5. In the Database Properties dialog box, click Options under ‘Select a page’.
  6. Choose the Simple recovery model from the ‘Recovery model’ list box, and then click OK.
Step 3: Now set the db back to the FULL recovery model by following the same steps as above, this time selecting Full as your recovery model from the list box.
Step 4: Perform a FULL database backup again.
Step 5: Take log backups again.

Hopefully, performing these steps will help you perform the transaction log backup without any issue.

Note: This solution won’t be feasible if you’re using database mirroring for the database for which you have encountered the ‘backup detected log corruption’ error. That’s because, in order to switch to the SIMPLE recovery model, you will need to break the mirror and then reconfigure the db, which can take a significant amount of time and effort. In this case, try the next workaround.

Workaround 2: Create the Transaction Log Backup Using the Continue on Error Option

To execute the backup of the T-log without any error, try running the log backup of the SQL database with the CONTINUE AFTER ERROR option. You can either run the option directly from SSMS or execute a T-SQL script. The steps to run the ‘Continue on error’ option from SSMS are as follows:

Step 1: Run SSMS as an administrator.
Step 2: From the ‘Back Up Database’ window, click Options under ‘Select a page’ on the left panel. Then, select the ‘Continue on error’ checkbox under the Reliability section.
Step 3: Click OK. Now, run the log backup and check whether it completes without SQL Server detecting an error in the database.

Ending Note

The above-discussed manual solutions won’t work if the transaction log is missing or damaged, putting the database in suspect mode. In that case, you can try restoring the database from backups or run Emergency-mode repair to recover the db from suspect mode. However, none of the above solutions might work in case of severe database corruption in SQL Server. Also, implementing the ‘Emergency-mode repair’ method involves a risk of data loss. Using specialized SQL database repair software such as Stellar Repair for MS SQL can help you repair a severely corrupted database and restore it to its original state in just a few steps. The software helps in repairing both SQL database MDF and NDF files. Once the MDF file is repaired, you can create a transaction log file of the database and back it up without encountering any error. www.PracticalSqlDba.com The post [Solved] SQL Backup Detected Corruption in the Database Log appeared first on SQLServerCentral.
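For reference, both workarounds can also be expressed in T-SQL. The sketch below uses the database name TestDB from the errorlog above; the backup paths are hypothetical:

-- Workaround 1 in T-SQL: switch to SIMPLE and back to FULL
-- (database name from the article; backup paths are hypothetical).
ALTER DATABASE TestDB SET RECOVERY SIMPLE;
ALTER DATABASE TestDB SET RECOVERY FULL;

-- Re-establish the log chain with a fresh FULL backup, then resume log backups.
BACKUP DATABASE TestDB TO DISK = N'D:\Backup\TestDB_Full.bak';
BACKUP LOG TestDB TO DISK = N'D:\Backup\TestDB_Log.trn';

-- Workaround 2 in T-SQL: force the log backup past the damaged portion.
BACKUP LOG TestDB TO DISK = N'D:\Backup\TestDB_Log.trn'
WITH CONTINUE_AFTER_ERROR;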

Microsoft Power BI Quick Start Guide – Second Edition from Blog Posts - SQLServerCentral

Anonymous
06 Nov 2020
1 min read
Earlier this week I announced the release of a new Power Platform book. While I’m super excited about that book, I’m also excited to announce the second edition of the Power BI Quick Start Guide. As you know, Power BI is in a constant state of change, so this second edition is not only an update but also introduces new topics like Power BI dataflows and several of the new AI features that have been introduced into Power BI. I hope you enjoy this new book, which you can find here! The post Microsoft Power BI Quick Start Guide – Second Edition appeared first on SQLServerCentral.

Adding a user to an Azure SQL DB from Blog Posts - SQLServerCentral

Anonymous
05 Nov 2020
3 min read
Creating a user is simple, right? Yes and no. First of all, at least in SSMS, it appears you don’t have a GUI. I don’t use the GUI often unless I’m working on a T-SQL command I haven’t used much before, but this could be a major shock for some people. I right-clicked on Security under the database and went to New -> User, and a new query window opened up with the following:

-- ========================================================================================
-- Create User as DBO template for Azure SQL Database and Azure SQL Data Warehouse Database
-- ========================================================================================
-- For login <login_name, sysname, login_name>, create a user in the database
CREATE USER <user_name, sysname, user_name>
FOR LOGIN <login_name, sysname, login_name>
WITH DEFAULT_SCHEMA = <default_schema, sysname, dbo>
GO

-- Add user to the database owner role
EXEC sp_addrolemember N'db_owner', N'<user_name, sysname, user_name>'
GO

Awesome! I did say I preferred code, didn’t I? I am noticing a slight problem, though: I don’t actually have a login yet. So I look in Object Explorer, and there is no instance-level Security tab. On top of that, when I try to create a login with code I get the following error:

Msg 5001, Level 16, State 2, Line 1
User must be in the master database.

Well, ok. That’s at least a pretty useful error. When I connect to the master database in SSMS (remember, you can only connect to one database at a time in Azure SQL DB), I do see a Security tab at the instance level and get the option to create a new login. Still a script, but that’s fine.

-- ======================================================================================
-- Create SQL Login template for Azure SQL Database and Azure SQL Data Warehouse Database
-- ======================================================================================
CREATE LOGIN <SQL_login_name, sysname, login_name>
WITH PASSWORD = '<password, sysname, Change_Password>'
GO

So in the end you just need to create your login in master and your user in your user database. But do you really need to create a login? No, in fact you don’t. Azure SQL DBs act like partially contained databases when it comes to users, i.e., with one of these commands you can create a user that does not require a login and authenticates through the database:

CREATE USER Test WITH PASSWORD = '123abc*#$' -- SQL Server ID
CREATE USER Test FROM EXTERNAL PROVIDER -- Uses AAD

That said, I still recommend using a login in master. You can still specify the SID, which means that if you are using a SQL ID (SQL holds the password), you can create a new DB and associate it to the same login without knowing the password. The post Adding a user to an Azure SQL DB appeared first on SQLServerCentral.
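As a rough illustration of that last point, here is a minimal sketch; the login name, password, and SID value below are all hypothetical:

-- Hypothetical example: in master, create the login with an explicit SID so the
-- same login can later be re-created elsewhere without knowing the password
-- (the SID below is a made-up 16-byte value).
CREATE LOGIN TestLogin
WITH PASSWORD = 'Str0ng!Passw0rd#2020',
     SID = 0x241C11948AEEB749B0D22646DB1A19F2;
GO
-- Then, connected to the user database, map a user to that login.
CREATE USER TestUser FOR LOGIN TestLogin;
GO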

Daily Coping 5 Nov 2020 from Blog Posts - SQLServerCentral

Anonymous
05 Nov 2020
1 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. Today’s tip is to set a goal that links to your sense of purpose in life. My purpose in life is to help others. I do this in a few ways, but one of them that I’ve been wanting to tackle is to find a new way to volunteer locally. Today, I’m setting a reminder to take my volunteer day from Redgate this year and spend it with a local group. The post Daily Coping 5 Nov 2020 appeared first on SQLServerCentral.

Redgate SQL Data Masker Refreshing Schema from Blog Posts - SQLServerCentral

Anonymous
04 Nov 2020
2 min read
This is a quick blog to help me remember what is going on with the Data Masker product. This is for the SQL Server version, but I believe the Oracle one is very similar. I added a new column to a table, and I had a masking plan already built. How do I get my masking plan to show the new column? Here is my masking plan: I added a new column to the DCCheck table, which is under rule 01-0026. If I open that mask and add a new column, I get this, but I can’t expand the dropdown. All the columns in this table are masked, and Data Masker doesn’t know about the new one. I need an updated schema, as the rules do not update in real time. To get this to work, I need to return to the masking plan and double-click the controller at the top. This is the schema manager for my set of rules. Note: if I mask different schemas, I need different controllers. Once this opens, I can see my connection to a database. In my case, I’m building this in dev areas, so it’s pointed to the QA environment. If I click the “Tools” tab at the top, I see lots of options, one of which is to refresh. Once I pick that one, I have a bunch more options, which gets confusing, but I can click “refresh all tables” at the top, leaving everything else alone. Once that’s done, I get a note. Once I get this, I can return to my rule, and when I add a new column, I see it listed. This isn’t the smoothest flow, but Data Masker isn’t something that is likely to be in constant use. For many of us, adding new schema items is relatively rare, so we can adjust our plans as needed. The one good thing is that I can easily find where I need to add a column, as opposed to digging through a number of .SQL scripts. The post Redgate SQL Data Masker Refreshing Schema appeared first on SQLServerCentral.

T-SQL Tuesday Retrospective #003: Relationships from Blog Posts - SQLServerCentral

Anonymous
04 Nov 2020
1 min read
In my quest to respond to every T-SQL Tuesday since its dawn at the end of 2009, it was only a matter of time before Rob Farley’s name came up. I first met Rob at his 40th birthday party, many (many!) years ago at the PASS Summit. He of course has no recollection of this… Continue reading T-SQL Tuesday Retrospective #003: Relationships The post T-SQL Tuesday Retrospective #003: Relationships appeared first on Born SQL. The post T-SQL Tuesday Retrospective #003: Relationships appeared first on SQLServerCentral.

Daily Coping 4 Nov 2020 from Blog Posts - SQLServerCentral

Anonymous
04 Nov 2020
2 min read
I started to add a daily coping tip to the SQLServerCentral newsletter and to the Community Circle, which is helping me deal with the issues in the world. I’m adding my responses for each day here. Today’s tip is to find a new perspective on a problem you face. I’m struggling with some motivation lately. This is conference season, and even with the pandemic, it seems like I have no shortage of deadlines, but I also seem to be finding lots of content to watch. I find myself constantly inspired by seeing what others are doing with technology. And yet, I am struggling to actually do something. Mostly, while I dabble, I find myself struggling to focus and generate some feeling of accomplishment. Really, I feel like I’m not really getting anywhere with a huge variety of new things happening around me. I’m falling behind. I bet many people feel the same way, but really, I am trying to turn this around. I am learning a few new things, and getting the chance to work with tools, and so I have turned my view. Rather than looking at the large group of things I’m missing, and what percentage I am not working with, I am trying to focus on the thing I am working on, and valuing the skills I do build. The post Daily Coping 4 Nov 2020 appeared first on SQLServerCentral.

PASS Virtual Summit 2020: I'm (Virtually) Presenting from Blog Posts - SQLServerCentral

Anonymous
04 Nov 2020
3 min read
I'll be presenting at 7 AM Central Time in the first timeslot of the main three-day PASS Virtual Summit 2020 conference. It's long been a goal of mine to present at the best of all international SQL conferences, and this is the first year it happened for me, so I'm thrilled to be a part of it. It's not too late to register for the all-online event, with the same great quality content as always, at a fraction of the usual cost of going to Seattle. Like many (but not all) presentations at PASS Virtual Summit, my 75-minute presentation will feature roughly 60 minutes of pre-recorded (and painstakingly edited) content, with the rest of the time available for live Q&A with the speaker. My presentation will cover a lot of important foundational material about security, accounts, and authentication. For folks new to SQL Server security design and administration, this will be a great foundation for your learning. For those experienced in SQL admin, this will be a thorough evaluation of what you know, or thought you knew, and maybe some gaps in what you know. I think there is content in here to interest everyone in the SQL career lifecycle, and I’m not just guessing at that: I got my first DBA job in 2006, and I’ve been giving a presentation on security basics at User Groups and SQLSaturdays for years; it was one of the first topics I started speaking technically on a decade ago. As my own experience has deepened and broadened throughout my career, so has the content I build into this presentation. So I’m going to start basic and build quickly from there, focusing my content around common hurdles and tasks that database administrators face, in the hopes of deepening or broadening your experience as well. I'm setting the stage for a good conversation around security at PASS Virtual Summit 2020, especially around how permissions behave inside each database, how you can design database security, and the relationships between logins, users, and databases. My session is one of a four-part Learning Pathway on security. We worked together over the past four months to make sure we're presenting a thorough conversation on security. In subsequent presentations over the next three days:

- John Morehouse is presenting on Understanding Modern Data Encryption Offerings for SQL Server, including a lot more information on all the various sorts of encryption, plus some important security features of SQL Server like Transparent Data Encryption and Always Encrypted.
- Jeff Renz is presenting on Securing Your Data In Azure: Tips and Tricks, which covers Advanced Threat Detection, the Azure Key Vault, cloud secure connection strings, and certified data sets for security in PowerBI.
- Ed Leighton-Dick will cap it off with a presentation on Building a Security Dashboard for SQL Server, talking about when certs expire and what that actually means, more on Azure Advanced Threat Detection, and SQL Audit and monitoring.

The post PASS Virtual Summit 2020: I'm (Virtually) Presenting appeared first on SQLServerCentral.

Consider the Benefits of Powershell for Developer Workflows from Blog Posts - SQLServerCentral

Anonymous
04 Nov 2020
7 min read
Who Am I Talking To

You use bash or python. PowerShell seems wordy, extra verbose, and annoying. It’s a Windows thing, you say… why would I even look at it. Pry bash out of my fingers if you dare (probably not for you).

What PowerShell Is

- The best language for automating Windows… period.
- A great language for development tooling and productivity scripts.
- One of the best languages for automation with interactivity. Python is fantastic, but its REPL isn’t meant for the same interactivity you get with PowerShell. The PowerShell prompt is sort of like mixing Python and fish/bash in a happy marriage.
- A rich language (not just scripting) for interacting with AWS using AWS.Tools.
- A rich object-oriented pipeline that can handle very complex actions in one-liners.
- Mostly intuitive and consistent for command discovery. The Verb-Noun verbosity, a common complaint from bash pros, is the point: discoverability. tar, for example, is a bit harder to figure out than Expand-Archive -Path foo -DestinationPath foo.
- A language with a robust testing framework for unit, integration, infrastructure, or any other kind of testing you want! (Pester is awesome.)

What PowerShell Isn’t

- Python
- Good at data science
- Succinct
- Meant for high concurrency
- Good at GUIs… but come on, we’re devs… GUIs make us weak
- A good webserver
- Lots more

The Right Tool for the Job

I’m not trying to tell you never to use bash. It’s what you know, great! However, I’d say if you haven’t explored it, once you get past some of the paradigm differences, there is a rich, robust set of modules and features that can improve most folks’ workflow.

Why Even Consider PowerShell

As I’ve interacted more and more with folks coming from a mostly Linux background, I can appreciate that considering PowerShell seems odd. It only recently became cross-platform in the lifecycle of things, so it’s still a new thing to most. Having been immersed in the .NET world and now working on macOS and using Docker containers running Debian and Ubuntu (sometimes Alpine Linux), I completely get that it’s not even in most folks’ purview. Yet I think it’s worth considering for developer workflows: there is a lot to gain with PowerShell for the more complex build and development workflows because of the access to .NET. No, it’s not “superior”. It’s different. Simple cli bash scripting is great for many things (thus the prior article about improving development workflow with Task, which uses shell syntax). The fundamental difference between bash and PowerShell is really text vs objects, in my opinion. This is actually where much of the value comes in when considering what to use.

Go For CLI Tools

Go provides a robust cross-platform single binary with autocomplete features and more. I’d say that for things such as exporting pipelines to Excel and other “automation” actions, it’s far more work in Go. Focus Go on tooling where the extra plumbing and stronger typing give benefit rather than just overhead. AWS SDK operations, serverless/lambda, APIs, complex tools like Terraform, and more fit the bill perfectly and are a great use case.

Scenario: Working with AWS

If you are working with the AWS SDK, you are working with objects. This is where the benefit comes in over cli usage. Instead of parsing json results with tools like jq to choose arrays, you can interact with the object by named properties very easily.

$Filters = @([Amazon.EC2.Model.Filter]::new('tag:is_managed_by', 'muppets'))
$InstanceCollection = (Get-EC2Instance -Filter $Filters).Instances |
    Select-PSFObject InstanceId, PublicIpAddress, PrivateIpAddress, Tags,
        'State.Code as StateCode', 'State.Name as StateName' -ScriptProperty @{
            Name = @{
                get = { $this.Tags.GetEnumerator().Where{ $_.Key -eq 'Name' }.Value }
            }
        }

With this $InstanceCollection variable, we now have access to an easily used object with named properties.

- Give me all the names of the EC2 instances: $InstanceCollection.Name
- Sort those: $InstanceCollection.Name | Sort-Object (or use alias shorthand such as sort)
- For each of these results, start the instances: $InstanceCollection | Start-EC2Instance

Beyond that, we can do many things with the rich eco-system of prebuilt modules. Here are some examples of rich one-liners using the power of the object-based pipeline:

- Export to json: $InstanceCollection | ConvertTo-Json -Depth 10 | Out-File ./instance-collection.json
- Toast notification on results: Send-OSNotification -Title 'Instance Collection Results' -Body "Total results returned: $($InstanceCollection.Count)"
- Export to Excel with a table: $InstanceCollection | Export-Excel -Path ./instance-collection.xlsx -TableStyle Light8 -TableName 'FooBar'
- Send a rich PagerDuty event to flag an issue: Send-PagerDutyEvent -Trigger -ServiceKey foo -Description 'Issues with instance status list' -IncidentKey 'foo' -Details $HashObjectFromCollection
- Use a cli tool to flip to yaml (you can often use native tooling without much issue!): $InstanceCollection | ConvertTo-Json -Depth 10 | cfn-flip | Out-File ./instance-collection.yml

Now build a test (mock syntax) that passes or fails based on the status of the instances:

Describe "Instance Status Check" {
    Context "Instances That Should Be Running" {
        foreach ($Instance in $InstanceCollection) {
            It "should be running" {
                $Instance.StateName | Should -Be 'Running'
            }
        }
    }
}

Now you have a test framework that could validate operational issues across hundreds of instances, or just unit test the output of a function.

Exploring the Object

I did this comparison once for a coworker; maybe you’ll find it useful too!

"Test Content" | Out-File ./foo.txt
$Item = Get-Item ./foo.txt
## Examine all the properties and methods available. It's an object.
$Item | Get-Member

This gives you an example of the objects behind the scenes. Even though your console will only return a small set of properties back, the actual object is a .NET object with all the associated methods and properties. This means that Get-Item has access to properties such as the base name, full path, directory name, and more. You can access the actual datetime type of CreationTime, allowing you to do something like:

($Item.LastAccessTime - $Item.CreationTime).TotalDays

This uses two date objects and gives you the relevant duration methods by performing math on them. The methods available could be anything, such as $Item.Encrypt(), $Item.Delete, $Item.MoveTo, and more, all provided by the .NET namespace System.IO.FileInfo. I know you can do many of these things in bash as well, but I’d wager the object pipeline here provides a very solid experience for more complex operations based on the .NET framework types available.

Wrap Up

This was meant to give a fresh perspective on why some folks have benefited from PowerShell over shell scripting. It’s a robust language that can give a rich reward for automation/build/cloud automation if you invest some time to investigate. For me, the basic “right tool for the job” breakdown would look like this:

- data: python
- serverless: go & python (powershell can do it too, but prefer the others)
- web: go & python
- basic cli stuff: shell (using Task, which uses shell syntax)
- complex cli project tasks: powershell & go
- automation/transformation: powershell & python
- high concurrency, systems programming: go

Maybe this provided a fresh perspective on why PowerShell might benefit even the diehard shell scripters out there, and maybe it will help convince you to take the plunge and give it a shot. #development #cool-tools #golang #automation The post Consider the Benefits of Powershell for Developer Workflows appeared first on SQLServerCentral.

Pro Microsoft Power Platform: Solution Building for the Citizen Developer from Blog Posts - SQLServerCentral

Anonymous
04 Nov 2020
2 min read
Over the last several months a team of excellent authors, including myself, has been writing a very exciting new book about Microsoft’s Power Platform. We approached the publishing company Apress with an idea to produce a book that really tells the full story of how the Power Platform works together. As I’m sure you know, the Power Platform is actually four tools in one: Power Apps, Power Automate, Power BI, and Power Virtual Agent. We found there were few books on the market that attempted to tell this full story. This book is designed for the “Citizen Developer”, to help you feel confident in developing solutions that leverage the entire Power Platform. Are you a Citizen Developer? Citizen Developers are often business users with little or no coding experience who solve problems using technologies usually approved by IT. The concept of business users solving their own problems is not new; what is new is the concept of doing it with IT’s blessing. Organizations have realized the power of enabling Citizen Developers to solve smaller-scale problems so IT can focus on larger, more difficult ones. I hope you enjoy this new book and find it helpful in your Power Platform journey! The post Pro Microsoft Power Platform: Solution Building for the Citizen Developer appeared first on SQLServerCentral.

Improving Local Development Workflow With Go Task from Blog Posts - SQLServerCentral

Anonymous
03 Nov 2020
6 min read
Workflow Tooling

Development workflow, especially outside of a full-fledged IDE, is often a disjointed affair. DevOps-oriented workflows that combine cli tools such as terraform, PowerShell, bash, and more all add complexity to getting up to speed and productive. Currently, there is a variety of frameworks to solve this problem. The “gold standard” most are familiar with in the open-source community would be Make.

Considering Cross-Platform Tooling

This is not an exhaustive list; it’s focused on my journey, not saying that your workflow is wrong. I’ve looked at a variety of tooling, and the challenge has typically been that most are very unintuitive and difficult to remember. Make… it’s everywhere. I’m not going to argue the merits of each tool as I mentioned, but just bring up that while CMake is cross-platform, I’ve never considered Make a truly cross-platform tool that is first class in both environments.

InvokeBuild & Psake

In the Windows world, my preferred framework would be InvokeBuild or Psake. The thing is, not every environment will always have PowerShell, so I’ve wanted to experiment with a minimalistic task framework for intuitive local usage in a project when the tooling doesn’t need to be complex. While InvokeBuild is incredibly flexible and intuitive, there is an expectation of familiarity with PowerShell to fully leverage it. If you want a robust framework, I haven’t found anything better; I highly recommend examining it if you are comfortable with PowerShell. You can generate VSCode tasks from your defined scripts and more. InvokeBuild & Psake aren’t great for beginners just needing to run some tooling quickly, in my experience. The power comes with additional load for those not experienced in PowerShell. If you need to interact with the AWS.Tools SDK, complete complex tasks such as generating objects from parsing AST (Abstract Syntax Trees), and the like, then I’d lean towards InvokeBuild. However, if you need to initialize some local dependencies, run a linting check, format your code, get the latest from the main branch and rebase, and other common tasks, what option do you have to get up and running more quickly?

Task

I’ve been pleasantly surprised by this cross-platform tool based on a simple yaml schema. It’s written in Go, and as a result it’s normally just a single line or two to install on your system. Here’s why you might find some value in examining it:

- Cross-platform syntax using the go interpreter sh
- A very simple yaml schema to learn
- Some very nice features that make it easy to skip already-built assets, set up task dependencies (that run in parallel too!), and simple cli interactivity

My experience has been very positive, as I’ve found it very intuitive to build out basic commands as I work, rather than having to deal with more complex schemas.

Get Started

version: 3
tasks:
  default: task --list
  help: task --list
  fmt:
    desc: Apply terraform formatting
    cmds:
      - terraform fmt -recursive=true

The docs are great for this project, so I’m not going to try to educate you on how to use it, just point out some great features. First, with a quick VSCode snippet, this provides a quick way to bootstrap a new project with a common interface to run basic commands. Let’s give you a scenario… assuming you aren’t using an already-built Docker workspace: I need to initialize my two terraform directories, I also want to ensure I get a few go dependencies for a project, and finally I want to validate my syntax is valid among my various directories, without using pre-commit. This gets us started:

version: 3
tasks:

Next, I threw together some examples here: initialize commands for two separate directories, a fmt command to apply standardized formatting across all tf files, and finally a task that wraps up those commands with a deps: [] value, which runs the init commands in parallel and, once that is finished, runs fmt to ensure consistent formatting.

version: '3'

env:
  TF_IN_AUTOMATION: 1

tasks:
  init-workspace-foo:
    dir: terraform/foo
    cmds:
      - terraform init
  init-workspace-bar:
    dir: terraform/bar
    cmds:
      - terraform init
  fmt:
    desc: Recursively apply terraform fmt to all directories in project.
    cmds:
      - terraform fmt -recursive=true
  init:
    desc: Initialize the terraform workspaces in each directory in parallel.
    deps: [init-workspace-foo, init-workspace-bar]
    cmds:
      - task: fmt

You can even add a task that gives you a structured git interaction, without relying on git aliases:

  sync:
    desc: In GitHub flow, I should be getting latest from main and rebasing on it so I don't fall behind.
    cmds:
      - git town sync

Why not just run manually

I’ve seen many folks comment online: why even bother? Can’t the dev just run the commands in the directory when working through it and be done with it? I believe tasks like this should be thrown into a task runner from the start. Yes, it’s very easy to just type terraform fmt, go fmt, or other simple commands… if you are the builder of that project. However:

- Running manually increases the cognitive load for tedious tasks that no one should have to remember as the project grows.
- It makes your project more accessible to new contributors/teammates.
- It allows you to move to automation simply by wrapping up some of these actions in GitHub Actions or equivalent, having your chosen CICD tooling run the same task you can run locally. Minimal effort to move it to automation from that point!

I think wrapping things up with a good task runner considers the person behind you, and prioritizes thinking of others in the course of development. It’s an act of consideration.

Choose the Right Tooling

Here’s how I’d look at the choices:

- Run as much in Docker as you can.
- For simple actions driven easily on the cli, such as build, formatting, and validation, start with Task from the beginning and make your project more accessible.
- If requirements grow more complex, with interactions with AWS, custom builds for Lambda, or other interactions that can’t easily be wrapped up in a few lines of shell scripting, use InvokeBuild or equivalent. This gives you access to the power of .NET and the large module collection provided.

Even if you don’t really need it, think of the folks maintaining the project or enabling others to succeed with contributions more easily, and perhaps you’ll find some positive wins there. #development #cool-tools #golang #automation The post Improving Local Development Workflow With Go Task appeared first on SQLServerCentral.

External tables vs T-SQL views on files in a data lake from Blog Posts - SQLServerCentral

Anonymous
03 Nov 2020
4 min read
A question that I have been hearing recently from customers using Azure Synapse Analytics (the public preview version) is: what is the difference between using an external table versus a T-SQL view on a file in a data lake? Note that a T-SQL view and an external table pointing to a file in a data lake can be created in both a SQL Provisioned pool as well as a SQL On-demand pool. Here are the differences that I have found:

- Overall summary: views are generally faster and have more features such as OPENROWSET.
- Virtual functions (filepath and filename) are not supported with external tables, which means users cannot do partition elimination based on FILEPATH or complex wildcard expressions via OPENROWSET (which can be done with views).
- External tables can be shared with other computes, since their metadata can be mapped to and from Spark and other compute experiences, while views are SQL queries and thus can only be used by a SQL On-demand or SQL Provisioned pool.
- External tables can use indexes to improve performance, while views would require indexed views for that.
- SQL On-demand automatically creates statistics both for external tables and for views using OPENROWSET. You can also explicitly create/update statistics on files via OPENROWSET. Note that automatic creation of statistics is turned on for Parquet files; for CSV files, you need to create statistics manually until automatic creation of CSV file statistics is supported.
- Views give you more flexibility in the data layout (external tables expect the OSS Hive partitioning layout, for example), and allow more query expressions to be added.
- External tables require an explicitly defined schema, while views can use OPENROWSET to provide automatic schema inference, allowing for more flexibility (but note that an explicitly defined schema can provide faster performance).
- If you reference the same external table in your query twice, the query optimizer will know that you are referencing the same object twice, while two identical OPENROWSETs will not be recognized as the same object. For this reason, in such cases better execution plans could be generated when using external tables instead of views using OPENROWSETs.
- Row-level security (Polybase external tables for Azure Synapse only) and Dynamic Data Masking will work on external tables. Row-level security is not supported with views using OPENROWSET.
- You can use both external tables and views to write data to the data lake via CETAS (this is the only way either option can write data to the data lake).
- If using SQL On-demand, make sure to read Best practices for SQL on-demand (preview) in Azure Synapse Analytics.

I often get asked what the difference in performance is when querying using an external table or view against a file in ADLS Gen2 versus querying a highly compressed table in a SQL Provisioned pool (i.e., a managed table). It’s hard to quantify without understanding more about each customer’s scenario, but you will roughly see a 5X performance difference between queries over external tables and views versus managed tables (obviously, depending on the query, that will vary; it could be more than 5X in some scenarios). A few things contribute to that: in-memory caching, SSD-based caches, result-set caching, and the ability to design and align data and tables when they are stored as managed tables. You can also create materialized views for managed tables, which typically bring lots of performance improvements as well. If you are querying Parquet data, that is a columnstore file format with compression, so that would give you similar data/column elimination as a managed SQL clustered columnstore index (CCI) would, but if you are querying non-Parquet files you do not get this functionality. Note that for managed tables, on top of performance, you also get a granular security model, workload management capabilities, and so on (see Data Lakehouse & Synapse). The post External tables vs T-SQL views on files in a data lake first appeared on James Serra's Blog. The post External tables vs T-SQL views on files in a data lake appeared first on SQLServerCentral.
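To make the comparison concrete, here is a minimal sketch of the two approaches in Synapse SQL On-demand; the storage URL, data source, and file format names are hypothetical:

-- A view over OPENROWSET: no explicit schema needed (inferred from Parquet),
-- and virtual functions like filepath() remain available for partition
-- elimination. The storage URL below is hypothetical.
CREATE VIEW dbo.SalesView AS
SELECT *
FROM OPENROWSET(
    BULK 'https://mydatalake.dfs.core.windows.net/files/sales/*.parquet',
    FORMAT = 'PARQUET'
) AS rows;

-- An external table over the same files: an explicit schema is required, and
-- the metadata can be shared with Spark. MyDataLake and ParquetFileFormat are
-- assumed to exist via CREATE EXTERNAL DATA SOURCE / CREATE EXTERNAL FILE FORMAT.
CREATE EXTERNAL TABLE dbo.SalesExternal (
    SaleId INT,
    SaleDate DATE,
    Amount DECIMAL(10, 2)
)
WITH (
    LOCATION = 'sales/',
    DATA_SOURCE = MyDataLake,
    FILE_FORMAT = ParquetFileFormat
);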