NO COLUMN NAME

My learnings and thoughts on SQL Server and PowerShell


ONLY ONE JOIN-PATH IS NEEDED

Time to read: ~ 3 minutes

Words: 571



Update: Learning from my mistakes aka Failing Up

Update: Reliably informed that `-AdditionalChildPath` was added after 5.1


JOIN ME FOR A MOMENT

There’s a multitude of scripts out in the wild with a chain of Join-Path
commands. Initially, when I wanted to create a path safely, a Join-Path cmdlets
chain was also my go-to. However, after reading up on the documentation, I
realised another way: I only need a singular instance of the Join-Path command.


TARGET LOCATION

My PowerShell console is open in my home folder, and I’ve a test file:
/home/soneill/PowerShell/pester-5-groupings/00-run-tests.ps1.

If I wanted to create a variable that goes to the location of that file, one of
the safe ways of doing that is to use Join-Path.


LONG FORM

I mean, I could create the variable myself by concatenating strings, but then
I'd have to take the path separator into account depending on whether I'm on
Windows or not.

Apparently not…

$var = ".\PowerShell\pester-5-groupings\00-run-tests.ps1"
 
[PSCustomObject] @{
  Type      = 'Long Form'
  Separator = 'Manual entry: \'
  Variable  = $var
  Path      = try {Get-ChildItem -Path $var -ErrorAction Stop} catch {'Error!'}
}

Forward becomes back

I thought this wouldn’t work but, when running the code samples, it appears that
PowerShell doesn’t mind me using a forward-slash (/) or a back-slash (\); it’ll
take care of the proper separator for me.

UPDATE: This way works fine from a file but run the script from a PowerShell
terminal and it’s a no-go.

No, you’re not the one for me

UPDATED UPDATE: Thanks to Cory Knox (twitter) and Steven Judd (twitter) for
pointing out that this fails because ls resolves to /bin/ls instead of being an
alias for Get-ChildItem:
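
A quick way to check what ls actually resolves to on a given platform (a small aside of mine, not part of the original post):

# On Linux, ls resolves to the native binary rather than a PowerShell alias,
# so a backslash-separated path string means nothing to it.
Get-Command -Name ls | Select-Object -Property Name, CommandType, Source

On Windows, the same check shows ls as an alias for Get-ChildItem.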




MANUAL CREATION

A more explicit, cross-platform method would be to use the
[IO.Path]::DirectorySeparatorChar.

$sep = [IO.Path]::DirectorySeparatorChar
$var = ".${sep}PowerShell${sep}pester-5-groupings${sep}00-run-tests.ps1"
 
[PSCustomObject] @{
  Type      = 'Manual Creation'
  Separator = "[IO.Path]::DirectorySepartorChar: $sep"
  Variable  = $var
  Path      = try {Get-ChildItem -Path $var -ErrorAction Stop} catch {'Error!'}
}

The long way around

This method works fine, but creating the path can get very long if I don't use a
variable. Even using a variable, I have to wrap the variable name in curly braces
because of the string expansion method I used. That's not something that I would
expect someone picking up PowerShell for the first time to know.
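
To illustrate why the braces matter (a small aside, not from the original post), without them PowerShell reads everything up to the next special character as the variable name:

$sep = [IO.Path]::DirectorySeparatorChar

# Without braces, PowerShell looks for a variable called $sepPowerShell,
# which doesn't exist, so that segment expands to an empty string.
".$sepPowerShell"

# With braces, the variable name stops at the closing brace.
".${sep}PowerShell"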


-F STRINGS

In case you’re wondering, another string expansion method here would be to use
-f strings.

$sep = [IO.Path]::DirectorySeparatorChar
$varf = '.{0}PowerShell{0}pester-5-groupings{0}00-run-tests.ps1' -f $sep
 
[PSCustomObject] @{
  Type = 'F String'
  Separator = "[IO.Path]::DirectorySepartorChar: $sep"
  Variable  = $varf
  Path      = try {Get-ChildItem -Path $varf -ErrorAction Stop} catch {'Error!'}
}

It’s hard to google for the F word


MANY JOIN-PATH COMMANDS

Better yet would be if I didn’t have to account for the separator at all. Here’s
where the multiple Join-Path cmdlets come into play.

$var2 = Join-Path -Path . -ChildPath PowerShell | Join-Path -ChildPath pester-5-groupings | Join-Path -ChildPath 00-run-tests.ps1
  
[PSCustomObject] @{
  Type      = 'Many join paths'
  Separator = 'Taken care of: Join-Path'
  Variable  = $var2
  Path      = try {Get-ChildItem -Path $var2 -ErrorAction Stop} catch {'Error!'}
}

One, Two, Many, Lots

Multiple Join-Path commands work fine. There's no real issue with using this
approach, but there is another way!


ONLY ONE JOIN-PATH NEEDED

Join-Path has a parameter called -AdditionalChildPath that takes the remaining
arguments from the command line and uses them in much the same way as a
Join-Path command chain would.

$var3 = Join-Path -Path . -ChildPath PowerShell -AdditionalChildPath 'pester-5-groupings', '00-run-tests.ps1'
 
[PSCustomObject] @{
  Type = 'AdditionalChildPaths'
  Separator = 'Taken care of: Join-Path'
  Variable = $var3
  Path = try {Get-ChildItem -Path $var3 -ErrorAction Stop} catch {'Error!'}
}

One join to rule them all…
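
Since -AdditionalChildPath accepts the remaining arguments, I believe the same call can also be written positionally on versions that have the parameter; a quick sketch:

# Path and ChildPath bind positionally; the rest is swept up by -AdditionalChildPath.
$var3 = Join-Path . PowerShell pester-5-groupings 00-run-tests.ps1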


MORE OUTPUT THAN PUT OUT

So there you go—more than one way to join a path. Use whichever ones work for
you. It’s good to know your options, though.

Author: Shane O'Neill | Posted on January 25, 2022 / January 28, 2022 | Categories: Powershell | Tags: DirectorySeparatorChar, f-string, Join-Path, Powershell | Leave a comment on Only One Join-Path Is Needed


TABLE COLUMN DIFFERENCES PART 03 – COMPARE-SQLTABLECOLUMNS

Words: 470

Time to read: ~ 2 minutes


DON’T TALK TO ME ABOUT IT!

Four years ago (I know, where did the time go?), I wrote about Table Column
Differences with T-SQL and PowerShell.

A reader, Michal, commented on the post, asking how to get a specific output from
his search.

> Hi,
> 
> Thanks for your sharing. What if I also want to compare case sensitively
> columns and the order of them (syncwindows). How can I presented it on
> powershell.
> 
> I mean that in the final table I want to show also something like: column_a,
> column_A –> case sensitive
> 
> AND
> 
> column_a, column_a –> different order in the table
> 
> Thanks in advance
> 
> Michal

I confess that I never got around to answering Michal until a few weeks ago, when
I found myself with some rare free time.

Since then, I've written a script, slapped it into a function, and thrown it up
on Github.

Here's hoping that it does what you want this time, Michal. Thanks for waiting.


SHALL I COMPARE THEE TO ANOTHER TABLE?

The first thing that we need to do is have a couple of SQL tables to compare.

So, I threw up a Docker container and created a couple of tables with nearly the
same layout.
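
If you want to play along, something like this should give you a disposable instance to create them in (the container name and SA password here are placeholders of mine, not from the original post):

# Spin up a throwaway SQL Server 2019 developer instance in Docker.
docker run --name sql2019 `
    -e 'ACCEPT_EULA=Y' `
    -e 'MSSQL_SA_PASSWORD=Y0urStr0ng!Passw0rd' `
    -p 1433:1433 `
    -d mcr.microsoft.com/mssql/server:2019-latest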

(Get-DbaDbTable -SqlInstance localhost -Table 'dbo.DifferenceTable01', 'dbo.DifferenceTable02').Columns |
        Select-Object -Property Parent, Name, ID, DataType |
        Format-Table -GroupBy Parent

I’m liking the new PowerShell formatting

You can see that there are three differences here:

 1. Column orders, e.g. col9 has id 6 in dbo.DifferenceTable01 but id 5 in
    dbo.DifferenceTable02.
 2. Column case sensitivity, e.g. col7 does not match COL7.
 3. Column presence, e.g. col3 doesn’t exist in dbo.DifferenceTable01 at all.

While Compare-Object has the -CaseSensitive switch, I don't think that it would
be helpful in all these cases. Or maybe I just didn't want to use that command
this time around.

So, I wrote a function to get the output we wanted, and yes, I now include
myself among that list of people wishing for that output.

I’m allowed to be biased towards the things that I write 🙂


COMPARE-SQLTABLECOLUMNS

Compare-SqlTableColumns -SqlInstance localhost -Table1 'dbo.DifferenceTable01' -Table2 'dbo.DifferenceTable02' |
        Format-Table



I’ve tried to include everything you could want in the function output, i.e.
column names, column ids, and statuses.

Something I’ve started to do lately is wrapping a [Diagnostics.StopWatch] in my
verbose statement to see where potential slow parts of the function are.
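
As a rough sketch of that pattern (my own illustration, not the function's actual internals):

function Test-StopwatchVerbose {
    [CmdletBinding()]
    param ()

    # Start timing as soon as the function begins.
    $StopWatch = [System.Diagnostics.Stopwatch]::StartNew()
    Write-Verbose -Message "[$($StopWatch.Elapsed)] Starting work..."

    Start-Sleep -Milliseconds 200   # stand-in for the real work

    Write-Verbose -Message "[$($StopWatch.Elapsed)] Finished"
    $StopWatch.Stop()
}

Test-StopwatchVerbose -Verbose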

I’d like to think that 0.2 seconds for this example aren’t too bad.

$x = Compare-SqlTableColumns -SqlInstance localhost -Table1 'dbo.DifferenceTable01' -Table2 'dbo.DifferenceTable02' -Verbose
 
$x | Format-Table




THOU HAST LESS COLUMNS THAN THINE BROTHER…

Feel free to use and abuse this function to your heart's content. I know that
there are a few things that I'd add to it, comparing across different instances
being an obvious one that I'd like to put in.

Hopefully though, someone out there will find it helpful.

Here’s looking at you, Michal.

Author: Shane O'Neill | Posted on November 15, 2021 | Categories: dbatools, Powershell, sql-server | Tags: compare, Compare-SqlTableColumns, Powershell, Script, sql-server | Leave a comment on Table Column Differences Part 03 – Compare-SqlTableColumns


T-SQL TUESDAY #143 – SHORT CODE EXAMPLES

Time to read: ~ 2 minutes

Words: 328



Welcome to T-SQL Tuesday, the monthly blog post invitational where we’re given a
topic and asked to write about it.

This month we have John McCormack (Blog | Twitter) asking, “What are your go-to
handy scripts?”

For this post, I’m going to break these down into different languages.


SQL

I once had the annoyingly complex T-SQL to change MS format time into a
human-readable format memorised.

SELECT
    time_MS_format = [TimeMSFormat],
    converted_time = '2021-10-12 ' +
    STUFF(
        STUFF(
            RIGHT('000000' + X.TimeMSFormat, 6), 3, 0, ':'
        ), 6, 0, ':'
    )
FROM (VALUES
     ('00000')
   , ('00500')
   , ('01000')
   , ('10000')
   , ('10500')
   , ('100000')
   , ('100500')
   , ('115500')
   , ('120000')
) X ([TimeMSFormat]);



Then I read a blog post from Kenneth Fisher (Blog | Twitter) about the in-built
msdb database function dbo.agent_datetime.

SELECT
    time_MS_format = [TimeMSFormat],
    new_function = msdb.dbo.agent_datetime(20211012, X.TimeMSFormat)
FROM (VALUES
     ('00000')
   , ('00500')
   , ('01000')
   , ('10000')
   , ('10500')
   , ('100000')
   , ('100500')
   , ('115500')
   , ('120000')
) X ([TimeMSFormat]);



If I run sp_helptext on that function, it reminds me of that Andy Mallon (Blog |
Twitter) post.

 * Stop using sp_hexadecimal & sp_help_revlogin

It would be more performant if I stripped out that function and used the code
directly, but the function is too handy for the infrequent times I need it.


POWERSHELL

I’ve talked before about using ConvertTo-SqlSelect in a blog post before and I
still use that function alot!

> ConvertTo-SQLSelect

Another short piece of code that I use is more for formatting than anything
else. You can populate a variable with an array of property names.
Select-Object can use this variable to return information.

$Properties = 'SqlInstance', @{Name = 'DatabaseName'; Expression = {$_.Name}}, 'Status', 'RecoveryModel', 'Owner'
 
Get-DbaDatabase -SqlInstance localhost -SqlCredential $Cred | Select $Properties



A useful snippet for reporting is to use a combination of Sort-Object and the
Format-* commands with the -GroupBy parameter.

Get-DbaDatabase -SqlInstance localhost -SqlCredential $Cred |
    Select $Properties |
    Sort-Object RecoveryModel |
    Format-Table -GroupBy RecoveryModel




SIN É

When I sat down to write this post, I realised that I don't have a lot of handy
scripts. Either I re-write things constantly (that's likely), or I don't know
enough yet (also likely). I should fix that.

Author: Shane O'Neill | Posted on October 12, 2021 | Categories: Powershell, sql-server, T-SQL Tuesday | Tags: agent_datetime, ConvertTo-SQLSelect, Format-Table, Sort-Object | 1 Comment on T-SQL Tuesday #143 – Short code examples


THE SURPRISING WORKING OF TRIMEND

Time to read: ~ 2 minutes

Words: 397

A couple of days ago, I was running some unit tests across a piece of PowerShell
code for work, and a test was failing where I didn't expect it to.

After realising that the issue was the difference between how I thought TrimEnd
works and how it actually works, I wondered if it was just me being a bit stupid.

So I put a poll up on Twitter, and I’m not alone! 60% of the people answering
the poll had the wrong idea as well.



Let’s have some code to show what we mean.

'Shanes_sqlserver'


INCORRECT IDEAS

The vast majority of code that I have seen out in the wild has strings as the
inner portion of TrimEnd

'Shanes_sqlserver'.TrimEnd('sqlserver')


The code works how I thought that it would, removing the “sqlserver” portion of
the string at the end. Now, let’s try it again and remove the underscore as
well.

'Shanes_sqlserver'.TrimEnd('_sqlserver')



See! Where have my “s” and “e” gone?!

Let’s look at the overload definitions for TrimEnd by running the code without
the brackets after the method.

'Shanes_sqlserver'.TrimEnd


No overload definition takes a string; they either take a char or an array of
chars. Is that what’s happening here?

# Takes an array of chars
'Shanes_sqlserver'.TrimEnd('_', 's', 'q', 'l', 'e', 'r', 'v')
 
# Turns a string into an array of chars
'Shanes_sqlserver'.TrimEnd('_sqlerv')
 
# Order doesn't matter either
'Shanes_sqlserver'.TrimEnd('vrelqs_')


A NEW WAY OF THINKING

So TrimEnd takes the characters that we provide inside the method and removes
them from the end until it reaches the first non-matching character.

This explains why our first example, with TrimEnd('sqlserver'), removes
everything up to the underscore.

'Shanes_sqlserver'.TrimEnd('sqlserver')
# -----^ First non-matching character (_)




However, when we include the underscore, the first non-matching character
shuffles back.

'Shanes_sqlserver'.TrimEnd('_sqlserver')
# --^ First non-matching character (n)


INITIAL PROBLEM

Now that we have a new understanding of how TrimEnd works, how can we remove the
“_sqlserver” part of the string?

Split it in two.

'Shanes_sqlserver'.TrimEnd('sqlserver').TrimEnd('_')
# -----^  First non-matching character (_)
# ----^  First non-matching character after first TrimEnd (s)

This rewrite works for us since we have a defined character that acts as a
stop-gap. If that stop-gap isn’t possible, then -replace may be our best option.

'Shanes_sqlserver' -replace '_sqlserver'

Always good to get a better understanding of PowerShell. If my tests catch more
of these misunderstandings that I can learn from, then I’m OK with that!

Author: Shane O'Neill | Posted on October 6, 2021 | Categories: General, Powershell | Tags: shanes_sqlserver, TrimEnd | 2 Comments on The Surprising Working of TrimEnd


T-SQL TUESDAY #140: WHAT HAVE YOU BEEN UP TO WITH CONTAINERS?

Time to read: ~ 2 minutes

Words: 335




KUBERNETES

So this is a post that will not be educational, but it’s the latest encounter
that I’ve had with containers, so it’s the most present in my mind.
Hey, hopefully, it brings a laugh to some people.

I’ve been looking into Kubernetes. I’ve not gotten very far with it, but I
managed to set up a replica in Ubuntu WSL2 on my laptop.


Everything was all well and good apart from being unable to connect to the
database from Azure Data Studio but again, all good.

Fast forward a couple of days to when I was trying to share my screen: my laptop
started getting very slow, the fans started getting very loud, and the
performance just tanked.

Taking a look at the ol’ Task Manager, I saw a “vmmem” process taking a massive
amount of memory. A quick google search led to the culprit being virtual
machines.

Here started what I can only describe as a Benny Hill sketch, where I tried to
remove the pods only to have Kubernetes create them again!

Remove the pods – check for pods – the same number created a few seconds ago!
Argh!!!


CONTAINERS

Eventually, I dropped the pods and managed to get my laptop under control.
Still wanting to have a SQL instance to work with, I managed to spin up a Docker
container and have a developer instance of SQL 2019 up and running on my laptop.

Thankfully I know enough about containers to stop the instance when I don’t need
it and only start it up again when I do.
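
Stopping and starting it is a one-liner each way (sql2019 being whatever name you gave the container; mine is a placeholder):

# Free up the memory when the instance isn't needed...
docker stop sql2019

# ...and bring it back when it is.
docker start sql2019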

It’s strange to think that the day has arrived where I resort back to my
knowledge of containers as the familiar option!
There’s a good thing in there somewhere, maybe put a backstop into my learnings?
Just enough to know how to stop if the situation goes wrong or go too far.

I still intend to continue researching Kubernetes, but maybe I’ll deepen my
knowledge on Containers in the meantime.



Author: Shane O'Neill | Posted on July 13, 2021 | Categories: General, T-SQL Tuesday | Tags: containers, kubernetes, wsl | 2 Comments on T-SQL Tuesday #140: What have you been up to with containers?


CONVERTTO-SQLSELECT

Words: 651

Time to read: ~ 3 minutes

Update 2021-06-17: It now accepts pipeline input

It’s been a busy month for me so there’s not a lot of outside work research that
has been going on.

That being said, there has been quite a gap since I wrote a blog post so I
figured that I might as well write something

So what do I have to write about?


SELECT STATEMENTS

There are times when I want to mess about with data in SQL Server, data that I
have obtained in PowerShell. This will require some way to get the information
from PowerShell into SQL Server.

I know of a few ways to do this.


DBATOOLS

There is the dbatools module and the Write-DbaDbTableData function.

Get-Help -Name Write-DbaDbTableData -Full



If I wanted to write the properties of 50 modules from PSGallery into SQL
Server, I can use the function handily enough.

Find-Module | Select-Object -First 50 | Write-DbaDbTableData -SqlInstance localhost -Database WAT -Table dbatools_Insert -WhatIf


IMPORTEXCEL

There is also the ImportExcel module and the ConvertFrom-ExcelToSQLInsert
function.

Get-Help -Name ConvertFrom-ExcelToSQLInsert -Full



Find-Module | Select-Object -First 50 | Export-Excel -Path .\Documents\Excel\temp_20210614.xlsx;
ConvertFrom-ExcelToSQLInsert -TableName ImportExcel_Insert -Path .\Documents\Excel\temp_20210614.xlsx -UseMsSqlSyntax




BEING PICKY

Both of these were a bit too much for me though. I only wanted a quick and easy
way to have the data available in a SELECT statement.

I can use ImportExcel and ConvertFrom-ExcelToSQLInsert but that is dependent on
the table already existing, never mind having to save the data in an Excel file
first.

Don’t get me wrong – I’m aware that you don’t need Excel installed on the
computer where you’re running these commands from. You still need to save the
files somewhere though. The function doesn’t take data from variables.

I can use dbatools and Write-DbaDbTableData. This function is not dependent on
the table already existing. It will create the table for you if you tell
it to. Thank you, -AutoCreateTable; even though I recommend pre-sizing your
columns if you want to go with this method.

However, I don’t want to have to create the table beforehand.


CONVERTTO-SQLSELECT

So I wrote a primitive function to have the data available in a SELECT statement
that I can run in an SSMS or Azure Data Studio window.

You can find the code for it here on Github:
ConvertTo-SQLSelect

I can pass a bunch of objects into it and it will create the SELECT for me using
the good ol’ VALUES clause.

Although I’m pretty sure this is basically what ORMs do under the cover before
people who knew what they were doing looked at them…

ConvertTo-SQLSelect -Data (Find-Module | Select-Object -First 50)

… there’s more data here….



CAVEATS

There are a couple of caveats to be aware of…

 * It doesn’t allow pipeline input.

It probably could, but that would require a sit-down and a think about how to do
it. Like I said, this was a quick and dirty put-together function.

It now accepts pipeline input – although I'm sure it isn't the best way I could
have implemented that…

-999..1000 | ForEach-Object -Process { (Get-Date).AddDays($_) } | ConvertTo-SQLSelect

 * There are no data types.

There are strings, and they get inserted as strings, but that's okay for me for a
quick playthrough. Any data conversions I can do once I have the data in an
SSMS window.

 * It doesn’t like single quotes

Yeah, I have no real excuse for this one. I should really fix that before I use
this function again…

It can handle single quotes now


 * There are also no help comments for this.

There should be, even though there is only one parameter. There should also be
tests! I am filled with good intentions that are yet to see fruition though…

That being said, I’ve had to use it a few times already that has meant that
writing it has already paid off.

So feel free to use, abuse, and/or improve it as you see fit.

I hope you find it useful.

Author: Shane O'Neill | Posted on June 16, 2021 / June 18, 2021 | Categories: dbatools, Powershell, sql-server | Tags: ConvertFrom-ExcelToSQLInsert, ConvertTo-SQLSelect, dbatools, ImportExcel, SELECT, Write-DbaDbTableData | 2 Comments on ConvertTo-SQLSelect


PESTER 5 AND GROUP-OBJECT – BEST FRIENDS

Figuring out how to group the output of your Pester tests

Words: 830



Time to read: ~ 4 minutes



I’ve been working with Pester v5 lately.

Pester v5 with PowerShell v5 at work & Pester v5 with PowerShell Core outside of
work.

There are quite a few changes from Pester version 3, so it’s almost like
learning a new language… except it’s based on slang. I think that I’m speaking
eloquently, and then I’ve suddenly insulted someone and Pester no longer wants
to play nice with me.


INITIAL TESTS

I’ve got the following data that I’m using to test Pester v5.

BeforeDiscovery -ScriptBlock {
    $Groups = @(
        [PSCustomObject] @{
            Server = 1
            Group = 'A'
            Value = '86b7b0f9-996f-4c19-ac9a-602b8fe4d6f2' -as [guid]
        }, 
        [PSCustomObject] @{
            Server = 1
            Group = 'B'
            Value = 'e02913f7-7dae-4d33-98c9-d05db033bd08' -as [guid]
        },
        [PSCustomObject] @{
            Server = 2
            Group = 'A'
            Value = '96ad0394-8e9e-4406-b17e-e7d47f29f927' -as [guid]
        },
        [PSCustomObject] @{
            Server = 2
            Group = 'B'
            Value = 'f8efa8b6-e21b-4b9c-ae11-834e79768fee' -as [guid]
        }
    )
}

Test data

Usually, I would only use -TestCases to iterate through the data. I know that in
Pester v3, I could wrap the It blocks inside a foreach () {}, and it would be
okay. Hell, in most of my testing, it was faster. It doesn't matter; I liked
using -TestCases, and the performance difference is negligible to me.

That is still an option with Pester v5. I can run the below code to confirm.

Describe -Name 'Attempt: 01' -Tag '01' -Fixture {
    Context -Name 'Server: <_.Server>' -Fixture {
        It -Name 'should have a guid for its value: <_.Value>' -TestCases $Groups {
            $_.Value | Should -BeOfType [guid]
        }
    }
}

ForEach on the It block

If I look at the data, I can see that I've got two different values for Server:
1 and 2. It would be great if I could group the tests by those server values.

For me, Pester has three main blocks: Describe, Context, and It.
I know that Pester v5 has a -ForEach parameter for each of these blocks. I've
already tried using the -ForEach parameter against the It block, and it didn't
do what I wanted.

Reminder of the ForEach on the It block

I’ll try it against the Context block instead and see if it works.


Describe -Name 'Attempt: 02' -Tag '02' -Fixture {
    Context -Name 'Server: <_.Server>' -Foreach $Groups {
        It -Name 'should have a guid for its value: <_.Value>' -Test {
            $_.Value | Should -BeOfType [guid]
        }
    }
}

ForEach on the Context block

That kind of works but we’ve got the same server in two different groups. Let’s
move the groups up to the Describe level.

Describe -Name 'Attempt: 03 - Server: <_.Server>' -Tag '03' -Foreach $Groups {
    Context -Name 'Server: <_.Server>' -Fixture {
        It -Name 'should have a guid for its value: <_.Value>' -Test {
            $_.Value | Should -BeOfType [guid]
        }
    }
}

ForEach on the Describe block

We’ll that’s not what I wanted. Instead of 1 describe block, we have multiple
blocks; 1 per group.


GROUPED DATA

Now, I’m going to start using Group-Object. My data by itself doesn’t seem to
work.

$Groups = @(
    [PSCustomObject] @{
        Server = 1
        Group = 'A'
        Value = '86b7b0f9-996f-4c19-ac9a-602b8fe4d6f2' -as [guid]
    }, 
    [PSCustomObject] @{
        Server = 1
        Group = 'B'
        Value = 'e02913f7-7dae-4d33-98c9-d05db033bd08' -as [guid]
    },
    [PSCustomObject] @{
        Server = 2
        Group = 'A'
        Value = '96ad0394-8e9e-4406-b17e-e7d47f29f927' -as [guid]
    },
    [PSCustomObject] @{
        Server = 2
        Group = 'B'
        Value = 'f8efa8b6-e21b-4b9c-ae11-834e79768fee' -as [guid]
    }
)

Results of the Test Data

We can pass that data into Group-Object to group our data by a certain property.
In my case, I want to group the data by the Server property.

$Groups | Group-Object -Property Server

Grouped Test Data

Taking a look at the first group, I only have the data for that single property
value.

($Groups | Group-Object -Property Server)[0].Group

Inside the first group of Test Data

Now, I’ll try the Pester code again.


GROUPED TESTS

First, I’ll try putting the groups into the It blocks and see if that works.

Describe -Name 'Attempt: 05' -Tag '05' -Fixture {
    BeforeDiscovery -ScriptBlock {
        $GroupedGroups = $Groups | Group-Object -Property Server
    }

    Context -Name 'Server: <_.Name>' -Fixture {
        It -Name 'should have a guid for its value: <_.Group.Value>' -ForEach $GroupedGroups {
            $_.Group.Value | Should -BeOfType [guid]
        }
    }
}

Grouped on the It block

It doesn’t fully work. The data is grouped but the results seems to be
concatenating the values. I’d like it better if they were split out to separate
tests per value.

This time, I’ll group the data in the context blocks and then pass the groups
into the It blocks. I’ll do this by passing the groups into the -ForEach
parameter of the It block using $_.Group.

Describe -Name 'Attempt: 04' -Tag '04' -Fixture {
    BeforeDiscovery -ScriptBlock {
        $GroupedGroups = $Groups | Group-Object -Property Server
    }

    Context -Name 'Server: <_.Name>' -ForEach $GroupedGroups {
        It -Name 'should have a guid for its value: <_.Value>' -TestCases $_.Group {
            $_.Value | Should -BeOfType [guid]
        }
    }
}

Grouped on Context and passed to It block

In the previous code blocks, I used the BeforeDiscovery block in the Describe
block. If you don’t want to use that, you can pass the Group-Object cmdlet to
the ForEach parameter as a subexpression.

Describe -Name 'Attempt: 06 - Server: <_.Name>' -Tag '06' -ForEach ($Groups | Group-Object -Property Server) {
    Context -Name 'Server: <_.Name>' -Fixture {
        It -Name 'should have a guid for its value: <_.Value>' -TestCases $_.Group {
            $_.Value | Should -BeOfType [guid]
        }
    }
}

Without using BeforeDiscovery on the Describe block


PASS OR FAIL

I’ve encountered this obstacle of grouping objects in tests a couple of times.
I’m hoping that by writing this down, I’ll be able to commit the information to
memory.

Hey, if it doesn’t, I can always grab the code and figure it out.

Author: Shane O'Neill | Posted on May 17, 2021 / May 16, 2021 | Categories: Pester, Powershell | Tags: Group-Object, Pester, Pester v3, Pester v5 | 2 Comments on Pester 5 and Group-Object – Best Friends


T-SQL TUESDAY #135: THE OUTSTANDING TOOLS OF THE TRADE THAT MAKE YOUR JOB
AWESOME

Welcome to T-SQL Tuesday, the brainchild of Adam Machanic ( twitter ) and ward
of Steve Jones ( blog | twitter ).
T-SQL Tuesday is a monthly blogging party where a topic gets assigned and all
wishing to enter write about the subject.
This month we have Mikey Bronowski ( blog | twitter ) asking us about the most
helpful and useful tools we know of or use.

Tools of the trade are a topic that I enjoy. I have a (sadly unmaintained) list
of scripts from various community members on my blog. That list is not what I'm
going to talk about though. I'm going to talk about what to do with these, or
any, scripts.

I want to talk about you as a person and as a community member. Why? Because you
are a master of your craft, and a master of their craft takes care of their
tools.


STORE THEM

If you are using scripts, community-made or self-made, then you should store
them properly. By properly, I'm talking source control. Have your tools in a
centralised place where those who need them can access them. Have your scripts in
a centralised place where everyone gets the same changes applied to them, and
where you can roll back unwanted changes.

Check out Brett Miller’s ( blog | twitter ) presentation “GitOps – Git for Ops
people“.


VERSION THEM

If you are using community scripts, then more likely than not, they are
versioned. That way you’re able to see when you need to update to the newest
version. No matter what language you’re using, you can add a version to them.

PowerShell has a ModuleVersion number, Python has __version__, and SQL has
extended properties.
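
As an example of the first one, the version lives in the module manifest and is easy to read back (a generic sketch of mine; MyModule is a placeholder):

# MyModule.psd1 - the manifest carries the module's version.
@{
    RootModule    = 'MyModule.psm1'
    ModuleVersion = '1.2.0'
}

# Check the version of an installed module.
(Get-Module -Name dbatools -ListAvailable).Version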

Or even take a page out of Bret Wagner’s ( blog | twitter ) book and try XML
comments.


TAKE CARE OF THEM

If you take care of these tools, if you store them, version them, and make them
accessible to those who need them, then they will pay you back a hundredfold.
You’ll no longer need to re-write the wheel or pay the time penalty for
composing them. The tools will be easy to share and self-documented for any new
hires.
Like the adage says: Take care of your tools and your tools will take care of
you.



Author: Shane O'Neill | Posted on February 9, 2021 / February 10, 2021 | Categories: T-SQL Tuesday | Tags: Powershell, sql, tools | 1 Comment on T-SQL Tuesday #135: The outstanding tools of the trade that make your job awesome


START-INCREASINGPUSHBACK

Words: 498

Time to read: ~3 minutes

Table of Contents

 1. Intro
 2. Try Again
 3. Smarter
 4. Challenge 01
 5. Challenge 02
 6. Scripting
 7. Using the Script


INTRO

If you look back over some of the posts that I wrote in October this year, you
may have realised that there was a motif going on.

I used a homebrew pushup tracker as a data source for a couple of blog posts. A
group of friends and I were attempting to “push out” (excuse the pun) 3,000
pushups over the month.

Spoilers: We didn’t reach the target. 


TRY AGAIN

I’m okay with failure. If you learn from your failures, then I don’t even
consider them as failures. This scenario didn’t fall into this case, though. The
only reasons that I could think that I didn’t reach the target are:

 1. I started after nearly a week into the month had passed, and
 2. I tried to do too much, too fast, in as little rounds as possible per day.

So, with these lessons under my belt, I decided to try again.


SMARTER

I figured that it was simple enough to fix my first mistake: I'd start on the
first day of the month this time.

The second mistake was something that I figured would also be simple. Rather
than attempting to do as many as I could in as few rounds as possible, I'd do
ten sets a day and that was it. If I focused more on the process than the goal, I
figured that it would get me over the line eventually.


CHALLENGE 01

If I do a set every half hour, I’d have the ten completed in 5 hours. I mean,
hey, we’re in lockdown. I have 5 hours to spare.

But I didn’t.

Work, meetings, calls, focus and flow all sapped the time away from me.

So I tried again.

I’ve started getting up early in the mornings do to research and blog posts
(like this one for example), so I’d try and get them done then.

Ten sets every 5 minutes should have me completed in just under an hour; more
than enough time to spare.


CHALLENGE 02

Pushups are hard! Even when I’m not trying to rep out as many as I can, they
still take a toll on the body. Soon a five-minute break is not enough, and I’m
taking longer and longer rests.

Fine, if that’s the way we’re going to do this, then I’m going to go with the
flow.


SCRIPTING

Seeing as I needed a little extra rest each round, I decided to create a
PowerShell script that would help calculate that rest for me.

https://github.com/shaneis/RandomScripts/blob/master/Start-IncreasingBackoff.ps1
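
The gist of it is a loop where the rest grows each round; a simplified sketch of the idea (the linked script is the real, parameterised version):

# Simplified sketch: ten rounds, with the rest growing by 30 seconds each time.
$RestSeconds = 60

foreach ($Round in 1..10) {
    Write-Host "Round $Round - go!"
    Start-Sleep -Seconds 30           # time allowed for the set itself

    Write-Host "Rest for $RestSeconds seconds..."
    Start-Sleep -Seconds $RestSeconds

    # Each round earns a little more recovery time.
    $RestSeconds += 30
}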


USING THE SCRIPT

For once, I’ve added comment based help to my script so I can run

Get-Help -Name Start-IncreasingBackoff -Examples

and get examples of what the script does!



Now, I can run this script and get a timer that will let me know when to start
my activities and that will give me more and more rest each time!

Now to see what else I can use this for!

Author: Shane O'Neill | Posted on January 1, 2021 / December 31, 2020 | Categories: General, Powershell | Tags: pushups, start-increasingpushback, timespan | 1 Comment on Start-IncreasingPushback


UNCOMMON SQL

Words: 612

Time to read: ~ 3 minutes

Table of Contents

 1. Intro
 2. ODBC Date Functions
 3. INSERT Alias
 4. Default Option
 5. = DEFAULT
 6. What about with no default?
 7. CURRENT
 8. And so forth


INTRO

Recently the DBA Team Lead and I were reviewing some SQL code, and we came
across some SQL that neither of us had frequently encountered before. This led
to a brief watercooler moment where we shared some uncommon SQL that we had
seen. Perfect blog post material, I think.


ODBC DATE FUNCTIONS

In a previous post, I talked about ODBC date functions.

I’m using AdventureWorks2014 here.

/* The 10 employees who have been the longest at the company */
SET NOCOUNT ON;
 
SELECT TOP (10) WITH TIES
        HE.JobTitle,
        HE.HireDate,
        {d '2006-06-30'} AS start_of_company,
        DATEDIFF(DAY, {d '2006-06-30'}, HE.HireDate) AS days_since_company_start
FROM    HumanResources.Employee AS HE
ORDER BY    days_since_company_start;




INSERT ALIAS

An unexpected item that we found recently was that INSERT INTO statements care
about correct column names. That’s all though, nothing else seems to faze them.
This means that you can add the most ridiculous aliases or part names to the
column and SQL Server won’t care. As far as I can tell, it will just ignore
them.

/* Prefixes get prefixed */
SET NOCOUNT ON;
 
IF OBJECT_ID(N'dbo.Hires', N'U') IS NOT NULL
BEGIN
    DROP TABLE dbo.Hires;
END;
 
CREATE TABLE dbo.Hires (
    hire_id int IDENTITY(1, 1) NOT NULL
        CONSTRAINT [PK dbo.Hires hire_id] PRIMARY KEY CLUSTERED,
    job_title varchar(50) NOT NULL,
    hire_date datetime2(7) NOT NULL,
    is_on_salary bit NULL
        CONSTRAINT [DF dbo.Hires is_on_salary] DEFAULT (0)
);
 
TRUNCATE TABLE dbo.Hires;
 
WITH OldestHires AS (
SELECT TOP (10) WITH TIES
        HE.JobTitle AS job_title,
        HE.HireDate AS hire_date,
        ROW_NUMBER() OVER (ORDER BY HE.HireDate) AS rn
FROM    HumanResources.Employee AS HE
ORDER BY    HE.HireDate
)
INSERT INTO dbo.Hires (
    [0].[1].[2].[3].[4].[5].[6].[7].[8].[9].job_title,
    a.b.c.d.e.f.g.h.i.j.k.l.m.n.o.p.q.r.s.t.u.v.w.x.y.z.hire_date,
    [1/0].[%].[OUT_OF_BOUNDS].[   ].is_on_salary
)
SELECT  OH.job_title,
        OH.hire_date,
        CASE
            WHEN OH.rn % 3 = 0 THEN NULL
            ELSE 1
        END AS is_on_salary
FROM    OldestHires AS OH;
 
SELECT  *
FROM    dbo.Hires;
GO




DEFAULT OPTION

Let’s contrive an example. Let us say that we have a table called dbo.Hires and
we’ve added a column called is_on_salary.
Since most of the hires are salaried, we have added a new default constraint
setting the value to 0.
Unfortunately, it looks like the default constraint hasn’t been applied yet…

/* Our dbo.Hires table */
SET NOCOUNT ON;
 
SELECT  *
FROM    dbo.Hires;




= DEFAULT

Recently, my DBA Team Lead pointed me to a piece of code where the syntax was:
UPDATE T SET COLUMN = DEFAULT

Now, I had never seen this before, and I wasn't quite sure that this method
would work. I wasn't wholly surprised, though, when a quick test proved that it
does.

/* UPDATE DEFAULT */
SET NOCOUNT ON;
 
UPDATE  dbo.Hires
SET     is_on_salary = DEFAULT
WHERE   is_on_salary IS NULL;
 
SELECT  *
FROM    dbo.Hires;




WHAT ABOUT WITH NO DEFAULT?

Okay, that seems to apply the default constraint's value to a column. What about
when there is no default constraint defined on the column? Will it error out then?

/* Removing our default constraint */
ALTER TABLE dbo.Hires
    DROP CONSTRAINT [DF dbo.Hires is_on_salary]
 
SELECT  'Pre update' AS [status],
        *
FROM    dbo.Hires;
 
UPDATE  dbo.Hires
SET     is_on_salary = DEFAULT
WHERE   is_on_salary = 0;
 
SELECT  'Post update' AS [status],
        *
FROM    dbo.Hires;



Nope! As mentioned in the docs – if there is no default, and the column can
become NULL, then NULL will be inserted.


CURRENT

Finally, we have CURRENT.
While the vast majority of scripts manually define the database context for
commands, such as ALTER DATABASE AdventureWorks, etc., you can tell SQL Server:
Hey! Use the current database context!

/* CURRENT Database Context */
SET NOCOUNT ON;
 
ALTER DATABASE AdventureWorks2014 SET PAGE_VERIFY NONE;
 
SELECT 'Pre change' AS [status], [name], page_verify_option_desc FROM [sys].[databases] WHERE [name] = N'AdventureWorks2014';
 
ALTER DATABASE CURRENT SET PAGE_VERIFY CHECKSUM;
 
SELECT 'Post change' AS [status], [name], page_verify_option_desc FROM [sys].[databases] WHERE [name] = N'AdventureWorks2014';




AND SO FORTH

Thre’s probably a lot more but these are the ones that we talked about. If you
have any uncommon SQL, let me know!

Author: Shane O'Neill | Posted on November 20, 2020 | Categories: General, SQL New Blogger, sql-server | 4 Comments on Uncommon SQL

