Pages

Wednesday, 13 June 2018

CReW - Who would cross the Bridge of Death must answer me these questions three


#Alteryx18 marked the beginning of a journey for me.  I've long been on this quest, but now take the lead as Adam focuses his efforts on family and Alteryx e2.  My first CReW action was the creation of new SWAG.  These die-cut stickers were gobbled up in Anaheim and I've reserved more for my friends at #Alteryx18 in Europe.

Second on my list of accomplishments was taking over the CReW Macro training (scheduled only in Anaheim for #Alteryx18).  Together with Joe Miller from Alteryx we either wowed the attendees or made sufficient pleas for good survey results (9.6/10.0).  Putting CReW macros to the "test" and opening eyes to how to create value from these gems really hit home as users (beginner to expert) were enlightened by their seeming simplicity and awesome results.  "Oooohs and Aaaahs" abounded as we delighted the students with tips and tricks.  Defensive configuration and use of CReW tools took center stage as Joe provided step-by-step instructions and I added colour to the session.

Where do we go from here?  I want to hear from you!  What ideas and needs do you have?  I've created some beta macros (they are in the macro district of the Alteryx gallery), but CReW is more than just one person.  Whether you're an Alteryx guru or a newbie, I can use your skills.  Macros need naming, icons need designing, and we also need to manage the CReW inventory.  Please let me know your thoughts and availability to join the CReW.

I'm very honored and excited to carry this torch.  Join me and let's make Adam proud.

Cheers,
Mark

Wednesday, 2 May 2018

Passing The Baton

The theme of this year’s Inspire conference is “Alter Everything”, and Alteryx has truly changed my life: it has brought me new friends and new skills, given me the chance to live on a different continent, and through all of that the opportunity to build the product that I love.

But times change, and with that said I am sad to announce that I am stepping down as the curator of the CReW macros.  If I am honest, I think that already happened a couple of years ago; it has just taken me this long to realise it and make it official.  There is no big reason, my life has just changed: I have a family who I want to spend my free time with, and the time I don't spend with them I want to get out of the house/office and exercise or see friends (and there is this little project at work that really eats up all of my thinking time right now).  So apologies to everyone who has sent me bug fixes or new macros over the last couple of years and joined my list of "things I need to get to"; I realise now I am not going to get to that list.

But in thinking about what comes next for the CReW macros, there really was only one person to whom I could pass the reigns (that pun just wrote itself): a long-term contributor to the macros, a good friend and all-round legend of Alteryx, Mr. Mark Frisch, or as some of you may know him, MarqueeCrew.  I will let Mark post a welcome here soon with his ideas for the macros; all that is left for me to do is thank Mark for taking over, and all of you for your suggestions and help with the macros over the last 4 years.

Hope to see many of you at one of the Inspire conferences this year.

Adam
(Former curator of the CReW macros)



Wednesday, 14 June 2017

Inspire Las Vegas


Last week I had the pleasure of attending my 9th Alteryx Inspire conference at the Aria hotel in Las Vegas, Nevada.  As always I enjoyed meeting all of our customers, partners and Aces and left the conference feeling exhausted but inspired.

Here are my top 10 personal highlights from the event:

1. The Solution Center - This remains my all-time favourite part of Inspire: a chance to meet with customers and help solve their challenges with Alteryx.  When I'm not booked anywhere else, this is where I spend all of my time.  I came away this year with ideas for a couple of new CReW macros, plus I got a couple of Alteryx Aces to help me out with a problem I was having on a personal Alteryx project.

2. 9 Inspires but 6 years - Explaining to people how I'd been to 9 Inspire conferences despite only being with Alteryx for 6 years (one conference as a client, an extra one for London last year, and fence-post counting)

3. Training - Very excited this year to be able to deliver a training course on the CReW macros. 
Despite a few installation issues caused by apostrophes in the install path, we got through the macros I wanted to cover and I hope that everybody learnt something.  Alongside my course there was a huge range of other training courses, which I know were very much enjoyed.



4. CReW Macros in the Keynotes - I was very pleased that the CReW macros were mentioned in both Dean's keynote and the product keynote.  It is great to see recognition for the work we have done as a community building these macros.

5. Mark Frisch named Community Top Contributor of the Year - When the award category came up on the screen I knew there was only one person who was going to win it this year.  Fantastic to see all the amazing work Mark does in the community being recognised.

6. New CReW icon - The CReW macro pack now has its own icon, created by the very talented Tara McCoy.  Thanks Tara!



7. Alteryx Periodic Table - More awesome artwork from Tara, given away at Inspire: the Alteryx tools as a periodic table!  Check it out and get your own copy over on the community!

8. Closing Keynote - I always enjoy the closing keynotes at Inspire and this year was no exception with an entertaining talk from Freakonomics author Stephen Dubner.  Some great lessons in story telling and data collecting.

9. Alteryx Card Game - I enjoy playing card games, but I'm not sure I have ever featured in one before!



10. Ned's Keynote - Great as always to see Ned talk about the product.  As an employee of a publicly listed company I am unable to repeat here what he covered.  But if you are interested, there are still tickets available for Inspire Europe in London in September!  Hope to see you there.

Thursday, 2 June 2016

CReW Macro Pack - 2016 Q2 Release

The latest release of the CReW macro pack is available for download now at


NOTE: You will require Alteryx 10.0 or later to use this macro pack.

For installation instructions see this page

WHAT'S NEW?


New Macros


Wait A Second - This month I am very excited to introduce another new contributor to the CReW macro pack: Daniel Brun, who, as well as contributing this macro, has also released his own macro pack, "The Inviso macro pack for Alteryx", available for download here.

The Wait A Second macro is a different spin on the throttle tool: it allows a user to specify a wait time between records passing through the tool.
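The macro itself is an Alteryx workflow rather than code, but its throttling behaviour can be sketched in Python; the function name and parameters below are illustrative, not part of the macro.

```python
import time

def wait_a_second(records, wait_seconds=1.0):
    """Yield records one at a time, pausing between each record --
    the same idea as the Wait A Second macro's per-record throttle."""
    for i, record in enumerate(records):
        if i > 0:
            time.sleep(wait_seconds)  # pause before every record after the first
        yield record

# Throttle three records with a 0.1 second gap between them
throttled = list(wait_a_second(["a", "b", "c"], wait_seconds=0.1))
```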


Blocking Test Tool - This macro came directly from a client request and is actually something I have seen asked for many times before.  It is very useful when you are updating an external data source and want to check some condition before you write *any* data out.  It checks all of the data against the given expression, and only if all of the records pass that expression does it pass its records downstream.  If any record fails the expression then an error message is thrown and no records are output.
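The all-or-nothing behaviour described above can be sketched in Python; the function name and error message are illustrative, assuming records arrive as dictionaries.

```python
def blocking_test(records, predicate):
    """Pass records downstream only if EVERY record satisfies the
    predicate; otherwise raise and output nothing, mirroring the
    all-or-nothing behaviour of the Blocking Test tool."""
    records = list(records)  # materialise so we can check everything first
    failures = [r for r in records if not predicate(r)]
    if failures:
        raise ValueError(f"{len(failures)} record(s) failed the test; no records output")
    return records

rows = [{"qty": 5}, {"qty": 3}]
checked = blocking_test(rows, lambda r: r["qty"] > 0)  # all pass, so rows flow through
```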


The Test Macros

These are four new macros which were born out of a series of blog posts I wrote back in 2015 about trying test-driven development for Alteryx macros.  That experiment hit some stumbling blocks which I have yet to write up, but the macros born out of it live on and are very useful for testing, in a TDD style or otherwise.

I am releasing them with a BETA flag as they are still very much a work in progress, and I know I still have some work to do on them.  However, I believe they are in good enough condition for other people to start looking at them and feeding back any issues and improvements they find.

They are under their own tool category "CReW Test" which you can hide for now if you want to wait for them to evolve some more.

They are:
Test Runner - This is the macro which runs all of your tests.  You generally have a workflow containing only this macro; running that workflow will run all of your tests.

Expect Equal - This macro takes two data streams (Expected and Actual) and throws a (hopefully useful) error message if they differ.

This is the macro where I know I have some work to do and where you as a community can help out.  I need any examples you can find where this tool isn't doing what you expect: data sets you think are equal but it thinks are different, and vice versa.


Expect Zero - This macro expects to receive zero records, and if it doesn't, it throws an error telling you.

Expect Error - This macro is for checking that workflows correctly error.  An essential part of testing.  If it is placed within a tool container then it expects a tool in that same tool container to throw the expected error message.  If the correct error message is not thrown then the Test Runner will mark that test as failed.
I intend to follow up with an article on how to use the macros, but to get started quickly, take a copy of the GettingStartedTemplate folder in TestMacros, rename it to something meaningful, and run the contained RunTests.yxmd to run the sample tests.

You can also take a look at the CReW macro test suite, which you can find in Samples\CReW Macros\Tests.  Again, a work in progress, but the start of what is to follow.

As always please let me know about any issues/feedback you have and hopefully see many of you at Inspire next week!

(If you are not registered already what are you waiting for? http://www.alteryx.com/inspire-2016)

Friday, 9 October 2015

CReW Macro Pack - 2015 Q3 Release

It is with great pleasure that I announce the next release of the CReW macro pack.

It is available for download now at


Or for anyone still on 9.5


It has been a long time coming I know, but I hope you enjoy the new content.  I'm pleased to say this pack has three new contributors from the community, something which I hope we can continue with future releases.

Installation Instructions


  1. Download the zip file above
  2. Extract it to a location on your machine where you want to keep the macros.  (Not in the Program Files/Alteryx folder or an existing Macros folder)
  3. Run Install.yxwz

Community Supported


These macros are not an official Alteryx release and as such they have not been through the full testing process that tools in the product have.  Some of them are prototypes and experimental, but that is some of what makes them so fun.  

If you find an issue, let me know either in the comments below or by dropping me a note personally and I'll take a look; or if you can see how to fix it yourself, then even better!

What's New?


New Macros

Wildcard XLSX Input - This macro was contributed by Joe Mako and came from a discussion on this community post.  Ever wanted to read in multiple XLSX files and worksheets?  Then this is the macro for you!  (This is the only macro in the pack currently missing from the 9.5 version, as it makes use of 10.0 features.)

Description - "Reads in multiple sheets from multiple Excel files from a directory. Optionally can filter out sheets, can pull field names from a header row, can auto configure field types, can sample N random sheets, and if the headers do not match then raw data for the sheets that did not match will be in the 'N' output."


Chi Square Test - Our second new macro author this month: this one is from Mark Frisch, or MarqueeCrew, another very active member of the new community site.

Description - "Have you created a sample set and wanted to know if there was a bias in the sample?  Random doesn't always mean you get an even quantity of data across your variables.  This test will check to see if the distribution of domain values (for a specific variable) are representative of the parent population.

Best to double check your samples before you run your tests!"
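The macro is an Alteryx workflow, but the check it performs is a standard chi-square goodness-of-fit test.  A minimal Python sketch of the statistic, assuming the sample and population are simple lists of values (function and variable names are illustrative):

```python
from collections import Counter

def chi_square_stat(sample, population):
    """Pearson chi-square statistic comparing the distribution of domain
    values in a sample against the parent population.  A large statistic
    suggests the sample is biased; 0 means perfectly representative."""
    pop_counts = Counter(population)
    sample_counts = Counter(sample)
    n, total = len(sample), len(population)
    stat = 0.0
    for value, pop_count in pop_counts.items():
        expected = n * pop_count / total          # expected count if unbiased
        observed = sample_counts.get(value, 0)
        stat += (observed - expected) ** 2 / expected
    return stat

# A perfectly representative sample gives a statistic of 0
stat = chi_square_stat(["a", "b"], ["a", "a", "b", "b"])
```

In practice the statistic would be compared against a chi-square critical value for the appropriate degrees of freedom, which the macro handles for you.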


Moving Summarize - This is a macro I wrote almost a year ago now and featured on this blog here.  But it really has been that long since the last macro pack release!

Description - "Calculate moving averages or any type of moving summarize"
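The most common use of the macro, a simple moving average, can be sketched in Python (names are illustrative; the macro itself supports any summarize operation over the window):

```python
from collections import deque

def moving_average(values, window):
    """Moving average over a fixed-size trailing window -- the most
    common configuration of the Moving Summarize macro."""
    window_vals = deque(maxlen=window)  # oldest value drops off automatically
    out = []
    for v in values:
        window_vals.append(v)
        out.append(sum(window_vals) / len(window_vals))
    return out

averages = moving_average([1, 2, 3, 4], window=2)  # [1.0, 1.5, 2.5, 3.5]
```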

Trim Mean - This macro was written by me and was inspired by the recent Alteryx roadshow in London, where a user requested this functionality, which is available in Excel.

Description - "Returns the mean of the interior of a data set. Trim Mean calculates the mean taken by excluding a percentage of data points from the top and bottom tails of a data set. You can use this macro when you wish to exclude outlying data from your analysis."
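The calculation described can be sketched in Python, in the style of Excel's TRIMMEAN (the function name and tail-splitting rule below are illustrative, not taken from the macro's internals):

```python
def trim_mean(values, percent):
    """Mean of the interior of a data set: exclude `percent` of the
    points, split evenly between the top and bottom tails."""
    values = sorted(values)
    k = int(len(values) * percent / 2)  # points to drop from each tail
    trimmed = values[k:len(values) - k] if k else values
    return sum(trimmed) / len(trimmed)

# Trimming 40% of 5 points drops 1 from each tail: mean of the middle three
m = trim_mean([1, 2, 3, 4, 100], 0.4)  # 3.0
```

Note how the outlier 100 no longer distorts the result (the plain mean would be 22).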

Bug Fixes


Missing Tool quick descriptions - A big thanks to David Zurbuchen for both spotting that these were missing and, better still, taking the time and effort to fix them.  A great example of how you can contribute to the macro pack with small but important fixes.

Runner Macro fixes - There were a number of issues with the runner macros in their first release.  Many thanks to all of the people who both reported them and spent time with me working out exactly what caused them.  There is one outstanding issue that I know of, which involves Unicode characters in the module path.

Add Totals Macro - Now works when there are spaces in the field names.

Get Involved


I want to thank again the contributions from Joe, Mark and David.  You guys have given me the motivation to get this going again.

And on that note, I want to encourage anyone else who is interested to get in touch, either in the comments here or via email.  I really want to make this a community effort and to make the releases a bit more regular.  There is plenty to do on the existing macros, samples and help files, and I also want to make some more progress on my Test Suite macros.

Till next time.

Thursday, 23 April 2015

Testing Alteryx Macros - Part 4 - Testing For Errors

At the end of my last post we decided the next test to write was

"If a user tries to total a string column they get a meaningful error message."

This is going to require some new functionality in the test framework; for this test to pass, we will need the Alteryx module to fail (and not just fail, but fail with a very specific message).  I am going to start with how a user of the test framework (i.e. a macro developer) would do this, and then we'll take a quick look at how the test framework implements this feature.

Write a Failing Test

For this test we will use a new macro in the test framework, the Expect Error macro:


And we write a test which looks like this:

  • The Text Input has a single text field called Field1 containing "Some Text"
  • The Expect Error macro has a single configuration option, "Expected Error Message", which we set to the error message we expect in this scenario: "Not possible to total Field1 as it is a non-numeric type: String"
Now if we run our test runner module we get the following output


So we get a message telling us what error message we were expecting to get from which tool(s) and then messages telling us what error messages we actually got (in this case the standard formula message about trying to use a string when a number was expected).

We can make one slight improvement to this test: we know exactly which tool we expect the error message to come from, so we should specify that in the test.  We do that by enclosing the Expect Error tool in a tool container with whichever tools we expect the error to come from:

Giving us the following log from the test runner:


Make The Test Pass

Having a failing test, we can now add functionality to the macro to make it pass, which is a fairly straightforward process:


  • We already had a Field Info tool which gives us the types of all of our fields.
  • A new Filter tool filters out any fields of type String.
  • Then a Message tool throws the correct error message to the macro user, configured like so:

Run the Test Runner again and:


Success! We have written our first unit test which tests for an error.

So How Does It Work?

I said at the start that after we had used the new functionality we would take a look at how it worked.  This detail is not necessary to be able to successfully use the test framework, but might be interesting to see what's going on behind the scenes.

The Expect Error Macro

The contents of the Expect Error Macro are probably the most boring Alteryx macro you will ever see.  It looks like this:
That's right: apart from the question, it is empty!  All of the "magic" happens in the Test Runner macro.

The Test Runner Macro

This is looking a bit more complicated than the last time we saw it, and could probably do with being refactored into a few sub-macros:


  1. The macro here looks inside each of the Test Modules and parses out the Tool XML.  It is looking for Test Modules which contain the Expect Error macro.
  2. Modules which don't contain the Expect Error macro go this top route and get run as we saw previously.  If the module errors it fails the test and reports to the output log.
  3. Modules which do contain the Expect Error macro are also run with the Runner macros, but on this route the test fails if the test module runs without errors.
  4. This section again parses the module XML and works out which error messages, and from which tools, the Expect Error macro is expecting.
  5. This join tool matches the expected error messages against the error messages which came out of the log.
  6. The final section of tools deals with reporting any mismatches between the expected and actual error messages and reporting the success or failure of the test.
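The detection in step 1 boils down to parsing each test module's XML and looking for a node that references the Expect Error macro.  A rough Python sketch of that idea follows; the XML below is a heavily simplified stand-in for a real .yxmd file, and the element and attribute names are illustrative.

```python
import xml.etree.ElementTree as ET

def expects_error(workflow_xml):
    """Return True if the workflow XML references the Expect Error macro,
    mirroring step 1 of the Test Runner's branching logic."""
    root = ET.fromstring(workflow_xml)
    for node in root.iter("EngineSettings"):
        if "Expect Error" in node.get("Macro", ""):
            return True
    return False

# Simplified stand-in for a test module's XML
xml = """<AlteryxDocument>
  <Nodes>
    <Node ToolID="1">
      <EngineSettings Macro="Expect Error.yxmc" />
    </Node>
  </Nodes>
</AlteryxDocument>"""
found = expects_error(xml)  # True
```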

Wednesday, 18 March 2015

Testing Alteryx Macros - Part 3 - Comparing Data Values

OK, time to write some more tests.  But first we need to add some more functionality to our "Expect Equal" test macro.  If you recall from our last post, currently all it does is check that the column names match between the expected and actual data.  Our next series of tests will consider the data in those tables, which is going to need some new functionality to tell us if the data differs.

It looks like this:

The old functionality is still there above these tools, but I omitted it from the screenshot for clarity.

So what is this doing?  Well, the Record ID tools add a row number to both datasets, and then the Transpose tool uses this as a key, with all other fields (importantly, including unknowns) as data fields.

The Join tool then joins the two streams together on the row number and field name.

Finally the message tool is configured:


Whenever the data differs we get a nice error message:

Message (30) Unexpected Value - Field:Total Row:1 Expected:1 Actual:0
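The RecordID / Transpose / Join chain above can be sketched in Python: number the rows, pivot each cell into a (row, field) key, then compare cell by cell.  The function name is illustrative, and rows are assumed to be dictionaries.

```python
def compare_streams(expected, actual):
    """Mimic the Expect Equal comparison: number the rows, pivot each
    (row, field) pair into a key, then join the two streams and report
    mismatches in the same style as the macro's message."""
    def to_cells(rows):
        # RecordID + Transpose: one (row, field) -> value entry per cell
        return {(i, f): v for i, row in enumerate(rows, 1) for f, v in row.items()}

    e_cells, a_cells = to_cells(expected), to_cells(actual)
    messages = []
    for (row, field), e_val in e_cells.items():     # the Join on row + field
        a_val = a_cells.get((row, field))
        if a_val != e_val:
            messages.append(f"Unexpected Value - Field:{field} Row:{row} "
                            f"Expected:{e_val} Actual:{a_val}")
    return messages

msgs = compare_streams([{"Total": 1}], [{"Total": 0}])
# ["Unexpected Value - Field:Total Row:1 Expected:1 Actual:0"]
```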

With this in place we can write our next tests.  So just a quick reminder of the TDD work cycle:
  1. Write a test for a piece of functionality which will fail.
  2. Write some code which makes the test pass.
  3. (Optionally) Refactor your code, without adding any new functionality and without breaking any tests.
and we are off.

Write a Failing Test

The next smallest piece of functionality I could think of is this:

"Single Field Named Field1 With Value 1 Gives Total of 1"

I add a new module to my test folder called Single Field Named Field1 With Value 1 Gives Total of 1.yxmd and copy the module from my first test


All I change is the incoming test data to

and the Expected data to

Then I save the module and run my test runner which gives me this log


Great: a failing test with a clear and precise error message.

Make The Test Pass

So next, the smallest amount of code to make the test pass.  Easy: open up our Add Totals macro


And just change the hard-coded formula value from 0 to 1.  Run the tests again



Success!

Write a Failing Test

Well, I can quickly see where my current implementation is going to fall down: what if my user's data contains something other than the value 1?  So let us write a test to exercise that:

"Single Field Named Field1 With Values 1,2,3 Gives Totals 1,2,3"

So again copy and paste one of our existing tests and update the data

Incoming test data

Expected data
And run the tests

It fails as expected, and again with a pleasing and exact error message.

Make The Test Pass

Again easy enough to make this test pass.  We just update our formula to [Field1]

and


Oh dear.  Our new test is passing, but our first test "Adds A Column Called Total" has failed.  What's happened?

The error message is telling us that our new formula is giving a string result.  If we recall, in our first test our fields didn't have any data in them at all.  This means the Text Input tool has no data with which to determine the type of Field1, so it defaults to a type of V_WString.  So yes, trying to use that in a formula which should result in a double is an error.

But that's not a very good error for users of our macro: it refers to internal tools in our macro and doesn't tell the user how to fix the problem.  We can do better than that, but not right now.  Right now we are working on making our tests pass having added a new one; we shouldn't be adding any new functionality.

In TDD it is often helpful to keep a small list somewhere (I tend to scribble on a pad on my desk) of tests that you are going to add in the near future; then, when you find things like this, you can simply add them to the list and get back to the actual problem you are working on.  So we'll do just that:

Tests To Add
- If a user tries to total a string column they get a meaningful error message.

This is some of the strength of the TDD approach: you explore the ways in which your macro will be used, and can break, as you develop it.

So, back to the problem at hand.  Our first test was never meant to test anything about wrong field types; it was just supposed to check that the Total column got added.  So let's fix up the test with a couple of Select tools to force our columns to numeric types

and


All good!

Write a Failing Test

"Two columns Field1, Field2 With Values 1,2,3 Gives Totals 2,4,6"

Again the test looks identical to our previous ones just with updated Test and Expected data.



Make The Test Pass

Now we're beginning to get to some real implementation.  The naive fix of updating our formula to [Field1] + [Field2] isn't going to cut it, as it would fail on the tests which don't have Field2.

I need to dynamically get the names of my fields, use them to build up my formula and then apply that formula to my data. 

Which implemented looks like this

Quick overview from left to right:
  • The field info tool gets all of the field names
  • The summarize tool builds the total formula by concatenating the field names with a + separator
  • The formula tool sets up the Type, Size and Field Name that we need for the dynamic formula tool
  • The dynamic formula tool applies the formula to the data
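The chain above can be sketched in Python to show the idea, assuming rows are dictionaries of numeric fields; the function name is illustrative and the real implementation is, of course, the Alteryx tools just described.

```python
def add_totals(rows):
    """Dynamically total all fields in each row: discover the field
    names, build the expression, and apply it -- the same idea as the
    Field Info + Summarize + Dynamic Formula chain."""
    if not rows:
        return rows
    field_names = list(rows[0].keys())  # Field Info: get all field names
    # Summarize: the expression the Dynamic Formula tool would receive,
    # e.g. "[Field1] + [Field2]" (built here only for illustration)
    formula = " + ".join(f"[{name}]" for name in field_names)
    for row in rows:
        row["Total"] = sum(row[name] for name in field_names)  # Dynamic Formula
    return rows

out = add_totals([{"Field1": 1, "Field2": 1}, {"Field1": 2, "Field2": 2}])
# Totals: 2 and 4
```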
Run the tests


and not only do we know we have implemented our latest piece of functionality, but we can also see that we have not broken any functionality that we previously added.

Next Time

Now we are getting something that is beginning to look useful.  Looking at my list of future tests, I think this is a good point to tackle that better error message for when a user tries to total a string field, which we will cover in my next post.