SchemaPuker: How it came to be

If you haven’t seen my post about SchemaPuker, check it out here.

The story begins last year, when a colleague of mine, David Everitt, built a handy tool for generating ERDs. It was essentially a visualforce page / controller that let you choose objects, then output text in the format of a PostgreSQL schema file that you could import into Lucidchart.

PostgreSQL schema files are relatively easy to generate (as they are essentially plain text) and Lucidchart was the diagramming tool of choice where we worked, so this all made sense.
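The output format is simple enough to sketch. Here is a minimal illustration, with hypothetical object and field names, of how one object might be rendered as a PostgreSQL-style CREATE TABLE statement:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SchemaSketch {

    // Render one object as a PostgreSQL-style CREATE TABLE statement,
    // which a diagramming tool can import as an ERD entity.
    static String toCreateTable(String objectName, Map<String, String> fields) {
        StringBuilder sb = new StringBuilder("CREATE TABLE " + objectName + " (\n");
        int i = 0;
        for (Map.Entry<String, String> f : fields.entrySet()) {
            sb.append("  ").append(f.getKey()).append(" ").append(f.getValue());
            if (++i < fields.size()) sb.append(",");
            sb.append("\n");
        }
        return sb.append(");").toString();
    }

    public static void main(String[] args) {
        // Hypothetical Contact fields; the real tool derives these from the org.
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("Id", "varchar(18)");
        fields.put("Name", "varchar(80)");
        fields.put("AccountId", "varchar(18)");
        System.out.println(toCreateTable("Contact", fields));
    }
}
```

Because the output is just text, generating it is the easy part; the diagramming tool does the rest.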

I saw this and thought it was a brilliant idea. ERDs are very often part of design documents, proposals and the like. Even if you are building new functionality, you are often using some or all of the existing data model, so having an easy way to get this out of salesforce was very helpful.

You can read more about David’s tool at his blog, SlightlyTechnical, including how to try it yourself.

However, a visualforce page / apex class has its limitations.

  • If you were doing a discovery, perhaps you don’t have credentials for the organisation you need to chart, or if you do, perhaps you don’t have a sandbox, or permission to install anything in one
  • If you do have credentials and a sandbox, you then need to add the visualforce page and controller into the org
  • It would just output the results into the page itself, making them harder to import into your charting tool

So I decided I would make a new version of the tool, plus it was a good excuse to play with the salesforce metadata API, which I hadn’t had a lot of exposure to at the time.

I decided to throw together a Java application to do this. I had written plenty of little console-based apps in the past, but had never done anything with a GUI, so this was yet another learning opportunity. I built the app using Swing and the WSC, utilising the metadata API, with the SOAP API handling authentication.

The application worked fine and had all the same functionality as its visualforce counterpart, with the added bonus that it would generate a text file, rather than display the output. After that, I got busy with life and forgot about it all.

This year, after giving my blog a bit of a refresh and thinking about what I could write about, I remembered the tool. I dug out the source code, looked at it, cringed, and thought about how I could make it better.

The obvious solution here was a cloud-based app: something that required no installation or setup, and was easy to use. Given that I already had my previous iteration written in Java (and Java is the language I am most comfortable with), heroku seemed like the best fit for hosting.

Life got in the way again, and it wasn’t till after a trip to surfforce (see my writeup here) and a discussion with Dave Carroll from salesforce that I thought about it again.

Dave was telling me about the work he had done on the cli, and the plans to extend the tool. I told him about my tool, at the time named ‘Salesforce ERD Tool’, which I was planning to move to heroku. He suggested (quite rightly) that that was a rather boring name and came up with the idea of calling it ‘SchemaPuker’, and the name was born.

After surfforce I decided I would tackle this. I had never written a Java web app, used a web framework, or deployed anything to heroku before. So, with yet another great learning opportunity in front of me, I set about learning how to do this.

I chose Spring MVC as my framework, mostly due to the huge amount of documentation available for it and its uncanny similarity to visualforce, plus Spring Boot, which made testing the app locally *really* easy and did away with XML config files.

I decided to use the salesforce lightning design system for the UI of my application; it looks nice and there is an excellent guide available for it.

Next, was taking a look at authorisation. My previous tool used the SOAP API for authorisation, however this was not going to be suitable here. Using OAuth2 made much more sense (so much so that I made a post about it here).


Once I had authorisation sorted out, I was able to reuse most of the core of my original application, and once I had the UI tidied up, I had a minimum viable product. I do have some ideas for enhancements for the next version, such as graphical output, stored groups of objects and a better interface for choosing objects.

SchemaPuker: ERDs made easy

SchemaPuker can be accessed here:

Read on for more information about SchemaPuker!

Often, we need to produce diagrams of our organisation’s data model (aka. ERDs). This will be especially true for those of us who are consultants.

Perhaps you are doing a discovery or analysis and need a copy of the current data model; maybe you need a ‘current state’ and a ‘to be’ for comparison; perhaps you are designing new functionality that connects with an existing data model, or documenting functionality after completion.

Now, salesforce does have a tool to visualise the data model, called Schema Builder; however, it cannot export the model, nor can it be customised without actually changing the data model itself.

To solve this problem, I came up with… SchemaPuker! (Thanks to Dave Carroll for the name, and to David Everitt for the idea in the first place!) For more about how it came to be, and the name, click here

But for now, SchemaPuker is a fairly simple tool: it allows you to authorise with salesforce, get a list of your objects and export them as a PostgreSQL schema file. This file can be imported into Lucidchart (and other tools) in order to generate an editable ERD.

The tool itself is very simple to use. First, navigate to the site, choose whether you are using a Production/Developer Org or a Sandbox and click ‘Login’. You will then be asked to enter your salesforce credentials and to authorise SchemaPuker to access your org.


Once authorised, you will be given a list of objects inside your salesforce org. Select the objects you wish to include in your ERD by holding down command (or ctrl on windows/linux) and clicking, or by typing the API names in the ‘Selected Objects’ box.


Once you click submit, you are given the PostgreSQL schema. You can either copy/paste this into Lucidchart, or click the ‘Download’ button below the output.


Next, log in to Lucidchart and create a new drawing, click ‘More Shapes’ at the bottom and then tick ‘Entity Relationship’ and press ‘Save’


Now, you can either import the downloaded file from SchemaPuker by pressing ‘Choose File’, or paste the output into the box below. You can ignore steps one and two in the import window.


You will now see your salesforce objects in the sidebar just under the ‘Entity Relationship’ panel. You can drag the objects onto the canvas, and the relationships between them will be automatically created.


You can add new shapes from the ‘Entity Relationship’ panel to extend your ERD as required.

That’s it! Please try it out and let me know how you go!

Please Note: This is still very much a beta, and is a ‘minimum viable product’. However, I am working to improve it on a regular basis and would love to hear your thoughts.
It is limited to ~30 objects per export and may crash in fun and exciting ways. The app does *not* store any data, nor does it make *any* changes to your salesforce org.

Fun with OAuth2

OAuth2 is a magical thing: it makes it *very* easy for users to log in to your application without sharing their credentials with it. The actual authorisation of the user is handed over to the service they are authenticating against (e.g. Facebook, Twitter, Salesforce) and you are given an ‘access token’ with which you can make requests to the service. For more on OAuth, there is a good explainer here.
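For salesforce specifically, the first leg of the OAuth2 web server flow is just a browser redirect to the authorise endpoint. A minimal sketch of building that URL follows; the consumer key and callback are placeholders, not real values:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class AuthUrl {

    // Build the salesforce OAuth2 authorisation URL (web server flow).
    // The user is sent here, logs in on salesforce's own page, and is
    // redirected back to the callback with a one-time code.
    static String authorizeUrl(String loginBase, String clientId, String callback) {
        return loginBase + "/services/oauth2/authorize"
                + "?response_type=code"
                + "&client_id=" + URLEncoder.encode(clientId, StandardCharsets.UTF_8)
                + "&redirect_uri=" + URLEncoder.encode(callback, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Placeholder consumer key and callback, for illustration only.
        System.out.println(authorizeUrl("https://login.salesforce.com",
                "MY_CONSUMER_KEY", "https://myapp.example.com/callback"));
    }
}
```

The code that comes back is then exchanged for an access token at the token endpoint, which is the part a library like Scribe handles for you.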

At the moment, I am working on an application that I hope will be useful for some of you. This application needs to authenticate to salesforce in order to use its APIs.

The last time I did salesforce auth, I used the Login/Password/Token method via the SOAP API. This method works, but it’s not ideal for a webapp: it’s fairly clunky, requires my app to handle the actual credentials and usually needs a security token. It has huge potential to be insecure and is a bad user experience.

So after much looking around, trying, failing, googling, etc. I finally found something brilliant… the Scribe library. It handles the actual OAuth bits, which allows my login code to be very, very tiny.

The next piece of the puzzle is what to do with the returned JSON; unfortunately the Scribe library struggles to parse it. In order to access the APIs I am using the WSC, which uses a ‘ConnectorConfig’ object to pass authentication details when it makes calls. So I needed a way to take the JSON returned from OAuth and turn it into a ‘ConnectorConfig’ object that I can use with the WSC.

This was actually pretty straightforward: I simply deserialize the JSON to an object using the Google GSON library and construct the ‘ConnectorConfig’ from the result.
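The shape of that step can be sketched as below. The real app does this with Gson; to keep the snippet dependency-free I have used a regex stand-in, pulling out only the two fields the WSC needs: access_token (which becomes the session id) and instance_url (the base of the service endpoint).

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TokenResponse {

    final String accessToken;
    final String instanceUrl;

    TokenResponse(String accessToken, String instanceUrl) {
        this.accessToken = accessToken;
        this.instanceUrl = instanceUrl;
    }

    // Extract a string field from the OAuth JSON response. With Gson this
    // whole class collapses to gson.fromJson(json, TokenResponse.class);
    // a regex keeps the sketch self-contained.
    static String field(String json, String name) {
        Matcher m = Pattern.compile("\"" + name + "\"\\s*:\\s*\"([^\"]+)\"").matcher(json);
        return m.find() ? m.group(1) : null;
    }

    static TokenResponse parse(String json) {
        return new TokenResponse(field(json, "access_token"), field(json, "instance_url"));
    }
}
```

From there, the ConnectorConfig is built by setting the session id to the access token and pointing the service endpoint at the instance URL.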

Once I have a connector config, I can make API calls with the WSC and build the rest of my application. I hope that if someone is in the same boat as I was last week that this post helps them out.

Feel free to leave any comments below 🙂

Salesforce Community Events – Surfforce

If you’re in the salesforce space, no doubt you have heard of some of their events, the biggest and most well known being Dreamforce. Perhaps you’ve been to a Dreamforce, or a World Tour, or one of the other official events.

But perhaps something you didn’t know about are salesforce ‘Community Events’. These are events that are not run by salesforce themselves; rather, they are organised by the community (often sponsored by partners, ISVs, etc). Community Events are relatively new in the space, but they are picking up pace quickly; an excellent example of this was the London’s Calling event here in the UK (which I unfortunately didn’t make it to… next year!)

So why am I talking about community events? Well, I went to my first one recently – Surfforce.

Surfforce was billed as ‘a salesforce user group with a difference’ and it certainly was. Held in Aberavon, Wales, the basic idea of the event was ‘let’s go for a surf in the morning, then talk salesforce in the afternoon’. It was the brainchild of Shaun Holmes, whose passion for helping others, the community and salesforce is incredible.

The event was aimed at people new to the salesforce community, with several excellent speakers sharing their journeys within the salesforce world. As well as this, there was a focus on helping local charities.

I only found out about the event about a week before, so I was lucky to be able to organise the trip down with Scott. Given my late arrival to the party, I was not able to secure a spot in the surfing portion of the day, which suited me fine; I am from Australia after all, and the water temperature in Wales was a little different to what I am used to!

When those brave enough to strap on a wetsuit were finished in the ocean, it was time for lunch, networking and chatting with sponsors.


After lunch, we were treated to some excellent talks. First was Danielle from the Wave Project, who took us through what the project was about and the amazing impact it has had on the kids in need who were able to take part. They provide ‘surf therapy’, teaching kids how to surf and helping them with mental health issues such as anxiety and depression. Thanks to Surfforce, over £500 was raised to help them in their efforts, and 15 kids got the opportunity to take part in a surf lesson at the same time as the Surfforce attendees.

We next heard from Anna, a local businesswoman and entrepreneur, who spoke of her humble beginnings in Poland during the cold war, how she made the most of what she had, and how she kept challenging herself to be better and better. She has won multiple awards and is CEO of two successful companies; her talk was definitely inspiring.

We then heard from Dave and Mike from salesforce, both very early employees. They gave a very informative presentation covering the journey salesforce as a company has been on: from having a handful of customers in 1999, none of whom had to pay for licensing for the first year (interesting side note: one of these early adopters was a previous employer of mine), to launching the AppExchange, which ran on a box under Dave’s desk and was originally called the App Store (sound familiar?), to the multi-billion-dollar success they are today. This was a very interesting talk, and if you get a chance to see or watch it I would highly recommend it.

After a short break, we heard from several more excellent speakers. The first, Louise, spoke of her personal journey from someone who had no experience with salesforce (or computers really; she has a background in literature) to becoming an Awesome Admin. She spoke of how she found she had more of an interest in the systems she was working with than in the actual work itself, and how, when she had salesforce ‘forced’ upon her, she decided she would learn as much as possible and make a go of it. Louise described how much of a help the salesforce community has been to her, the sheer volume of resources out there, and how inclusive and helpful people were.

Next up, Antonia took us through the journey that brought her to the position she is in today (Lead Consultant), and how her journey and the salesforce community are anything but boring. She explained that on the salesforce platform, anyone who wants to try can become a developer, thanks to the supportive community, excellent declarative tools and wealth of documentation.

Finally, Jodi spoke to us about her journey from Salesforce Admin to Consultant, with a presentation made entirely of GIFs (no death by PowerPoint here!). She spoke of how she was constantly looking for new challenges: from being an Administrator, to setting up a Centre of Excellence, to finally making the move into the consulting world. Her journey in particular is one that I think a lot of people in the salesforce consultancy world will be familiar with (I know I am; I started my salesforce journey as an Admin back in 2008).

Proceedings ended with drinks and networking. Despite not being the exact target audience, I think I got quite a lot out of attending Surfforce: I met a load of amazing people and got to be involved with what I think is an excellent concept.

Shaun, Kerry, the speakers, volunteers, sponsors and everyone else who worked so hard to get this event up and running deserve a huge pat on the back for what they achieved with this event. I think the Surfforce concept would fit in perfectly back home in Australia. Something like this could be easily done in both the Gold Coast and Sydney, and given the salesforce community in Australia, could be very successful. I hope that someone seriously considers this concept, and that the next event is even bigger and more successful than the last.

I think the concept of community events is a great one, it goes to show how inclusive the salesforce community is as a whole and how excited people are about the platform. Surfforce may have been my first community event, but it most definitely won’t be my last.



Kittenforce! aka. telling your users when your instance is down for maintenance

The other day, Scott (check out his blog here) and I were at work chatting about the security trailhead superbadge (specifically, my domain). When you have a custom domain for your salesforce instance, you can customise your login page (or replace it entirely).

I then decided that kittens would make the login page far better, and hence;

After this, I went to login to a sandbox to do some actual work, only to be greeted with the ‘Please check your username and password. If you still can’t log in, contact your Salesforce administrator.’ message.

I was fairly sure I hadn’t forgotten my password, so I tried it again… Nope, same thing.

What I had forgotten, was the fact that the daily deployment to that environment was happening, and as such all users except for the DevOps team were frozen out.

Which got me thinking… If I can put kittens on the login page, then why not some useful information too.

So, that evening I built this;

The concept is fairly simple: when you put an environment into ‘Maintenance’ mode (e.g. during a deployment), it freezes all users, excluding a defined list (e.g. the DevOps team, system admins), and changes the login page to show a message informing users of this.

When you are finished and disable maintenance mode, it will unfreeze all users and change the login page message back.

It uses a custom object to store a list of users who were frozen before the environment entered maintenance mode to ensure they stay frozen once the environment is changed back to normal mode.
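The bookkeeping behind this is plain set arithmetic, sketched below in Java rather than the Apex the package actually uses, with user ids as simple strings:

```java
import java.util.HashSet;
import java.util.Set;

public class MaintenanceMode {

    // Entering maintenance: freeze everyone except the exempt list and those
    // already frozen beforehand (the already-frozen set is what the custom
    // object records, so those users stay frozen when maintenance ends).
    static Set<String> toFreeze(Set<String> allUsers, Set<String> exempt,
                                Set<String> alreadyFrozen) {
        Set<String> result = new HashSet<>(allUsers);
        result.removeAll(exempt);
        result.removeAll(alreadyFrozen);
        return result;
    }

    // Leaving maintenance: unfreeze only the users we froze ourselves.
    static Set<String> toUnfreeze(Set<String> frozenDuringMaintenance,
                                  Set<String> alreadyFrozen) {
        Set<String> result = new HashSet<>(frozenDuringMaintenance);
        result.removeAll(alreadyFrozen);
        return result;
    }
}
```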

The actual page itself is hosted from a site and is configured via a custom setting and custom metadata, which also allows the pages to be overridden by other pages.

If you would like to try this in your org, click here for the unmanaged package

For installation instructions, see this post.

I would love to hear any feedback you have, feel free to comment below.

Maintenance Mode Package Installation Guide

For more information about this, see this post.



Install the package in the org you wish to try it in (can be a sandbox)
Ensure the package is installed for admins only


Once installed, go to the ‘Maintenance Config’ tab by clicking the ‘+’ button on the tab bar and click the ‘Perform Initial Setup’ button. You only need to click this once.



Once this has completed successfully, you will see the below;


Next, go to Setup > Develop > Sites

If you have a site already, you can skip the next two steps


Enter a domain name you wish to use (remember, you cannot modify this once it’s been set) and press ‘Check Availability’. Once you have chosen a domain that is available, accept the terms and conditions and press ‘Register My Domain’


Now, click ‘New’ and configure the site as follows;


Site Label: Maintenance Site

Active Site Home Page: SFDCMaintLoginSidebar

Default Web Address: maint

Clickjack Protection Level: allow framing by any page


Accept the defaults for the rest of the fields

If you have an existing site, under ‘Site Visualforce Pages’ click ‘Edit’


and then add the ‘SFDCMaintLoginSidebar’ page to the ‘Enabled Visualforce Pages’ section


Ensure you have activated your site when you are finished.

Once you have done this, go to Setup > Administer > Domain Management > My Domain

Under the ‘Authentication Configuration’ Heading, click ‘Edit’


Populate the ‘Right Frame URL’ with the URL of your site and press save.


Once you have completed these steps, log out of salesforce to check your changes have worked.

Make sure you login using your domain (not


Once logged in, go to Setup > Develop > Custom Metadata Types

Select ‘Manage Records’ next to ‘Salesforce Maintenance Exempt User’


Create a new record, and populate the details of the user you want to exclude from maintenance freezes (e.g. DevOps, System Admin)

Once complete, go to the ‘Maintenance Configuration’ tab and press ‘Enable Maintenance Mode’


Any users who are not exempt will be frozen, and you will see a different login page



DIY Dynamic DNS with DreamHost


I recently got a new internet connection from Hyperoptic… which is night and day compared to the rubbish Sky ADSL2+ I had before (6/1mbit vs 1000/1000mbit). As such, it is now practical to run servers from my home connection. (for personal use only, if you’re reading this hyperoptic 😉 )

When I was living in Australia and had a decent internet connection, I typically used DynDNS as it was free (it no longer is) and was supported by my router.

Now that DynDNS isn’t free, and my new router doesn’t support it, I had to find another solution.

I use DreamHost for my web hosting (i.e this site), so I wondered if there was a way to leverage their DNS dynamically.

As it turns out, DreamHost have an API, which has support for creation/deletion/etc of DNS records, and there is some documentation on this on their wiki.

From this, I decided to build my own Dynamic DNS daemon in Java. The daemon runs on my Raspberry Pi 3.

The daemon itself is pretty simple, at a set interval it checks my external IP (using this amazon service) and then checks the DNS record for the domain I specify. If it’s the same, nothing is done. If the IP is different, the old record is removed and a new one added. If there is no record there, a new one is added.
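The decision the daemon makes each cycle can be sketched as a small pure function (the names here are mine for illustration, not the daemon’s actual code):

```java
public class DnsSync {

    enum Action { NONE, ADD, REPLACE }

    // Compare the current external IP with the existing DNS record
    // (null when no record exists) and decide what to do this cycle.
    // The daemon performs REPLACE by removing the old record and
    // adding a new one via the DreamHost API.
    static Action decide(String externalIp, String recordIp) {
        if (recordIp == null) return Action.ADD;
        if (recordIp.equals(externalIp)) return Action.NONE;
        return Action.REPLACE;
    }
}
```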

It uses a simple JSON file for storing preferences. This version doesn’t have much in the way of logging or error handling… But it was written in a few hours and is very much a work in progress.

If you want to take a look at the source/fork it, it is available on my github here.

If you just want to run it, you can download it from here and then use: java -jar dh-dyn-dns.jar to start it.

Adding and changing a record. As you can see, it’s pretty basic.

If you want to run it as a service on *nix, download this and follow the instructions below (this service *might* also work on windows, however I haven’t tested it):

  1. You will need a DreamHost API key, which you can get here; when setting it up, only choose the ‘All DNS functions’ option
  2. Then you will need to install Apache jsvc. If you are using ubuntu/raspbian/debian/etc. run the following command;
    sudo apt-get install jsvc
  3. Extract the zip to your home directory
  4. Edit the ‘preferences.json’ file to match your configuration
  5. If you are using a Raspberry Pi, you should be good to go. You can start the service by running the below command;
    ./dh-dyn-dns start
  6. If you are using another configuration, you will need to edit the dh-dyn-dns file and set the correct $JAVA_HOME path.

You can then add the service to init.d (for raspbian/ubuntu/etc) if you want it to start automatically on boot.

Feel free to leave a comment below if you have any questions/comments/ideas 🙂


SOQL to CSV Utility

Recently, I needed to come up with a way to make some simple case metrics available to users; these metrics needed to include cases that the users may not necessarily have visibility to.

Now, there are a couple of options here;

  • Dashboard
  • Scheduled report
  • Something custom

Now, the first two options are the ones I would normally use: both dashboards and reports can be scheduled to run as a user that has access to all cases and be emailed to those who need to see them, with dashboards having the added advantage that they can be viewed at any time.

But what if you need the data in Excel? Well, you could copy/paste from the scheduled email, or get someone to export it for you. But neither is a great solution.

So you could build something custom, which is what I have done here;


Now, this tool on its own isn’t much use to your average end user, as a working knowledge of SOQL is required to use it. However, it can very easily be extended to store frequently used ‘exports’ in a custom setting, use custom permissions to grant access to the tool, attach the resulting CSV to a record, or send the CSV via email at a set interval using schedulable apex.

Here is how it works;

The SOQL query entered in the text box is run against the database, returning a list of SObjects. Once we have this list we need to construct a CSV.

The first line of the CSV is the column headings, taken from the text box on the page (or alternatively, the API names of the fields entered as part of the query string). For the sake of simplicity, I have simply extracted these fields from the SOQL query (the fields in the list are always in the same order as in the query), however you could use SObjectDescribe to do fancier things here.
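That extraction can be sketched like so (in Java rather than Apex, and deliberately naive: it assumes a simple flat query and ignores subqueries, aggregates and aliases):

```java
import java.util.ArrayList;
import java.util.List;

public class SoqlFields {

    // Pull the field list out of a simple "SELECT a, b FROM X" query.
    static List<String> fieldNames(String soql) {
        String upper = soql.toUpperCase();
        int start = upper.indexOf("SELECT") + "SELECT".length();
        int end = upper.indexOf(" FROM ");
        List<String> fields = new ArrayList<>();
        for (String f : soql.substring(start, end).split(",")) {
            fields.add(f.trim());
        }
        return fields;
    }
}
```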

The code then loops through the list of SObjects and, for each one, the list of field names, writing one line of the CSV per record and adding a new line at the end:

    for (Sobject s : sobjects) {
        for (String fn : fieldNames) {
            csvReport += '"' + s.get(fn) + '"' + ',';
        }
        csvReport += '\r\n';
    }

We then return the CSV as a string to the VisualForce controller and display it on a page. In order to make the page ‘render as’ a CSV, rather than HTML or text, we need to set the contentType attribute of <apex:page>; this also allows us to set the filename of the downloaded CSV:

<apex:page controller="SOQLCSVController"
    contentType="application/csv#{!fileName}.csv" cache="true" showHeader="false">
    <apex:outputText value="{!stringCSV}" />
</apex:page>

That really is it: on pressing the Download button, the user is sent to the Download VisualForce page, which renders as a CSV and is downloaded by the browser.

If you wanted to send the generated CSV via email, or attach it to a record, you could simply convert the string to a Blob and attach the result to an email or record.

The complete code for this example is sitting on my github, feel free to use it and extend it as you wish.

Siri, turn on the light (Part 1)


As you may or may not know, Apple has a technology called HomeKit. Put simply, it allows you to control devices (lights, heating, etc) from your iPhone, and it integrates with Siri.

Several companies make HomeKit enabled devices (e.g. Philips Hue) that you can easily install to take advantage of this… but where is the fun in that?

Luckily, there exists a very good HomeKit Java library, so armed with that and an ESP8266, I set out to see if I could make my own HomeKit enabled devices.

The rough plan was to have the ESP8266 handle the hardware side of things (actually turning things on and off, via a relay) and to build a Java application that acts as a HomeKit bridge.

The first version of the application I built was very basic; it essentially used the example code provided with the library. One thing I quickly learnt is that HomeKit is more like a database than an actual application: you need an app to add entries into the database in order to use it.

Elgato Eve is one such app (designed to work with their devices), or there is Home, which is designed to work with any HomeKit device (commercial or otherwise). Another option is Apple’s own example code, called HomeKit Catalog; if you have a Mac with Xcode installed, you can install this on your device (see here for instructions).

I went with HomeKit Catalog, so with that installed and running I was able to test my application. Initially it worked, but it would not maintain pairing if the Java application was stopped and restarted, despite me persisting the data required for this. I solved that issue (a method returning whether a device had been paired was not implemented correctly) and was able to move on to the next problem…

How to communicate with the ESP8266 (and actually turn things on and off!). I decided to go with the MQTT protocol for this; it’s a lightweight and fairly secure publish/subscribe protocol that is often used in low-bandwidth/low-power applications. The ESP8266 is a very powerful piece of kit, so by using this I have plenty of headroom for other things.

MQTT uses a client/server architecture, meaning I would need an MQTT broker (server) running; both my Java application and my ESP8266 would be clients, with the ESP8266 listening for publications from the Java application in order to know when to turn the attached relay on and off. Eclipse make a very good MQTT client library (and application) called Paho that I used in my project. For my MQTT broker I used Moquette (which I could integrate into my Java application at a later date).

Next, I had to program the ESP8266. I have done a lot of tinkering with Arduinos in the past, and since you can now program an ESP8266 with the Arduino IDE, I used that. The sketch is fairly simple: it listens to a topic (A or B) for a value of 1 or 0 (on or off) and sets a GPIO pin high or low, triggering a relay.
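The routing the sketch does can be illustrated as follows (in Java for consistency with the rest of the project, rather than the Arduino C++ the ESP8266 actually runs; the topic names and pin numbers are hypothetical):

```java
import java.util.HashMap;
import java.util.Map;

public class RelayRouter {

    // Hypothetical mapping from MQTT topic to GPIO pin.
    static final Map<String, Integer> TOPIC_TO_PIN = new HashMap<>();
    static {
        TOPIC_TO_PIN.put("A", 12);
        TOPIC_TO_PIN.put("B", 13);
    }

    // Given an incoming publication, return {pin, state} to drive the relay,
    // or null for topics/payloads that should be ignored.
    static int[] route(String topic, String payload) {
        Integer pin = TOPIC_TO_PIN.get(topic);
        if (pin == null || !(payload.equals("0") || payload.equals("1"))) {
            return null;
        }
        return new int[] { pin, Integer.parseInt(payload) };
    }
}
```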

So after adding the Paho library to my Java application, I had a means to communicate easily between the Java application and the ESP8266, so time to put it all together and try it out.

And it worked! The Java application is running on my MacBook, so the next step is to get this running on my raspberry pi (and perhaps integrate control of the GPIO pins) as well as support for devices other than simple switches.

I am also working on a design for an ESP8266 relay board that I will talk about in more detail in a future post.

If you want to try this for yourself, the code is available on my github, click here for the Java application and here for the ESP8266 sketch.

A word of warning: working with mains electricity (in this case 240v) is dangerous! Do not attempt this unless you know what you are doing; regardless, I take no responsibility for your actions and any damage caused by them.

Australia Day 2015: Slip ‘n Slide + Salesforce

NOTE: This is a previously unpublished post from 2015

When you think of Australia day what would normally come to mind? Celebrating with mates? A BBQ and (often too many) beers? All of these things are pretty standard when it comes to celebrating Australia’s national holiday… but one thing that probably doesn’t come to mind is Salesforce.

For those of you outside of Australia, Australia Day is a national holiday celebrating the foundation of Australia. Like most things here, it’s a pretty relaxed affair. A newer tradition is to listen to the Triple J Hottest 100, a crowd sourced music countdown on Australia’s national alternative radio station. Something else a lot of Aussies do, is setup a backyard slip-n-slide.

So, what does this have to do with Salesforce, you ask. Well… every year a couple of my mates who are electrical engineers host an Australia day party… and being engineers, they couldn’t just settle for any old backyard slip-n-slide… No, their slip-n-slide had a full timing system, leaderboards and a drag racing style Christmas tree! The previous iterations of the slide lacked analytics and a way to track who performed each ride… enter Salesforce.

This year I jumped in on the project and using Salesforce, provided a way to track each person’s runs, display all run-times easily and allow for data from other sources (e.g. the currently playing song on the Triple J Hottest 100) to merge with our data.

Every time a person wanted to use the slide, they’d scan their unique barcode, wait for the traffic light to go green, and then do their run. The Christmas tree would not go green until the course was clear. A series of six sensors placed along the slide would report data back to a microcontroller (in this case, a Teensy 3.0). This data was calculated into times and pushed into Salesforce by a Processing sketch. The Processing sketch also handled scanning of barcodes and control of the Christmas tree.
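The timing side is simple arithmetic: each beam records a timestamp, sector times are the differences between consecutive beams, and the total is the last minus the first. A sketch (the names are my own, not the project’s actual code):

```java
public class RunTiming {

    // Sector times are the gaps between consecutive beam timestamps (ms).
    static long[] sectorTimes(long[] beamMillis) {
        long[] sectors = new long[beamMillis.length - 1];
        for (int i = 1; i < beamMillis.length; i++) {
            sectors[i - 1] = beamMillis[i] - beamMillis[i - 1];
        }
        return sectors;
    }

    // Total run time: last beam minus first beam.
    static long totalTime(long[] beamMillis) {
        return beamMillis[beamMillis.length - 1] - beamMillis[0];
    }
}
```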

Each run down the slide was recorded inside Salesforce, including sector times, total time and speed, with each run related to the current rider’s contact record as well as the currently playing song on Triple J.

So how did we achieve this? Each time a person scanned their barcode the data was recorded by the Processing sketch, and once a run had been completed the barcode number, along with all of the timing data, was pushed into Salesforce. As Processing is derived from Java, it was easy to write a Salesforce connector using existing libraries leveraging the Salesforce REST API. This connector created a slide run record in a custom object each time a run was completed.

Upon creation of this record, a trigger would fire that called a custom web service class. This web service made a callout to retrieve an XML file that Triple J updated each time a song was played. The class retrieved the file, parsed it and stored the currently playing song information against the record.

To tie this all together, a custom app appropriately called ‘Straya Day was created, consisting of the object tabs and several visualforce tabs serving as leaderboards. This resulted in three main visualforce pages:

  • Last 10 Runs – Displays the last 10 runs down the slide, with the last run in a larger font at the top of the page, as well as the currently playing track on Triple J
  • Top 10 Runs – Displays two tables, the 10 fastest runs by ET and the 10 fastest runs by total time. Also displays the last run and currently playing track in large font at the top of the page.
  • Top 10 by Gender – Displays two tables, 10 fastest runs by girls sorted by total time, and 10 fastest runs by guys sorted by total time. Also displays the last run and currently playing track in large font at the top of the page.

The ‘Straya Day app was then displayed on a laptop, with the page set to refresh every 10 seconds, so that when a rider had completed their run, they could view their time and the currently playing song on the laptop.

Now for some photos.


The slip and slide itself, running down the side of the house and round into a
catchment area at the end. You can see the yellow timing beams beside the slide.


The VisualForce page and people checking their recent runs.

The big mess-o-wires (Teensy 3, etc.) collecting data from the sensors and reporting back to the Processing sketch.

If you are keen to get started on something like this yourself, I have placed some example code for the Processing sketch, as well as the JJJ fetch web service, on my github.