SchemaPuker v0.2 Released!

Try the new version right now at https://schemapuker.herokuapp.com/ 

I have been getting a lot of feedback about SchemaPuker since its launch, and many, many people have tried it out. The response has been far more than I expected, with many tweets and even a couple of blog posts about the tool:

Lucidchart + SchemaPuker: The Winning Combination for a Salesforce Consultant
Phil’s Salesforce Tip of the Week #220

I am so glad people are finding the tool useful. I've had a few feature requests and bug reports, which is why I have now released a new version with the following changes:

  • You can now select whether you want all fields displayed, or only relationship fields
  • Much better error handling!
    • Before, if something went wrong, you'd either get an ugly error page or nothing at all; now you will get some (hopefully) useful details if something goes wrong
  • Huge speed increase, up to 5.9x faster in my super scientific benchmark*
  • All relationships should now be visible; some users were reporting that the lines connecting them didn't show in Lucidchart
    • I threw my entire dev org at it and was able to see all the relationship lines automatically; if you are still experiencing this issue, please let me know!
  • Minor text fixes

I have had suggestions for more new features, which I do plan to include in future releases, so please keep them coming!

If you have any suggestions, comments, bugs or need help you can send me a tweet, leave a comment, or send me a message!

* Super scientific benchmark method: timing the old and new methods several times and working out the average difference

Why I love/hate custom metadata types: Introducing Meta Dataloader

A semi-recent feature of salesforce is Custom Metadata Types. They are like custom settings, but better in many ways.

One of these ways is very important… they are deployable! Just like any other piece of metadata (fields, objects, classes, etc.). Anyone who has ever dealt with custom settings before knows what a gigantic pain in the ass it is to keep environments in sync.

However, they have some limitations… While they can be accessed from within apex classes, unlike custom settings they cannot be modified programmatically (well, they can, but it's not that easy).

Also, unlike custom settings, there is no easy way to populate them in bulk (e.g. via workbench, dataloader, etc). Salesforce do give you an option, but it kind of sucks (it involves deploying code to your org, etc, etc).

Faced with having to load ~200 custom metadata type records, and not wanting to add an app to my org when I didn't have to, I decided to write a tool instead.

Presenting: Meta Dataloader!

https://meta-dataloader.herokuapp.com/

This is a similar tool to SchemaPuker (in fact, I reused a LOT of the code from it) that performs one specific task: it can take a CSV and create custom metadata type records from it.

Once you've logged in, you simply choose the metadata type you wish to load records into, and whether you want to upsert or delete records.


You then need to upload a CSV of the values you wish to load, with the headings matching the field API names (similar to workbench).


Click submit, and the records will be upserted or deleted.
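
Under the hood, the heavy lifting is done by the metadata API. Here's a rough sketch of the kind of call involved, using the Force.com WSC's metadata stubs (illustrative only, not the tool's exact code; assume the CSV has already been parsed into a map of field API names to values):

import com.sforce.soap.metadata.CustomMetadata;
import com.sforce.soap.metadata.CustomMetadataValue;
import com.sforce.soap.metadata.MetadataConnection;
import com.sforce.soap.metadata.UpsertResult;

import java.util.Map;

public class CustomMetadataUpsertSketch {

    // Build one record, e.g. type "Country__mdt", record name "AU"
    static CustomMetadata buildRecord(String typeApiName, String recordName,
                                      Map<String, Object> fieldValues) {
        CustomMetadata record = new CustomMetadata();
        // The full name is "<TypeName>.<RecordName>", without the __mdt suffix
        record.setFullName(typeApiName.replace("__mdt", "") + "." + recordName);
        record.setLabel(recordName);

        CustomMetadataValue[] values = new CustomMetadataValue[fieldValues.size()];
        int i = 0;
        for (Map.Entry<String, Object> entry : fieldValues.entrySet()) {
            CustomMetadataValue value = new CustomMetadataValue();
            value.setField(entry.getKey());   // field API name, from the CSV heading
            value.setValue(entry.getValue()); // the cell value from the CSV row
            values[i++] = value;
        }
        record.setValues(values);
        return record;
    }

    // CRUD-based metadata calls take up to 10 records each, so batch accordingly
    static void upsert(MetadataConnection connection, CustomMetadata[] records) throws Exception {
        for (UpsertResult result : connection.upsertMetadata(records)) {
            System.out.println(result.getFullName() + " success: " + result.isSuccess());
        }
    }
}

Deletes work similarly via the deleteMetadata call, given the type name and a list of record full names.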


The tool is pretty basic, but it solves a problem. It took me ~3 hours to put together, so it may have issues.

If you find it useful, let me know, and likewise let me know if you find any bugs.

The code for this is available on my github.

SchemaPuker: How it came to be

If you haven’t seen my post about SchemaPuker, check it out here.

The story begins last year, when a colleague of mine, David Everitt, built a handy tool for generating ERDs. It was essentially a visualforce page / controller that allowed you to choose objects; it would then output some text in the format of a PostgreSQL schema file that you could import into Lucidchart.

PostgreSQL schema files are relatively easy to generate (as they are essentially plain text) and Lucidchart was the diagramming tool of choice where we worked, so this all made sense.
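
As a rough illustration (mine, not David's actual code): generating that text is little more than string building. Each object becomes a CREATE TABLE, and each lookup becomes a column that REFERENCES its parent table, which is the sort of thing the importer uses to draw relationship lines.

public class DdlSketch {
    public static void main(String[] args) {
        // Hand-rolled example of the kind of DDL text an ERD importer understands.
        // Real type mappings are more involved; VARCHAR(18) stands in for an Id here.
        StringBuilder ddl = new StringBuilder();
        ddl.append("CREATE TABLE \"Contact\" (\n");
        ddl.append("  \"Id\" VARCHAR(18) PRIMARY KEY,\n");
        ddl.append("  \"Name\" VARCHAR(255),\n");
        // A lookup field becomes a foreign key, so the importer can draw the line
        ddl.append("  \"AccountId\" VARCHAR(18) REFERENCES \"Account\" (\"Id\")\n");
        ddl.append(");\n");
        System.out.println(ddl);
    }
}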

When I saw David's tool, I thought it was a brilliant idea. ERDs are something that are very often part of design documents, proposals, etc. Even if you are building new functionality, often you are using some, or all, of the existing data model, so having a way to get this out of salesforce easily was very helpful.

You can read more about David’s tool at his blog, SlightlyTechnical, including how to try it yourself.

However, a visualforce page / apex class has its limitations.

  • If you were doing a discovery, perhaps you don't have credentials to the organisation you need to chart, or if you do, perhaps you don't have a sandbox, or permission to install anything in one
  • If you do have credentials and a sandbox, you then need to add the visualforce page and controller in to the org
  • It would just output the results into the page itself, making it harder to import into your charting tool

So I decided I would make a new version of the tool, plus it was a good excuse to play with the salesforce metadata API, which I hadn’t had a lot of exposure to at the time.

I decided I would throw together a Java application to do this. I had written plenty of little console-based apps in the past, but had never done anything with a GUI, so this was yet another learning opportunity. I built the app using Swing and the Force.com WSC, utilising the metadata API and the SOAP API to handle authentication.

The application worked fine and had all the same functionality as its visualforce counterpart, with the added bonus that it would generate a text file, rather than display the output. After that, I got busy with life and forgot about it all.

This year, after giving my blog a bit of a refresh, I was thinking about what I could write about when I remembered the tool. I dug out the source code, looked at it, cringed, and thought about how I could make this thing better.

The obvious solution here was a cloud based app: something that required no installation or setup, and was easy to use. Given that I already had my previous iteration written in Java (and Java is the language I am most comfortable with), heroku seemed like the best fit for hosting this.

Life got in the way again, and it wasn’t till after a trip to surfforce (see my writeup here) and a discussion with Dave Carroll from salesforce that I thought about it again.

Dave was telling me about the work he had done on the force.com cli, and the plans to extend the tool. I told him about the at-the-time-named 'Salesforce ERD Tool' that I was planning to move to heroku. He suggested (quite rightly) that that was a rather boring name, came up with the idea of calling it 'SchemaPuker', and the name was born.

After surfforce I decided I would tackle this. I had never written a Java web-app, nor had I used a web framework or deployed anything to heroku before. So, with yet another great learning opportunity in front of me, I set about learning how to do this.

I chose Spring MVC as my framework, mostly due to the huge amount of documentation for it, its uncanny similarity to visualforce, and Spring Boot, which made testing the app locally *really* easy and allowed for no xml config files.
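
To give you an idea of why that appealed, here's a minimal Spring Boot sketch (not SchemaPuker's actual code; it assumes the spring-boot-starter-web dependency): one annotated class, zero xml, and an embedded server you can run locally.

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// A complete web app: run main() and hit http://localhost:8080/
@SpringBootApplication
@RestController
public class DemoApp {

    @GetMapping("/")
    public String home() {
        return "Hello from Spring Boot";
    }

    public static void main(String[] args) {
        SpringApplication.run(DemoApp.class, args);
    }
}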

I decided I was going to use the salesforce lightning design system for the UI of my application; it looks nice and there is an excellent guide available for it.

Next was taking a look at authorisation. My previous tool used the SOAP API for authorisation; however, that was not going to be suitable here. Using OAuth2 made much more sense (so much so that I made a post about it here).


Once I had authorisation sorted out, I was able to reuse most of the core of my original application, and once I had the UI tidied up, I had a minimum viable product. I do have some ideas for enhancements for the next version, such as graphical output, stored groups of objects and a better interface for choosing objects.

SchemaPuker: ERDs made easy

SchemaPuker can be accessed here: https://schemapuker.herokuapp.com/

Read on for more information about SchemaPuker!

Often, we need to produce diagrams of our organisation's data model (aka ERDs). This is especially true for those of us who are consultants.

Perhaps you are doing a discovery or analysis and need a copy of the current data model, or maybe you need a 'current state' and a 'to be' for comparison, or you are designing new functionality that connects with an existing data model, or documenting functionality after completion.

Now, salesforce does have a tool to visualise the data model, called Schema Builder; however, it cannot export the model, nor can it be customised without actually changing the data model itself.

To solve this problem, I came up with… SchemaPuker! (Thanks to Dave Carroll for the name, and to David Everitt for the idea in the first place!) For more about how it came to be, and the name, click here.

But for now, SchemaPuker is a fairly simple tool: it allows you to authorise to salesforce, get a list of your objects, and export them as a PostgreSQL schema file. This file can be imported into Lucidchart (and other tools) in order to generate an editable ERD.

The tool itself is very simple to use. First, navigate to https://schemapuker.herokuapp.com, choose whether you are using a Production/Developer Org or a Sandbox, and click 'Login'. You will then be asked to enter your salesforce credentials and to authorise SchemaPuker to access your org.


Once authorised, you will be given a list of objects inside your salesforce org. You then select the objects you wish to be in your ERD by holding down command (or ctrl on windows/linux) and clicking, or by typing the API names in the 'Selected Objects' box.


Once you click submit, you are given the PostgreSQL schema. You can either copy/paste this into Lucidchart, or click the 'Download' button below the output.


Next, log in to Lucidchart and create a new drawing, click 'More Shapes' at the bottom, then tick 'Entity Relationship' and press 'Save'.


Now, you can either import the downloaded file from SchemaPuker by pressing 'Choose File', or paste the output into the box below. You can ignore steps one and two in the import window.


You will now see your salesforce objects in the sidebar just under the 'Entity Relationship' panel. You can drag the objects on, and the relationships between the objects will be created automatically.


You can add new shapes from the 'Entity Relationship' panel to extend your ERD as required.

That's it! Please try it out and let me know how you go!

Please note: this is still very much beta, and is 'minimum viable product'. However, I am working to improve it on a regular basis and would love to hear your thoughts.
It is limited to ~30 objects per export and may crash in fun and exciting ways. The app does *not* store any data, nor does it make *any* changes to your salesforce org.

Fun with OAuth2

OAuth2 is a magical thing: it makes it *very* easy for users to log in to your application without sharing their credentials with it. The actual authorisation of the user is handed over to the service they are authenticating against (e.g. Facebook, Twitter, Salesforce) and you are given an 'access token' with which you can make requests to the service. For more on OAuth, there is a good explainer here.

At the moment, I am working on an application that I hope will be useful for some of you. This application needs to authenticate to salesforce in order to use its APIs.

The last time I did salesforce auth, I used the Login/Password/Token method via the SOAP API. This method works, but it's not ideal for a webapp: it's fairly clunky, requires my app to handle the actual credentials, and usually needs a security token. It has huge potential to be insecure and is a bad user experience.

So after much looking around, trying, failing, googling, etc. I finally found something brilliant… the Scribe library. It handles the actual OAuth bits, which allows my login code to be very, very tiny.

The next piece of the puzzle is what to do with the returned JSON; unfortunately, the Scribe library struggles to parse it. In order to access the APIs I am using the Force.com WSC, which uses a 'ConnectorConfig' object to pass authentication details when it makes calls. So I needed a way to take the JSON returned from OAuth and produce a 'ConnectorConfig' object that I can use with the WSC.

This was actually pretty straightforward: I simply deserialize the JSON to an object using the Google GSON library and construct the 'ConnectorConfig' from the result.
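
Something along these lines (a simplified sketch rather than my exact code; the field names match the keys in salesforce's OAuth2 token response, and the API version in the endpoint is just an example):

import com.google.gson.Gson;
import com.sforce.ws.ConnectorConfig;

public class OAuthToConnectorConfig {

    // Field names mirror the JSON keys salesforce returns from the token endpoint
    static class TokenResponse {
        String access_token;
        String instance_url;
    }

    static ConnectorConfig fromTokenJson(String tokenJson) {
        TokenResponse token = new Gson().fromJson(tokenJson, TokenResponse.class);

        ConnectorConfig config = new ConnectorConfig();
        // The OAuth access token stands in for a session id
        config.setSessionId(token.access_token);
        // Point the WSC at the org's instance; adjust the SOAP path/version to suit
        config.setServiceEndpoint(token.instance_url + "/services/Soap/u/37.0");
        return config;
    }
}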

Once I have a connector config, I can make API calls with the WSC and build the rest of my application. I hope that if someone is in the same boat as I was last week, this post helps them out.

Feel free to leave any comments below 🙂

Salesforce Community Events – Surfforce

If you’re in the salesforce space, no doubt you have heard of some of their events. The biggest and most well known being Dreamforce. Perhaps you’ve been to a Dreamforce, or world tour or one of the other official salesforce.com events.

But perhaps something you didn't know about is salesforce 'Community Events'. These are events that are not run by salesforce.com themselves; rather, they are organised by the community (often sponsored by salesforce.com partners, ISVs, etc). Community events are relatively new in the space, but they are picking up pace quickly; an excellent example of this was the London's Calling event here in the UK (which I unfortunately didn't make it to… next year!)

So why am I talking about community events? Well, I went to my first one recently – Surfforce.

Surfforce was billed as 'a salesforce user group with a difference', and it certainly was. Held in Aberavon, Wales, the basic idea of the event was 'let's go for a surf in the morning, then talk salesforce in the afternoon'. It was the brainchild of Shaun Holmes, whose passion for helping others, the community and salesforce is incredible.

The event was aimed at people new to the salesforce community, with several excellent speakers sharing their journeys within the salesforce world. As well as this, there was a focus on helping local charities.

I only found out about the event about a week before, so luckily I was able to organise the trip down with Scott. Given my late coming to the party, I was not able to secure a spot in the surfing portion of the day, which suited me fine; I am from Australia after all, and the water temperature in Wales was a little different to what I am used to!

When those brave enough to strap on a wetsuit were finished in the ocean, it was time for lunch, networking and chatting with sponsors.


After lunch, we were treated to some excellent talks. First was Danielle from the Wave Project; she took us through what the project was about and the amazing impact it has had on the kids in need who were able to take part. They are doing excellent work providing 'surf therapy': teaching kids how to surf and helping them with mental health issues such as anxiety and depression. Thanks to surfforce, over £500 was raised to help them in their efforts, and 15 kids got the opportunity to take part in a surf lesson at the same time as the surfforce attendees.

We next heard from Anna, a local businesswoman and entrepreneur, who spoke of her humble beginnings in Poland during the cold war, how she made the most of what she had, and how she kept challenging herself to be better and better. She has won multiple awards and is CEO of two successful companies; her talk was definitely inspiring.

We then heard from Dave and Mike from salesforce, both of them very early employees. They gave a very informative presentation on the journey salesforce as a company has been on: from having a handful of customers in 1999, none of whom had to pay for licensing for the first year (interesting side note: one of these early adopters was a previous employer of mine), to launching the AppExchange, which ran on a box under Dave's desk and was originally called the App Store (sound familiar?), to the multi-billion dollar success they are today. This was a very interesting talk, and if you get a chance to see/watch it I would highly recommend it.

After a brief break, we heard from several more excellent speakers. The first, Louise, spoke of her personal journey from someone who had no experience with salesforce (or computers really; she has a background in Literature) to becoming an Awesome Admin. She spoke of how she found she had more of an interest in the systems she was working with than in the actual work itself, and how, when she had salesforce 'forced' upon her, she decided she would learn as much as possible and make a go of it. Louise described how much of a help the salesforce community has been to her, the sheer volume of resources out there, and how inclusive and helpful people were.

Next up, Antonia took us through the journey that brought her to the position she holds today (Lead Consultant) and showed that her journey, and the salesforce community, are anything but boring. She explained that with the salesforce platform, anyone who wants to try can become a developer, thanks to the supportive community, excellent declarative tools and wealth of documentation.

Finally, Jodi spoke to us about her journey from Salesforce Admin to Consultant, with a presentation made entirely of GIFs (no death by powerpoint here!). She spoke of how she was constantly looking for new challenges: from being an Administrator, to setting up a Centre of Excellence, to finally making the move into the consulting world. Her journey in particular is one that I think a lot of people in the salesforce consultancy world will be familiar with (I know I am; I started my salesforce journey as an Admin back in 2008).

Proceedings ended with drinks and networking. Despite not being the exact target audience, I think I got quite a lot from attending Surfforce; I met a load of amazing people and got to be involved with what I think is an excellent concept.

Shaun, Kerry, the speakers, volunteers, sponsors and everyone else who worked so hard to get this event up and running deserve a huge pat on the back for what they achieved. I think the Surfforce concept would fit in perfectly back home in Australia; something like this could easily be done on both the Gold Coast and in Sydney, and given the salesforce community in Australia, could be very successful. I hope that someone seriously considers this concept, and that the next event is even bigger and more successful than the last.

I think the concept of community events is a great one; it goes to show how inclusive the salesforce community is as a whole, and how excited people are about the platform. Surfforce may have been my first community event, but it most definitely won't be my last.



Kittenforce! aka telling your users when your instance is down for maintenance

The other day, Scott (check out his blog here) and I were at work chatting about the security trailhead superbadge (specifically, My Domain). When you have a custom domain for your salesforce instance, you can customise your login page (or replace it entirely).

I then decided that kittens would make the login page far better, and hence, Kittenforce was born.

After this, I went to log in to a sandbox to do some actual work, only to be greeted with the 'Please check your username and password. If you still can't log in, contact your Salesforce administrator.' message.

I was fairly sure I hadn't forgotten my password, so I tried it again… nope, same thing.

What I had forgotten was that the daily deployment to that environment was happening, and as such all users except the DevOps team were frozen out.

Which got me thinking… if I can put kittens on the login page, then why not some useful information too?

So, that evening I built this:

The concept is fairly simple: when you put an environment into 'Maintenance' mode (e.g. during a deployment), it freezes all users, excluding a defined list (e.g. the DevOps team, system admins), and changes the login page to show a message informing the users of this.

When you are finished and disable maintenance mode, it will unfreeze all users and change the login page message back.

It uses a custom object to store a list of users who were frozen before the environment entered maintenance mode, to ensure they stay frozen once the environment is changed back to normal mode.
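
For the curious: 'freezing' a user in salesforce amounts to flipping the IsFrozen flag on their UserLogin record. The package does this with apex inside the org, but as a rough illustration of the underlying operation, here's how the same freeze could be done from outside with the Force.com WSC partner API (a sketch, not the package's actual code):

import java.util.ArrayList;
import java.util.List;

import com.sforce.soap.partner.PartnerConnection;
import com.sforce.soap.partner.QueryResult;
import com.sforce.soap.partner.sobject.SObject;

public class FreezeSketch {

    // Freeze every unfrozen login except the exempt users.
    // exemptIdList is a pre-quoted, comma-separated list of user ids,
    // e.g. "'00558000000abcd','00558000000wxyz'"
    static void freezeAllExcept(PartnerConnection connection, String exemptIdList) throws Exception {
        // (paging via queryMore elided for brevity)
        QueryResult result = connection.query(
            "SELECT Id FROM UserLogin WHERE IsFrozen = false AND UserId NOT IN (" + exemptIdList + ")");

        List<SObject> updates = new ArrayList<>();
        for (SObject login : result.getRecords()) {
            SObject update = new SObject();
            update.setType("UserLogin");
            update.setId(login.getId());
            update.setField("IsFrozen", true); // the same flag the 'Freeze' button sets
            updates.add(update);
        }
        // The partner API takes up to 200 records per update call
        connection.update(updates.toArray(new SObject[0]));
    }
}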

The actual page itself is hosted from a force.com site, and is configured via a custom setting and custom metadata, which includes allowing it to be overridden by other pages.

If you would like to try this in your org, click here for the unmanaged package.

For installation instructions, see this post.

I would love to hear any feedback you have, feel free to comment below.

Maintenance Mode Package Installation Guide

For more information about this, see this post.

Prerequisites

Instructions

Install the package in the org you wish to try it in (this can be a sandbox).
Ensure the package is installed for admins only.


Once installed, go to the ‘Maintenance Config’ tab by clicking the ‘+’ button on the tab bar and click the ‘Perform Initial Setup’ button. You only need to click this once.



Once this has completed successfully, you will see a confirmation message.


Next, go to Setup > Develop > Sites

If you have a force.com site already, you can skip the next two steps


Enter a domain name you wish to use (remember, you cannot modify this once it's been set) and press 'Check Availability'. Once you have chosen a domain that is available, accept the terms and conditions and press 'Register My Force.com Domain'.


Now, click ‘New’ and configure the site as follows;


Site Label: Maintenance Site

Active Site Home Page: SFDCMaintLoginSidebar

Default Web Address: maint

Clickjack Protection Level: allow framing by any page


Accept the defaults for the rest of the fields

If you have an existing force.com site, under ‘Site Visualforce Pages’ click ‘Edit’


and then add the ‘SFDCMaintLoginSidebar’ page to the ‘Enabled Visualforce Pages’ section


Ensure you have activated your force.com site when you are finished.

Once you have done this, go to Setup > Administer > Domain Management > My Domain

Under the ‘Authentication Configuration’ Heading, click ‘Edit’


Populate the ‘Right Frame URL’ with the URL of your force.com site and press save.


Once you have completed these steps, log out of salesforce to check your changes have worked.

Make sure you login using your domain (not login.salesforce.com)


Once logged in, go to Setup > Develop > Custom Metadata Types

Select ‘Manage Records’ next to ‘Salesforce Maintenance Exempt User’


Create a new record, and populate the details of the user you want to exclude from Maintenance freezes (e.g DevOps, System Admin)

Once complete, go to the 'Maintenance Configuration' tab and press 'Enable Maintenance Mode'


Any users who are not exempt will be frozen, and you will see a different login page



DIY Dynamic DNS with DreamHost


I recently got a new internet connection from Hyperoptic… which is night and day compared to the rubbish Sky ADSL2+ I had before (6/1mbit vs 1000/1000mbit). As such, it is now practical to run servers on my home connection (for personal use only, if you're reading this hyperoptic 😉 ).

When I was living in Australia and had a decent internet connection, I typically used DynDNS as it was free (it no longer is) and was supported by my router.

Now that DynDNS isn't free, and my new router doesn't support it, I had to find another solution.

I use DreamHost for my web hosting (i.e this site), so I wondered if there was a way to leverage their DNS dynamically.

As it turns out, DreamHost have an API with support for creating and deleting DNS records, and there is some documentation on this on their wiki.

From this, I decided to build my own Dynamic DNS daemon in Java. The daemon will run on my Raspberry Pi 3.

The daemon itself is pretty simple: at a set interval it checks my external IP (using this amazon service) and then checks the DNS record for the domain I specify. If they are the same, nothing is done. If the IP is different, the old record is removed and a new one added. If there is no record there, a new one is added.

It uses a simple JSON file for storing preferences. This version doesn’t have much in the way of logging or error handling… But it was written in a few hours and is very much a work in progress.
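
In sketch form, one cycle looks something like this. It's simplified: the real daemon checks the existing record via the dns-list_records call, whereas this sketch just remembers the last value it set; the DreamHost command names are from their API wiki, checkip.amazonaws.com is assumed as the amazon IP service, and JSON parsing and error handling are elided.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLEncoder;

public class DnsUpdateSketch {

    static final String API = "https://api.dreamhost.com/";

    // Read the first line of an HTTP GET response
    static String fetch(String url) throws Exception {
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(new URL(url).openStream()))) {
            return in.readLine().trim();
        }
    }

    static void dreamhost(String key, String cmd, String params) throws Exception {
        fetch(API + "?key=" + key + "&cmd=" + cmd + params);
    }

    // One update cycle: if the external IP has changed, swap the A record
    static String cycle(String apiKey, String domain, String lastIp) throws Exception {
        String currentIp = fetch("https://checkip.amazonaws.com");
        if (currentIp.equals(lastIp)) {
            return lastIp; // nothing to do this interval
        }
        String record = "&record=" + URLEncoder.encode(domain, "UTF-8") + "&type=A";
        if (lastIp != null) {
            dreamhost(apiKey, "dns-remove_record", record + "&value=" + lastIp);
        }
        dreamhost(apiKey, "dns-add_record", record + "&value=" + currentIp);
        return currentIp;
    }
}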

If you want to take a look at the source/fork it, it is available on my github here.

If you just want to run it, you can download it from here and then use: java -jar dh-dyn-dns.jar to start it.

Adding and changing a record, as you can see, is pretty basic.

If you want to run it as a service on *nix, download this and follow the instructions below (this service *might* also work on windows, however I haven't tested it);

  1. You will need a DreamHost API key, which you can get here; when setting it up, only choose the 'All DNS functions' option
  2. Then you will need to install Apache jsvc; if you are using ubuntu/raspbian/debian/etc run the following command;
    sudo apt-get install jsvc
  3. Extract the zip to your home directory
  4. Edit the 'preferences.json' file to match your configuration
  5. If you are using a Raspberry Pi, you should be good to go. You can start the service by running the below command;
    ./dh-dyn-dns start
  6. If you are using another configuration, you will need to edit the dh-dyn-dns file and set the correct $JAVA_HOME path.

You can then add the service to init.d (for raspbian/ubuntu/etc) if you want it to start automatically on boot.

Feel free to leave a comment below if you have any questions/comments/ideas 🙂


SOQL to CSV Utility

Recently, I needed to come up with a way for some simple case metrics to be available to users; these metrics need to contain cases that the user may not necessarily have visibility to.

Now, there are a couple of options here;

  • Dashboard
  • Scheduled report
  • Something custom

Now, the first two options are the ones I would normally use: both dashboards and reports can be scheduled to run as a user that has access to all cases and be emailed to those who need to see them, dashboards having the added advantage that they can be viewed any time.

But what if you need the data in excel? Well, you could copy/paste from the scheduled email, or get someone to export it for you. But neither is a great solution.

So you could build something custom, which is what I have done here.


Now, this tool on its own isn't much use to your average end user, as a working knowledge of SOQL is required to use it. However, it can very easily be extended: store frequently used 'exports' in a custom setting, use custom permissions to grant access to the tool, attach the resulting CSV to a record, or send the CSV via email at a set interval using schedulable apex.

Here is how it works;

The SOQL query entered in the text box is run against the database, returning a list of SObjects. Once we have this list, we need to construct a CSV.

The first line of the CSV is the column headings, taken from the text box on the page (or, alternatively, the API names of the fields entered as part of the query string). For the sake of simplicity, I have simply extracted these fields from the SOQL query (the fields in the list are always in the same order as in the query); however, you could use SObjectDescribe to do fancier things here.

The code then loops through the list of SObjects and, within each, the list of field names; each pass through the outer loop writes one row of the CSV and then adds a new line at the end.

// Walk each record: write one quoted value per field, then end the row
// (each row ends with a trailing comma before the CRLF, which Excel tolerates)
for (SObject s : sobjects) {
    for (String fn : fieldNames) {
        csvReport += '"' + s.get(fn) + '"' + ',';
    }
    csvReport += '\r\n';
}

We then return the CSV as a string to the Visualforce controller and display it on a page. In order to make the page 'render as' a CSV, rather than HTML or text, we need to set the contentType attribute of <apex:page>; this also allows us to set the filename of the downloaded CSV.

<apex:page controller="SOQLCSVController"
    contentType="application/csv#{!fileName}.csv" cache="true" showHeader="false">
    <apex:outputText value="{!stringCSV}" />
</apex:page>

That really is it: on pressing the Download button, the user is sent to the Download Visualforce page, which renders as a CSV, and the browser downloads it.

If you wanted to send the generated CSV via email, or attach it to a record, you could simply convert the string with Blob.valueOf() and attach the resulting blob to an email or record.

The complete code for this example is sitting on my github; feel free to use it and extend it as you wish.