Organising Surf Force 2017: The best salesforce adventure you will have this year!

As you may (or may not) be aware, I am part of the team organising Surf Force.

What is Surf Force you ask?

Well, it’s a salesforce community event, but not like any other you may have been to. Through surfing, we encourage you to take a chance and step out of your comfort zone.

Surfing is something that not a lot of people have done, and that some might find scary or challenging… But when you have people around you who are there to guide you and help, you will realise it wasn’t so hard after all.

This is a lesson that we can apply to the salesforce community, and the community at large. We can all step out of our comfort zones, learn something new, do something great, and help others. Surf Force is here to prove this to you, teach you new things and empower you to do this.

I helped with Surf Force in 2016 (which was held in Aberavon, Wales) and loved the concept and what the founder, Shaun Holmes, was trying to achieve. Shaun’s enthusiasm for the event and for helping others was inspiring, and I knew that in 2017 I had to be part of it and help make it bigger and better!

Organising Surf Force 2017

Organising an event takes a lot of hard work, even more so when everyone has day jobs and their own lives to live. All of the team work full time and have varying family and other commitments, and to make things even more challenging, we are holding the event in a different country!

Despite the challenges, the team of Shaun Holmes, Kerry Townsend, Scott Gassmann, Jenny Bamber, Lauren Touyet and myself have made amazing progress on Surf Force, and we had our first trip to Bundoran, Ireland to scope out the venue for this year’s event, talk to local contacts and charities and, of course, go for a surf!

If you’ve never been to Bundoran (or to Ireland in general) then you are missing out; it is an absolutely gorgeous place and the people there are incredibly friendly.

The venue we have chosen for Surf Force 2017 is the Great Northern Hotel, which is right on the beach and has some excellent facilities for the event, as well as for leisure (pool, spa, sauna, golf course, etc).

We also met up with the amazing people at the Donegal Adventure Centre, who will be providing the surfing lessons and all of the kit required. The organisers and instructors there are amazing and really make sure that you are having a good time, learning and staying safe.

I am very excited to be a part of this event and to work with the amazing group of people who are organising it and I hope that you all will come along. I also wish to thank our sponsors, who help to make events like this possible. So please check out Taskfeed and Good Day Sir!

To find out more about Surf Force, visit the website here, or follow us on twitter, instagram or facebook!

Generating multiple documents programmatically in Salesforce

A colleague recently came to me with a ‘problem’ that he was scratching his head about.

His requirement was to generate multiple documents (PDFs in this case) from data stored in varying objects in salesforce, which he needed zipped and attached to a record, or otherwise available for download.

My initial answer to him was simple: just install Conga and be done with it. Unfortunately, as this particular organisation is unable to use anything hosted on AWS (I know…), Conga was out.

So after thinking a little more, I remembered that, thanks to the PageReference class, you can ‘access’ visualforce pages programmatically (amongst other things) and store the resulting output in a Blob.

For example, let’s say you have a simple visualforce page that displays some information from an account record.

<apex:page standardController="Account" standardStylesheets="false" showHeader="false" sidebar="false" renderAs="PDF">
    <html>
        <head>
        </head>
        <body>
 
            <h1>Account Summary for {! Account.Name }</h1>
 
            <table>
                <tr><th>Phone</th>  <td><apex:outputText value="{! Account.Phone }"/></td></tr>
                <tr><th>Fax</th>    <td><apex:outputText value="{! Account.Fax }"/></td></tr>
                <tr><th>Website</th><td><apex:outputText value="{! Account.Website }"/></td></tr>
            </table>
 
            <p><apex:outputText value="{! Account.Description }"/></p>
        </body>
    </html>
 
</apex:page>

In this example, we will generate some of these ‘Account Summary’ PDFs for a given list of accounts. It’s very simple really:

//some accounts for this example
List<Account> acts = [SELECT Id, Phone, Fax, Website, Description, Name FROM Account LIMIT 10];
//the resulting list of blobs containing the generated pdfs
List<Blob> generatedPdfs = new List<Blob>();

for (Account a : acts) {
    //PageReference for the visualforce page we wish to use
    PageReference pdf = Page.Account;
    //provide it with the required parameters
    pdf.getParameters().put('Id', a.Id);
    //access it and store it as a blob
    Blob b = pdf.getContent();
    generatedPdfs.add(b);
}

Now we have a blob of each page. Bear in mind that they don’t all have to be the same page; I am simply using a loop to generate multiple PDFs without having to write a bunch of visualforce pages for this example.

With those blobs, we can do a few things. We could post them to chatter, attach them to a record, or post them to a content library.

We could also zip them all up, using a very cool library I found called ‘Zippex‘, then post the resulting zip to chatter, content, attachments, etc.
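As a rough sketch of what that might look like (the Zippex method names below are taken from the library’s README at the time of writing, and `someRecordId` is a placeholder for whatever record you want the zip attached to):

```apex
// Sketch only: assumes the Zippex class from the open-source Zippex library
// is deployed to the org, and that 'generatedPdfs' is the List<Blob> from above.
Zippex zip = new Zippex();
for (Integer i = 0; i < generatedPdfs.size(); i++) {
    // Each entry needs a unique path inside the archive
    zip.addFile('summary-' + i + '.pdf', generatedPdfs[i], null);
}
Blob zipped = zip.getZipArchive();

// Attach the zip to a record ('someRecordId' is a placeholder)
Attachment att = new Attachment(
    ParentId = someRecordId,
    Name = 'summaries.zip',
    Body = zipped,
    ContentType = 'application/zip'
);
insert att;
```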

This isn’t just for PDFs. Using the contentType attribute of visualforce, you could output a bunch of CSVs, or other document types (see here for some more on this) and zip/attach them as well.

Some things to bear in mind;

  • If your visualforce pages perform SOQL, looping through them may cause you to hit query limits
  • Generating lots of documents will cause you to hit the heap size limit
  • Zipping lots of documents may cause you to hit the CPU limit
  • There is a reason that apps like Conga handle this off platform.

    However, if you’ve got some existing visualforce pages, can accept these limitations and need a way to generate and attach documents without a tool like Conga, this is an option for you.

    Here is a link to some more example code on my github

    SchemaPuker v0.2 Released!

    Try the new version right now at https://schemapuker.herokuapp.com/ 

    I have been getting a lot of feedback about SchemaPuker since its launch, and many, many people have tried it out.
    The response has been far more than I expected, with many tweets and even a couple of blog posts about the tool;

    Lucidchart + SchemaPuker: The Winning Combination for a Salesforce Consultant
    Phil’s Salesforce Tip of the Week #220

    I am so glad people are finding the tool useful, I’ve had a few feature requests and bug reports, which is why I have now released a new version, with the following changes;

    • You can now select if you want all fields displayed, or only relationship fields
    • Much better error handling!
      • Before, if something went wrong, you’d either get an ugly error page or nothing at all; now you will get some (hopefully) useful details if something goes wrong
    • Huge speed increase, up to 5.9x faster in my super scientific benchmark*
    • All relationships should now be visible; some users were reporting that the lines connecting them didn’t show in lucidchart
      • I threw my entire dev org at it and was able to see all the relationship lines automatically; if you are still experiencing this issue please let me know!
    • Minor text fixes

    I have had suggestions for more new features, which I do plan to include in future releases, so please keep them coming!

    If you have any suggestions, comments, bugs or need help you can send me a tweet, leave a comment, or send me a message!

    * Super scientific benchmark method: timing the old and new methods several times and working out the average difference

    SchemaPuker: ERDs made easy

    SchemaPuker can be accessed here: https://schemapuker.herokuapp.com/

    Read on for more information about SchemaPuker!

    Often, we need to produce diagrams of our organisation’s data model (aka. ERDs). This will be especially true for those of us who are consultants.

    Perhaps you are doing a discovery or analysis and need a copy of the current data model, or maybe you need a ‘current state’ and a ‘to be’ for comparison, or you are designing new functionality that connects with an existing data model, or documenting functionality after completion.

    Now, salesforce does have a tool to visualise the data model, called Schema Builder; however, it cannot export the model, nor can it be customised without actually changing the data model itself.

    To solve this problem, I came up with… SchemaPuker! (Thanks to David Carroll for the name, and to David Everitt for the idea in the first place!) For more about how it came to be, and the name, click here.

    But for now, SchemaPuker is a fairly simple tool. It allows you to authorise to salesforce, get a list of your objects and export them as a PostgreSQL schema file. This file can be imported into Lucidchart (and other tools) in order to generate an editable ERD.

    The tool itself is very simple to use. First, navigate to https://schemapuker.herokuapp.com, choose whether you are using a Production/Developer Org or a Sandbox and click ‘Login’. You will then be asked to enter your salesforce credentials and to authorise SchemaPuker to access your org.


    Once authorised, you will be given a list of objects inside your salesforce org. You then select the objects you wish to be in your ERD by holding down command (or ctrl on windows/linux) and clicking, or by typing the API names in the ‘Selected Objects’ box.


    Once you click submit, you are given the PostgreSQL schema. You can either copy/paste this into Lucidchart, or click the ‘Download’ button below the output.


    Next, log in to Lucidchart and create a new drawing, click ‘More Shapes’ at the bottom, then tick ‘Entity Relationship’ and press ‘Save’.


    Now, you can either import the downloaded file from SchemaPuker by pressing ‘Choose File’, or paste the output into the box below. You can ignore steps one and two in the import window.


    You will now see your salesforce objects in the sidebar just under the ‘Entity Relationship’ panel. You can drag the objects on and the relationships between the objects will be automatically created.


    You can add new shapes from the ‘Entity Relationship’ panel to extend your ERD as required.

    That’s it! Please try it out and let me know how you go!

    Please Note: This is still very much beta, and is ‘minimum viable product’. However I am working to improve it on a regular basis, and would love to hear your thoughts.
    It is limited to ~30 objects per export and may crash in fun and exciting ways. The app does *not* store any data, nor does it make *any* changes to your salesforce org.

    Fun with OAuth2

    OAuth2 is a magical thing; it makes it *very* easy for users to log in to your application without sharing their credentials with it. The actual authorisation of the user is handed over to the service they are authenticating against (e.g. Facebook, Twitter, Salesforce) and you are given an ‘access token’ with which you can make requests to the service. For more on OAuth, there is a good explainer here.

    At the moment, I am working on an application that I hope will be useful for some of you. This application needs to authenticate to salesforce in order to use its APIs.

    The last time I did salesforce auth, I used the Login/Password/Token method via the SOAP API. This method works, but it’s not ideal for a webapp: it’s fairly clunky, requires my app to handle the actual credentials and usually needs a security token. It has huge potential to be insecure and is a bad user experience.

    So after much looking around, trying, failing, googling, etc. I finally found something brilliant… the Scribe library. It handles the actual OAuth bits, which allows my login code to be very, very tiny.

    The next piece of the puzzle is what to do with the returned JSON; unfortunately, the Scribe library struggles to parse it. In order to access the APIs I am using the Force.com WSC, which uses a ‘ConnectorConfig’ object to pass authentication details when it makes calls. So I needed a way to take the JSON returned from OAuth and build a ‘ConnectorConfig’ object that I can use with the WSC.

    This was actually pretty straightforward: I simply deserialize the JSON to an object using the Google GSON library and construct the ‘ConnectorConfig’ from the result.
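A rough sketch of that step might look like this. The class and field names here are illustrative (the post doesn’t show the actual code), but the GSON and WSC calls are the real APIs, and the two JSON fields shown are ones Salesforce returns in its OAuth token response:

```java
import com.google.gson.Gson;
import com.google.gson.annotations.SerializedName;
import com.sforce.ws.ConnectorConfig;

public class TokenToConfig {
    // Maps the relevant fields of the Salesforce OAuth token response
    static class TokenResponse {
        @SerializedName("access_token") String accessToken;
        @SerializedName("instance_url") String instanceUrl;
    }

    public static ConnectorConfig toConfig(String tokenJson, String apiVersion) {
        TokenResponse token = new Gson().fromJson(tokenJson, TokenResponse.class);

        ConnectorConfig config = new ConnectorConfig();
        // The OAuth access token plays the role of a session id for the WSC
        config.setSessionId(token.accessToken);
        config.setServiceEndpoint(token.instanceUrl + "/services/Soap/u/" + apiVersion);
        return config;
    }
}
```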

    Once I have a connector config, I can make API calls with the WSC and build the rest of my application. I hope that if someone is in the same boat as I was last week that this post helps them out.

    Feel free to leave any comments below 🙂

    Salesforce Community Events – Surfforce

    If you’re in the salesforce space, no doubt you have heard of some of their events. The biggest and most well known being Dreamforce. Perhaps you’ve been to a Dreamforce, or world tour or one of the other official salesforce.com events.

    But perhaps something you didn’t know about were salesforce ‘Community Events’. These are events that are not run by salesforce.com themselves; rather, they are organised by the community (often sponsored by salesforce.com partners, ISVs, etc). Community Events are relatively new in the space, but they are picking up pace quickly; an excellent example of this was the London’s Calling event here in the UK (which I unfortunately didn’t make it to… next year!)

    So why am I talking about community events? Well, I went to my first one recently – Surfforce.

    Surfforce was billed as ‘a salesforce user group with a difference’, and it certainly was. Held in Aberavon, Wales, the basic idea of the event was ‘let’s go for a surf in the morning, then talk salesforce in the afternoon’. It was the brainchild of Shaun Holmes, whose passion for helping others, the community and salesforce is incredible.

    The event was aimed at people new to the salesforce community, with several excellent speakers sharing their journeys within the salesforce world. As well as this, there was a focus on helping local charities.

    I only found out about the event about a week before, so luckily I was able to organise the trip down with Scott. Given my late arrival to the party, I was not able to secure a spot in the surfing portion of the day, which suited me fine; I am from Australia after all, and the water temperature in Wales was a little different to what I am used to!

    When those brave enough to strap on a wetsuit were finished in the ocean, it was time for lunch, networking and chatting with sponsors.


    After lunch, we were treated to some excellent talks. First was Danielle from The Wave Project; she took us through what the project was about, and the amazing impact it has had on the kids in need who were able to take part. They are doing excellent work, providing ‘surf therapy’, teaching kids how to surf and helping them with mental health issues such as anxiety and depression. Thanks to surfforce, over £500 was raised to help them in their efforts, and 15 kids had the opportunity to take part in a surf lesson at the same time as the surfforce attendees.

    We next heard from Anna, a local businesswoman and entrepreneur, who spoke of her humble beginnings in Poland during the cold war, how she was able to make the most of what she had, and how she kept challenging herself to be better and better. She has won multiple awards and is CEO of two successful companies; her talk was definitely inspiring.

    We then heard from Dave and Mike from salesforce, both very early employees. They gave a very informative presentation on the journey salesforce as a company has been on: from having a handful of customers in 1999, none of whom had to pay for licensing for the first year (interesting side note: one of these early adopters was a previous employer of mine), to launching the AppExchange, which ran on a box under Dave’s desk and was originally called the App Store (sound familiar?), to the multi-billion dollar success they are today. This was a very interesting talk, and if you get a chance to see/watch it I would highly recommend it.

    After a brief break for lunch, we heard from several more excellent speakers. The first of whom, Louise, spoke of her personal journey from someone who had no experience with salesforce (or computers really; she has a background in Literature) to becoming an Awesome Admin. She spoke of how she found that she had more of an interest in the systems she was working with than the actual work itself, and how, when she had salesforce ‘forced’ upon her, she decided that she would learn as much as possible and make a go of it. Louise described how much of a help the salesforce community has been to her, the sheer volume of resources out there, and how inclusive and helpful people were.

    Next up, Antonia took us through the journey that brought her to the position she is in today (Lead Consultant), and how her journey and the salesforce community are anything but boring. She explained that, thanks to the supportive community, excellent declarative tools and wealth of documentation, anyone who wants to try can become a developer on the salesforce platform.

    Finally, Jodi spoke to us about her journey from Salesforce Admin to Consultant, with a presentation made entirely of GIFs (no death by PowerPoint here!). She spoke of how she was constantly looking for new challenges: from being an Administrator, to setting up a Centre of Excellence, to finally making the move into the consulting world. Her journey in particular is one that I think a lot of people in the salesforce consultancy world will be familiar with (I know I am; I started my salesforce journey as an Admin back in 2008).

    Proceedings ended with drinks and networking. Despite not being the exact target audience, I think I got quite a lot from attending Surfforce; I met a load of amazing people and got to be involved with what I think is an excellent concept.

    Shaun, Kerry, the speakers, volunteers, sponsors and everyone else who worked so hard to get this event up and running deserve a huge pat on the back for what they achieved with this event. I think the Surfforce concept would fit in perfectly back home in Australia. Something like this could be easily done in both the Gold Coast and Sydney, and given the salesforce community in Australia, could be very successful. I hope that someone seriously considers this concept, and that the next event is even bigger and more successful than the last.

    I think the concept of community events is a great one, it goes to show how inclusive the salesforce community is as a whole and how excited people are about the platform. Surfforce may have been my first community event, but it most definitely won’t be my last.

     

     

    Kittenforce! aka. telling your users when your instance is down for maintenance

    The other day, Scott (check out his blog here) and I were at work chatting about the security trailhead superbadge (specifically, My Domain). When you have a custom domain for your salesforce instance, you can customise your login page (or replace it entirely).

    I then decided that I would make the login page far better, and hence;

    After this, I went to login to a sandbox to do some actual work, only to be greeted with the ‘Please check your username and password. If you still can’t log in, contact your Salesforce administrator.’ message.

    I was fairly sure I hadn’t forgotten my password, so I tried it again… nope. same thing.

    What I had forgotten, was the fact that the daily deployment to that environment was happening, and as such all users except for the DevOps team were frozen out.

    Which got me thinking… If I can put kittens on the login page, then why not some useful information too.

    So, that evening I built this;

    The concept is fairly simple: when you put an environment into ‘Maintenance’ mode (e.g. during a deployment) it freezes all users, excluding a defined list (e.g. the DevOps team, system admins), and changes the login page to show a message informing the users of this.

    When you are finished and disable maintenance mode, it will unfreeze all users and change the login page message back.

    It uses a custom object to store a list of users who were frozen before the environment entered maintenance mode to ensure they stay frozen once the environment is changed back to normal mode.
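The freeze step itself can be done in Apex via the UserLogin object, whose IsFrozen field is the same flag the ‘Freeze’ button in Setup toggles. A minimal sketch (assuming an `excludedUserIds` set holding the DevOps/admin user Ids, and noting that UserLogin queries must be filtered by UserId or Id):

```apex
// Ids of users who must keep access during maintenance (placeholder contents)
Set<Id> excludedUserIds = new Set<Id>{ /* DevOps team user Ids */ };

// Everyone else who is active gets frozen
Set<Id> toFreeze = new Map<Id, User>([
    SELECT Id FROM User WHERE IsActive = true AND Id NOT IN :excludedUserIds
]).keySet();

List<UserLogin> logins = [
    SELECT Id, IsFrozen FROM UserLogin
    WHERE UserId IN :toFreeze AND IsFrozen = false
];
for (UserLogin l : logins) {
    l.IsFrozen = true;
}
update logins;
```

Unfreezing on exit from maintenance mode is the same update in reverse, skipping the users recorded in the custom object as already frozen beforehand.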

    The page itself is hosted on a force.com site and is configured via a custom setting and custom metadata, which includes allowing the pages to be overridden by other pages.

    If you would like to try this in your org, click here for the unmanaged package

    For installation instructions, see this post.

    I would love to hear any feedback you have, feel free to comment below.

    SOQL to CSV Utility

    Recently, I needed to come up with a way for some simple case metrics to be available to users; these metrics needed to contain cases that the user may not necessarily have visibility to.

    Now, there are a couple of options here;

    • Dashboard
    • Scheduled report
    • Something custom

    Now, the first two options are the ones I would normally use; both dashboards and reports can be scheduled to run as a user that has access to all cases, and emailed to those who need to see them, dashboards having the added advantage that they can be viewed any time.

    But what if you need the data in excel? Well, you could copy/paste from the scheduled email, or get someone to export it for you. But neither are a great solution.

    So you could build something custom, which is what I have done here;


    Now, this tool on its own isn’t much use to your average end user, as a working knowledge of SOQL is required to use it. However, it can very easily be extended to store frequently used ‘exports’ in a custom setting, use custom permissions to grant access to the tool, attach the resulting CSV to a record, or send the CSV via email at a set interval using schedulable apex.
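The scheduled-email extension, for example, could look something like this sketch. `SoqlCsvBuilder.buildCsv` is a hypothetical wrapper around the CSV-building logic described below; the Messaging and Schedulable classes are standard Apex:

```apex
// Sketch: emails a CSV export on a schedule. 'SoqlCsvBuilder.buildCsv' is a
// hypothetical method wrapping the SOQL-to-CSV logic from this post.
global class ScheduledCsvExport implements Schedulable {
    global void execute(SchedulableContext ctx) {
        String csv = SoqlCsvBuilder.buildCsv('SELECT Id, CaseNumber, Status FROM Case');

        Messaging.EmailFileAttachment att = new Messaging.EmailFileAttachment();
        att.setFileName('cases.csv');
        att.setBody(Blob.valueOf(csv));

        Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
        mail.setToAddresses(new String[] { 'team@example.com' });
        mail.setSubject('Weekly case export');
        mail.setPlainTextBody('Case export attached.');
        mail.setFileAttachments(new Messaging.EmailFileAttachment[] { att });
        Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
    }
}
// Schedule it weekly, e.g. Mondays at 6am:
// System.schedule('Weekly case export', '0 0 6 ? * MON', new ScheduledCsvExport());
```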

    Here is how it works;

    The SOQL query entered in the text box is run against the database, returning a list of SObjects. Once we have this list we need to construct a CSV.

    The first line of the CSV is the column headings, taken from the text box on the page (or alternatively, the API names of the fields entered as part of the query string). For the sake of simplicity, I have simply extracted these fields from the SOQL query (the fields in the list are always in the same order as in the query); however, you could use SObjectDescribe to do fancier things here.

    The code then loops through the list of SObjects, and subsequently the list of field names, each loop writing a line of the CSV and then adding a new line at the end.

    for (Sobject s : sobjects) {
        for (String fn : fieldNames) {
            csvReport += '"' + s.get(fn) + '"' + ',';
        }
        csvReport += '\r\n';
    }

    We then return the CSV as a string to the VisualForce controller, and display it on a page. In order to make the page ‘render as’ a CSV, rather than HTML or text, we need to set the contentType attribute of <apex:page>; this also allows us to set the filename of the downloaded CSV.

    <apex:page controller="SOQLCSVController"
        contentType="application/csv#{!fileName}.csv" cache="true" showHeader="false">
        <apex:outputText value="{!stringCSV}" />
    </apex:page>

    That really is it: on pressing the Download button, the user is sent to the Download VisualForce page, which renders as a CSV, and the browser downloads it.

    If you wanted to send the generated CSV via email or attach it to a record, you could simply convert the string using the Blob class and attach the resulting blob to an email or record.

    The complete code for this example is sitting on my github, feel free to use it and extend it as you wish.