SchemaPuker v0.3 Released!

After far too long between releases, I’ve finally had a chance to release a new version of SchemaPuker.

This release contains the following changes:

  • Objects are now shown with both their label and API name in the select list; entering objects in the text box is still done by API name only, as before.
  • An issue with some custom object relationships not being drawn has now been resolved
  • Error handling has been further enhanced

It is live now, so please give it a go!

As always, if you have any suggestions, comments, bugs or need help you can send me a tweet, leave a comment, or send me a message!


Organising Surf Force 2017: The best salesforce adventure you will have this year!

As you may (or may not) be aware, I am part of the team organising Surf Force.

What is Surf Force you ask?

Well it’s a salesforce community event, but not like any other that you may have been to. Through surfing, we encourage you to take a chance and to step out of your comfort zone.

Surfing is something that not a lot of people have done, and that people might find scary or challenging… But when you have people around you who are there to guide you and help, you will realise it wasn’t so hard after all.

This is a lesson that we can apply to the salesforce community, and the community at large. We can all step out of our comfort zones, learn something new, do something great, and help others. Surf Force is here to prove this to you, teach you new things and empower you to do this.

I helped with Surf Force in 2016 (which was held in Aberavon, Wales) and loved the concept and what the founder, Shaun Holmes, was trying to achieve. Shaun’s enthusiasm for the event and for helping others was inspiring, and I knew that in 2017 I had to be part of it and help to make it bigger and better!

Organising Surf Force 2017

Organising an event takes a lot of hard work, even more so when everyone has day jobs and their own lives to live. All of the team work full time and have varying family and other commitments, and to make things even more challenging, we are holding the event in a different country!

Despite the challenges, the team of Shaun Holmes, Kerry Townsend, Scott Gassmann, Jenny Bamber, Lauren Touyet and myself have made amazing progress on Surf Force, and we had our first trip to Bundoran, Ireland to scope out the venue for this year’s event, talk to local contacts and charities and, of course, go for a surf!

If you’ve never been to Bundoran (or to Ireland in general) then you are missing out; it is an absolutely gorgeous place and the people there are incredibly friendly.

The venue we have chosen for Surf Force 2017 is the Great Northern Hotel, which is right on the beach and has some excellent facilities for the event, as well as for leisure (pool, spa, sauna, golf course, etc).

We also met up with the amazing people at the Donegal Adventure Centre, who will be providing the surfing lessons and all of the kit required. The organisers and instructors there are amazing and really make sure that you are having a good time, learning and staying safe.

I am very excited to be a part of this event and to work with the amazing group of people who are organising it, and I hope that you all will come along. I also wish to thank our sponsors, who help to make events like this possible, so please check out Taskfeed and Good Day Sir!

To find out more about Surf Force, visit the website, or follow us on twitter, instagram or facebook!


Generating multiple documents programmatically in Salesforce

A colleague recently came to me with a ‘problem’ that he was scratching his head about.

His requirement was to generate multiple documents (PDFs in this case) from data stored in various objects in salesforce, which then needed to be zipped and attached to an object, or otherwise made available for download.

My initial answer to him was simple: just install Conga and be done with it. Unfortunately, as this particular organisation is unable to use anything hosted on AWS (I know…), Conga was out.

So after thinking a little bit more, I remembered that, thanks to the PageReference class, you can ‘access’ visualforce pages programmatically (amongst other things) and store the resulting output in a Blob.

For example, let’s say you have a simple visualforce page, rendered as a PDF, that displays some information from an account record:

<apex:page standardController="Account" renderAs="pdf">
    <h1>Account Summary for {! Account.Name }</h1>
</apex:page>

In this example, we will generate some of these ‘Account Summary’ PDFs for a given list of accounts. It’s very simple really:

//some accounts for this example
List<Account> accts = [SELECT Id, Phone, Fax, Website, Description, Name FROM Account LIMIT 10];
//the resulting list of blobs containing the generated pdfs
List<Blob> generatedPdfs = new List<Blob>();
for (Account a : accts) {
    //PageReference for the visualforce page we wish to use
    PageReference pdf = Page.Account;
    //provide it with the required parameters
    pdf.getParameters().put('Id', a.Id);
    //access it and store it as a blob
    Blob b = pdf.getContent();
    generatedPdfs.add(b);
}

Now we have a blob for each page. Bear in mind that they don’t all have to be the same page; I am simply using a loop to generate multiple PDFs without having to write a bunch of visualforce pages for this example.

With those blobs, we can do a few things. We could post them to chatter, attach them to a record, or post them to a content library.
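For instance, attaching one of these blobs to its record as a classic Attachment might look something like this (a sketch only; the file name is made up, and ContentVersion would work similarly):

```apex
// Sketch: save a generated PDF back onto the account as an Attachment
// (assumes 'a' is the Account and 'b' is the Blob from the loop above)
Attachment att = new Attachment(
    ParentId = a.Id,
    Name = 'Account Summary - ' + a.Name + '.pdf',
    Body = b,
    ContentType = 'application/pdf'
);
insert att;
```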

We could also, using a very cool library I found called ‘Zippex‘, zip them all up, then post the resulting zip to chatter, content, attachments, etc.
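A rough sketch of the zipping step (method names here follow the Zippex project’s documented API; treat them as an assumption and check against the version you install):

```apex
// Sketch: zip the generated PDFs with Zippex
Zippex zip = new Zippex();
Integer i = 0;
for (Blob pdfBlob : generatedPdfs) {
    // add each PDF to the archive under a unique file name
    zip.addFile('account-summary-' + i + '.pdf', pdfBlob, null);
    i++;
}
// the finished archive, ready to attach to a record or post to chatter
Blob zipFile = zip.getZipArchive();
```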

This isn’t just for PDFs. Using the contentType attribute of visualforce, you could output a bunch of CSVs, or other document types (see here for some more on this) and zip/attach them as well.

Some things to bear in mind;

  • If your visualforce pages perform SOQL, looping through them may cause you to hit query limits
  • Generating lots of documents will cause you to hit the heap size limit
  • Zipping lots of documents may cause you to hit the CPU limit
  • There is a reason that apps like Conga handle this off platform.

    However, if you’ve got some existing visualforce pages, can accept these limitations and need a way to generate and attach documents without a tool like Conga, this is an option for you.

    Here is a link to some more example code on my github


    SchemaPuker v0.2 Released!

    Try the new version right now!

    I have been getting a lot of feedback about SchemaPuker since its launch, and many, many people have tried it out.
    The response has been far more than I expected, with many tweets and even a couple of blog posts about the tool:

    Lucidchart + SchemaPuker: The Winning Combination for a Salesforce Consultant
    Phil’s Salesforce Tip of the Week #220

    I am so glad people are finding the tool useful. I’ve had a few feature requests and bug reports, which is why I have now released a new version, with the following changes:

    • You can now select if you want all fields displayed, or only relationship fields
    • Much better error handling!
      • Before, if something went wrong, you’d either get an ugly error page, or nothing at all, now you will get some (hopefully) useful details if something goes wrong
    • Huge speed increase, up to 5.9x faster in my super scientific benchmark*
    • All relationships should now be visible; some users were reporting that the lines connecting them didn’t show in lucidchart
      • I threw my entire dev org at it, and was able to see all the relationship lines automatically, if you are still experiencing this issue please let me know!
    • Minor text fixes

    I have had suggestions for more new features, which I do plan to include in future releases, so please keep them coming!

    If you have any suggestions, comments, bugs or need help you can send me a tweet, leave a comment, or send me a message!

    * Super scientific benchmark method: timing the old and new method several times and working out the average difference

    Why I love/hate custom metadata types: Introducing Meta Dataloader

    A semi-recent feature of salesforce is Custom Metadata Types. They are like custom settings, but better in many ways.

    One of these is very important… they are deployable, just like any other piece of metadata (fields, objects, classes, etc). Anyone who has ever dealt with custom settings before knows what a gigantic pain in the ass it is to keep environments in sync.

    However, they have some limitations… While they can be accessed from within apex classes, unlike custom settings they cannot be modified programmatically (well, they can, but it’s not that easy).

    Also, unlike custom settings, there is no easy way to populate them in bulk (e.g. via workbench, dataloader, etc). Salesforce do give you an option, but it kind of sucks (it involves deploying code to your org, etc, etc).

    Faced with having to load ~200 custom metadata type records, and not wanting to add an app to my org when I didn’t have to, I decided to write a tool instead.

    Presenting: Meta Dataloader!

    This is a similar tool to SchemaPuker (in fact, I reused a LOT of the code from it) that performs one specific task: it can take a CSV and create custom metadata type records from it.

    Once you’ve logged in, you simply choose the Metadata type you wish to load records in to, and if you want to upsert or delete records.


    You then need to upload a CSV of the values you wish to load, with the headings matching the field API names (similar to workbench).
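    As an illustration, a CSV for a hypothetical Discount_Level__mdt type (the type and field names here are invented for this example) might look like:

```csv
DeveloperName,Label,Discount_Rate__c
Gold,Gold,0.20
Silver,Silver,0.10
Bronze,Bronze,0.05
```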


    Click submit, and the records will be upserted or deleted


    The tool is pretty basic, but it solves a problem. It took me ~3 hours to put together, so it may have issues.

    If you find it useful, let me know, and likewise let me know if you find any bugs.

    The code for this is available on my github


    SchemaPuker: How it came to be

    If you haven’t seen my post about SchemaPuker, check it out here.

    The story begins last year, when a colleague of mine, David Everitt, built a handy tool for generating ERDs. It was essentially a visualforce page / controller that allowed you to choose objects, and it would then output some text in the format of a PostgreSQL schema file that you could import in to Lucidchart.

    PostgreSQL schema files are relatively easy to generate (as they are essentially plain text) and Lucidchart was the diagramming tool of choice where we worked, so this all made sense.

    I saw this, and thought it was a brilliant idea. ERDs are something that are very often part of design documents, proposals, etc. Even if you are building new functionality, often you are using some, or all of the existing data model, so having a way to get this out of salesforce easily was very helpful.

    You can read more about David’s tool at his blog, SlightlyTechnical, including how to try it yourself.

    However, a visualforce page / apex class has its limitations.

    • If you were doing a discovery, perhaps you don’t have credentials for the organisation you need to chart, or if you do, perhaps you don’t have a sandbox, or permission to install anything in one
    • If you do have credentials and a sandbox, you then need to add the visualforce page and controller in to the org
    • It would just output the results into the page itself, making it harder to import into your charting tool

    So I decided I would make a new version of the tool, plus it was a good excuse to play with the salesforce metadata API, which I hadn’t had a lot of exposure to at the time.

    I decided I would throw together a Java application to do this. I had written plenty of little console-based apps in the past, but had never done anything with a GUI, so this was yet another learning opportunity. I built the app using Swing and the WSC, utilising the metadata API, with the SOAP API to handle authentication.

    The application worked fine and had all the same functionality as its visualforce counterpart, with the added bonus that it would generate a text file, rather than display the output. After that, I got busy with life and forgot about it all.

    This year, after giving my blog a bit of a refresh and thinking about what I could write about, I remembered the tool. I dug out the source code, looked at it, cringed and thought about how I could make this thing better.

    The obvious solution here was a cloud based app. Something that required no installation or setup, and was easy to use. Given that I already had my previous iteration written in Java (and Java is the language I am most comfortable with), heroku seemed like the best fit for hosting this.

    Life got in the way again, and it wasn’t till after a trip to surfforce (see my writeup here) and a discussion with Dave Carroll from salesforce that I thought about it again.

    Dave was telling me about the work he had done on the cli, and the plans to extend the tool. I told him about my tool, at the time named ‘Salesforce ERD Tool’, which I was planning to move to heroku. He suggested (quite rightly) that that was a rather boring name, and came up with the idea of calling it ‘SchemaPuker’, and the name was born.

    After surfforce I decided I would tackle this. I had never written a java web-app, nor had I used a web framework or deployed anything to heroku before. So with yet another great learning opportunity I set about learning how to do this.

    I chose Spring MVC as my framework, mostly due to the huge amount of documentation for it and its uncanny similarity to visualforce, along with Spring Boot, which made testing the app locally *really* easy and did away with XML config files.

    I decided I was going to use the salesforce lightning design system for the UI of my application; it looks nice and there is an excellent guide available for it.

    Next, was taking a look at authorisation. My previous tool used the SOAP API for authorisation, however this was not going to be suitable here. Using OAuth2 made much more sense (so much so that I made a post about it here).


    Once I had authorisation sorted out, I was able to reuse most of the core of my original application, and once I had the UI tidied up, I had a minimum viable product. I do have some ideas for enhancements for the next version, such as graphical output, stored groups of objects and a better interface for choosing objects.


    SchemaPuker: ERDs made easy

    SchemaPuker can be accessed here.

    Read on for more information about SchemaPuker!

    Often, we need to produce diagrams of our organisation’s data model (aka. ERDs). This will be especially true for those of us who are consultants.

    Perhaps you are doing a discovery or analysis and need a copy of the current data model, or maybe you need a ‘current state’ and a ‘to be’ for comparison, or you are designing new functionality that connects with an existing data model, or documenting functionality after completion.

    Now, salesforce does have a tool to visualise the data model, called Schema Builder, however this cannot export the model, nor can it be customised without actually changing the data model itself.

    To solve this problem, I came up with… SchemaPuker! (Thanks to David Carroll for the name, and to David Everitt for the idea in the first place!) For more about how it came to be, and the name, click here.

    But for now, SchemaPuker is a fairly simple tool. It allows you to authorise to salesforce, get a list of your objects and export them as a PostgreSQL schema file. This file can be imported in to Lucidchart (and other tools) in order to generate an editable ERD.

    The tool itself is very simple to use. First, navigate to the app, choose if you are using a Production/Developer Org or a Sandbox and click ‘Login’. You will then be asked to enter your salesforce credentials and to authorise SchemaPuker to access your org.


    Once authorised, you will be given a list of objects inside your salesforce org. You then select the objects you wish to be in your ERD by holding down command (or ctrl on windows/linux) and clicking, or by typing the API names in the ‘Selected Objects’ box.


    Once you click submit, you are given the PostgreSQL schema. You can either copy/paste this into Lucidchart, or click the ‘Download’ button below the output.


    Next, log in to Lucidchart and create a new drawing, click ‘More Shapes’ at the bottom, then tick ‘Entity Relationship’ and press ‘Save’.


    Now, you can either import the downloaded file from SchemaPuker by pressing ‘Choose File’, or paste the output in to the box below. You can ignore steps one and two in the import window.


    You will now see your salesforce objects in the sidebar just under the ‘Entity Relationship’ panel. You can drag the objects on and the relationships between the objects will be automatically created.


    You can also add new shapes from the ‘Entity Relationship’ panel to extend your ERD as required.

    That’s it! Please try it out and let me know how you go!

    Please Note: This is still very much beta, and is ‘minimum viable product’. However I am working to improve it on a regular basis, and would love to hear your thoughts.
    It is limited to ~30 objects per export and may crash in fun and exciting ways. The app does *not* store any data, nor does it make *any* changes to your salesforce org.

    Fun with OAuth2

    OAuth2 is a magical thing; it makes it *very* easy for users to log in to your application without sharing their credentials with it. The actual authorisation of the user is handed over to the service they are authenticating against (e.g. Facebook, Twitter, Salesforce) and you are given an ‘access token’ with which you can make requests to the service. For more on OAuth, there is a good explainer here.

    At the moment, I am working on an application that I hope will be useful for some of you. This application needs to authenticate to salesforce in order to use its APIs.

    The last time I did salesforce auth, I used the Login/Password/Token method via the SOAP API. This method works, but it’s not ideal for a webapp. It’s fairly clunky, requires my app to handle the actual credentials and usually needs a token. It has huge potential to be insecure, and is a bad user experience.

    So after much looking around, trying, failing, googling, etc. I finally found something brilliant… the Scribe library. It handles the actual OAuth bits, which allows my login code to be very, very tiny.

    The next piece of the puzzle is what to do with the returned JSON, unfortunately the Scribe library struggles to parse it. In order to access the APIs I am using the WSC, which uses a ‘ConnectorConfig’ object to pass authentication details when it makes calls. So I needed a way to take the JSON returned from OAuth and return a ‘ConnectorConfig’ object that I can use with the WSC.

    This was actually pretty straightforward: I simply deserialize the JSON to an object using the Google GSON library and construct the ‘ConnectorConfig’ from the result.
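    In rough outline, the conversion looks something like this (a sketch only: the class and field names are illustrative rather than the app’s actual code, and the API version in the endpoint is an assumption you should match to your WSC jar):

```java
import com.google.gson.Gson;
import com.sforce.ws.ConnectorConfig;

// Illustrative shape of the salesforce OAuth token response
class TokenResponse {
    String access_token;
    String instance_url;
}

class AuthHelper {
    static ConnectorConfig toConnectorConfig(String oauthJson) {
        // deserialize the OAuth JSON with GSON
        TokenResponse token = new Gson().fromJson(oauthJson, TokenResponse.class);
        // build the ConnectorConfig the WSC needs for API calls
        ConnectorConfig config = new ConnectorConfig();
        config.setSessionId(token.access_token);
        config.setServiceEndpoint(token.instance_url + "/services/Soap/u/38.0");
        return config;
    }
}
```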

    Once I have a connector config, I can make API calls with the WSC and build the rest of my application. I hope that if someone is in the same boat as I was last week that this post helps them out.

    Feel free to leave any comments below 🙂


    Salesforce Community Events – Surfforce

    If you’re in the salesforce space, no doubt you have heard of some of their events. The biggest and most well known being Dreamforce. Perhaps you’ve been to a Dreamforce, or world tour or one of the other official events.

    But perhaps something you didn’t know about were salesforce ‘Community Events’. These are events that are not run by salesforce itself; rather, they are organised by the community (often sponsored by partners, ISVs, etc). Community Events are relatively new in the space, but they are picking up pace quickly; an excellent example of this was the London’s Calling event here in the UK (that I unfortunately didn’t make it to… next year!)

    So why am I talking about community events? Well, I went to my first one recently – Surfforce.

    Surfforce was billed as ‘a salesforce user group with a difference’, and it certainly was. Held in Aberavon, Wales, the basic idea of the event was ‘let’s go for a surf in the morning, then talk salesforce in the afternoon’. It was the brainchild of Shaun Holmes, whose passion for helping others, the community and salesforce is incredible.

    The event was aimed at people new to the salesforce community, with several excellent speakers sharing their journeys within the salesforce world. As well as this, there was a focus on helping local charities.

    I only found out about the event about a week before, so luckily I was able to organise the trip down with Scott. Given my late coming to the party, I was not able to secure a spot in the surfing portion of the day, which suited me fine; I am from Australia, after all, and the water temperature in Wales was a little different to what I am used to!

    When those brave enough to strap on a wetsuit were finished in the ocean, it was time for lunch, networking and chatting with sponsors.


    After lunch, we were treated to some excellent talks. First was Danielle from the Wave Project; she took us through what the project was about, and the amazing impact it has had on the kids in need who were able to take part. They are doing excellent work, providing ‘surf therapy’: teaching kids how to surf and helping them with mental health issues such as anxiety and depression. Thanks to surfforce, over £500 was raised to help them in their efforts, as well as the opportunity for 15 kids to take part in a surf lesson at the same time as the surfforce attendees.

    We next heard from Anna, a local businesswoman and entrepreneur, who spoke of her humble beginnings in Poland during the cold war, and how she was able to make the most of what she had and how she was able to keep challenging herself to be better and better. She has won multiple awards and is CEO of two successful companies, her talk was definitely inspiring.

    We then heard from Dave and Mike from salesforce, both of them very early employees. They gave a very informative presentation that went through the journey salesforce as a company has been on: from having a handful of customers in 1999, none of whom had to pay for licensing for the first year! (interesting side note: one of these early adopters was a previous employer of mine), to launching the AppExchange, which ran on a box under Dave’s desk and was originally called the App Store (sound familiar?), to the multi billion dollar success they are today. This was a very interesting talk, and if you get a chance to see/watch it I would highly recommend it.

    After a brief break, we heard from several more excellent speakers. The first, Louise, spoke of her personal journey from someone who had no experience with salesforce (or computers really; she has a background in Literature) to becoming an Awesome Admin. She spoke of how she found that she had more of an interest in the systems she was working with than the actual work itself, and how, when she had salesforce ‘forced’ upon her, she decided that she would learn as much as possible and make a go of it. Louise described how much of a help the salesforce community has been to her, the sheer volume of resources out there and how inclusive and helpful people were.

    Next up, Antonia took us through the journey that brought her to the position she is in today (Lead Consultant), and how her journey and the salesforce community are anything but boring. She explained that, using the salesforce platform, anyone who wants to try can become a developer, thanks to the supportive community, excellent declarative tools and wealth of documentation.

    Finally, Jodi spoke to us about her journey from Salesforce Admin to Consultant, with a presentation entirely of GIFs (no death by powerpoint here!). She spoke of how she was constantly looking for new challenges: from being an Administrator, to setting up a Centre of Excellence, to finally making the move into the consulting world. Her journey in particular is one that I think a lot of people in the salesforce consultancy world will be familiar with (I know I am; I started my salesforce journey as an Admin back in 2008).

    Proceedings ended with drinks and networking. Despite not being the exact target audience, I think that I got quite a lot from attending Surfforce; I met a load of amazing people and got to be involved with what I think is an excellent concept.

    Shaun, Kerry, the speakers, volunteers, sponsors and everyone else who worked so hard to get this event up and running deserve a huge pat on the back for what they achieved. I think the Surfforce concept would fit in perfectly back home in Australia. Something like this could easily be done in both the Gold Coast and Sydney, and given the salesforce community in Australia, could be very successful. I hope that someone seriously considers this concept, and that the next event is even bigger and more successful than the last.

    I think the concept of community events is a great one; it goes to show how inclusive the salesforce community is as a whole, and how excited people are about the platform. Surfforce may have been my first community event, but it most definitely won’t be my last.




    Kittenforce! aka. telling your users when your instance is down for maintenance

    The other day, Scott (check out his blog here) and I were at work chatting about the security trailhead superbadge (specifically, my domain). When you have a custom domain for your salesforce instance, you can customise your login page (or replace it entirely).

    I then decided that kittens would make the login page far better, and hence:

    After this, I went to login to a sandbox to do some actual work, only to be greeted with the ‘Please check your username and password. If you still can’t log in, contact your Salesforce administrator.’ message.

    I was fairly sure I hadn’t forgotten my password, so I tried it again… nope, same thing.

    What I had forgotten, was the fact that the daily deployment to that environment was happening, and as such all users except for the DevOps team were frozen out.

    Which got me thinking… If I can put kittens on the login page, then why not some useful information too.

    So, that evening I built this;

    The concept is fairly simple, when you put an environment into ‘Maintenance’ mode (e.g during a deployment, etc) it freezes all users, excluding a defined list (e.g the DevOps team, system admins) and changes the login page to show a message informing the users of this.

    When you are finished and disable maintenance mode, it will unfreeze all users and change the login page message back.

    It uses a custom object to store a list of users who were frozen before the environment entered maintenance mode to ensure they stay frozen once the environment is changed back to normal mode.
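    The freezing itself can be done in apex via the UserLogin object. A rough sketch (the exclusion logic here is simplified; in the real package the allow-list comes from the custom setting/metadata):

```apex
// Sketch: freeze all users except an allow-list when entering maintenance mode
Set<Id> allowed = new Set<Id>(); // e.g. the DevOps team, loaded from custom metadata

// only touch users who are currently unfrozen, so anyone frozen
// beforehand stays frozen when maintenance mode is switched off
List<UserLogin> logins = [
    SELECT Id, IsFrozen
    FROM UserLogin
    WHERE UserId NOT IN :allowed AND IsFrozen = false
];
for (UserLogin login : logins) {
    login.IsFrozen = true;
}
update logins;
```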

    The actual page itself is hosted from a site, and is configured via a custom setting and custom metadata, which includes allowing it to be overridden by other pages.

    If you would like to try this in your org, click here for the unmanaged package

    For installation instructions, see this post.

    I would love to hear any feedback you have, feel free to comment below.