There’s a lot of talk about whether or not to use libraries because of performance concerns. If you load a library just once, it probably doesn’t matter, but if you are using HtmlService, the server script is initialized every time a server function is called – loading all the source and the libraries again. So in theory, at least, this feels like it should be an important discriminator.

There are certainly other reasons for not using libraries in add-ons, such as creating unnecessary dependencies on libraries which might disappear – so I think the advice is absolutely solid.

Is performance really the problem?

It’s called out as a massive no-no in the Add-on documentation like this. But is loading from libraries really a lot worse than including the same amount of source code directly in the script?


Also, since google.script.run calls are asynchronous, you can run many of them at the same time – but the other question I wanted to answer was whether this becomes counterproductive beyond a certain number of calls in progress – do multiple parallel calls interfere with each other?
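Since google.script.run has no built-in throttle, capping the number of calls in flight takes a small amount of plumbing. Here is a minimal sketch in plain JavaScript: `callServer` is a made-up stand-in for a promisified google.script.run call, simulated with a timer so the pattern can be seen on its own.

```javascript
// Stand-in for one promisified google.script.run call (simulated with a timer).
function callServer(i) {
  return new Promise(resolve => setTimeout(() => resolve(i), 10));
}

// Run `total` calls, never allowing more than `limit` in flight at once.
async function runWithLimit(total, limit) {
  const results = [];
  let next = 0;
  // Each worker pulls the next index as soon as its previous call returns.
  async function worker() {
    while (next < total) {
      const i = next++;
      results[i] = await callServer(i);
    }
  }
  // Start `limit` workers, so at most `limit` calls are in progress.
  await Promise.all(Array.from({ length: limit }, worker));
  return results;
}
```

With `limit` set to 1 the calls run serially; higher values correspond to the parallelism levels used in the tests below.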

So I decided to test all that.

The results

Findings

In summary, this means I found no real penalty for loading libraries. In fact, with parallelism the library versions seem to perform a little better than when all the code is local. I can’t explain it, but it seems consistent.

The chart shows the results of 3 flavors of test.

  • Where all the source was local in a container-bound project. I copied the source of 3 chunky libraries into the project.
  • Where the libraries were accessed as libraries from a container-bound project.
  • Where the libraries were accessed as libraries from an add-on.

For each flavor above I called a server function 100 times using various levels of parallelism – allowing 1, 5, 10, and 20 instances of the server call to run at the same time in each of the tests. The timings are the average round-trip time for a single call to the server function (the light blue, purple and green), and the yellow, red and blue are the overall time to run the 100 calls.
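The two kinds of timing – per-call average and overall elapsed – can be collected along these lines. This is a sketch only: `timeBatch` and `fakeCall` are made-up names, and a timer stands in for the real server round trip.

```javascript
// Stand-in for one server round trip (simulated with a timer).
function fakeCall() {
  return new Promise(resolve => setTimeout(resolve, 5));
}

// Fire `count` calls in parallel; record each call's round trip and the
// total elapsed time for the whole batch.
async function timeBatch(count) {
  const start = Date.now();
  const roundTrips = [];
  await Promise.all(Array.from({ length: count }, async () => {
    const t0 = Date.now();
    await fakeCall();
    roundTrips.push(Date.now() - t0);
  }));
  return {
    elapsed: Date.now() - start,                            // total for the batch
    average: roundTrips.reduce((a, b) => a + b, 0) / count  // mean round trip
  };
}
```

With full parallelism the elapsed time approaches a single round trip, while the average can grow as calls queue up – the trade-off the tests below measure.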

The server functions did nothing aside from returning to the client, so the round trip time could be measured.

Waking up the server function should have provoked all the code to be loaded – from libraries or locally – each time. Since these calls were asynchronous, I made multiple simultaneous server calls to see the effect of that too. I found that beyond 10 parallel calls, the increase in average round-trip time outweighs the elapsed-time benefit of running more things at the same time. So between 5 and 10 seems about the optimum number of concurrent calls to try to manage. If the server were doing something more substantial, it’s possible that this balance could change.

Server time versus client time

As a little wrinkle, I noticed that the server time is out of sync with the client time by about half a second. Originally I had planned to use the difference between the time the server started running its function and the time the client requested it (receivedByServer – initiatedByClient) as the measurement, to remove some of the variability an internet connection might introduce. But because of this out-of-sync problem I used the complete round trip time instead, as receivedByServer was actually later than the subsequent receivedByClient, meaning that the server clock is running a little ahead of the client clock.

The code for the add-on version of the test follows if you want to play around with it, and is also on GitHub.

The code

You can load whichever libraries you like to test. However, it needs the cUseful library (copied locally for the local test, or referenced as a library for the library test). For details, see A functional approach to fiddling with sheet data. This follows my usual pattern for a sidebar app/spreadsheet add-on. It’s also on GitHub.

Addon.gs

Server.gs

Client.gs

main.js.html

styles.css.html


index.html

App.gs

For more like this see Google Apps Scripts Snippets
Why not join our forum, follow the blog or follow me on Twitter to ensure you get updates when they are available.