8th August 2022

Exploring Cloudflare Workers

1. Terminology. Cloudflare offers a service called "Cloudflare Pages". It combines two things:

  1. Ample space for static content: HTML + CSS + images + PDF + etc.
  2. A JavaScript execution environment on the edge

Part #1 alone would be a good thing; I have written about it in Hosting Static Content with Cloudflare. Any static site generator producing static content can therefore target "Cloudflare Pages". Part #2 adds JavaScript on edge server machines close to the final client. The edge is not a single, central server, but rather a flock of servers all over the world, and it is there that the JavaScript close to the final client is executed, as workers in V8 isolates. From How Workers works:

Unlike other serverless providers which use containerized processes each running an instance of a language runtime, Workers pays the overhead of a JavaScript runtime once on the start of an edge container. Workers processes are able to run essentially limitless scripts with almost no individual overhead by creating an isolate for each Workers function call. Any given isolate can start around a hundred times faster than a Node process on a container or virtual machine. Notably, on startup isolates consume an order of magnitude less memory.

Cloudflare also considered Lua instead, but:

The V8 JavaScript engine is arguably the most scrutinized code sandbox in the history of computing, and the Chrome security team is one of the best in the world. Moreover, Google pays massive bug bounties to anyone who can find a vulnerability. (That said, we have added additional layers of our own sandboxing on top of V8.)


Lua is already deeply integrated into nginx, providing exactly the kind of scripting hooks that we need -- indeed, much of our own business logic running at the edge today is written in Lua. Moreover, Lua already provides facilities for sandboxing. However, in practice, Lua's security as a sandbox has received limited scrutiny, as, historically, there has not been much value in finding a Lua sandbox breakout -- this would change rapidly if we chose it, probably leading to trouble. Moreover, Lua is not very widely known among web developers today.

2. Performance and benchmarks. Cloudflare uses a clever trick during the HTTPS handshake: the worker is already loaded while the TLS handshake is still in flight, reducing the effective JavaScript cold start time to 0 ms.

Cloudflare provides benchmarks that show that "Cloudflare Workers" are indeed fast, see Cloudflare Pages is Lightning Fast:

Pages is built on one of the fastest networks in the world, putting us within 50 ms of 95% of the world’s Internet-connected population. Delivering Pages from this network is the basis of our speed.

See the image below for a benchmark comparison:

I am not sure whether the number of points-of-presence equals the number of edge servers, but I assume the two values do not differ by much. Cloudflare offers more than 275 points-of-presence covering all geographical areas:

Physical nearness to the end client is important, as the speed of light adds a very noticeable delay. For example, travelling a quarter of the Earth's circumference takes around 33 ms:

$$ {1\over4} \cdot 2\pi \times 6371\,\hbox{km} \,/\, (300{,}000\,\hbox{km}/\hbox{s}) \approx 33\,\hbox{ms} $$

A simple JavaScript program which could run on the edge is:

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  return new Response('Hello worker!', { status: 200 });
}

This will simply return a page showing "Hello worker!" once you call it via <yourWorker>.workers.dev.

3. Concrete example. To mimic the PHP search functionality offered on this website, I implemented the "Cloudflare Workers" solution below:

  1. An ordinary search is routed to the original search on this website, which uses "real" PHP
  2. The result from the "real" PHP is shown under the worker's URL
  3. All links in this result page are now relative to that URL and therefore have to be mapped

The solution below routes searchproxy.klm.workers.dev/blog to klm.pages.dev/blog; the same goes for music, gallery, and aux.

async function handleRequest(request) {
  const searchurls = ['https://searchproxy.klm.workers.dev/aux/search.php', 'http://searchproxy.klm.workers.dev/aux/search.php']
  // Pairs of URL prefixes: worker URL first, followed by the corresponding target URL
  const blogurls = [
    'https://searchproxy.klm.workers.dev/blog/',    'https://klm.pages.dev/blog/',
    'https://searchproxy.klm.workers.dev/music/',   'https://klm.pages.dev/music/',
    'https://searchproxy.klm.workers.dev/gallery/', 'https://klm.pages.dev/gallery/',
    'https://searchproxy.klm.workers.dev/aux/',     'https://klm.pages.dev/aux/'
  ]
  const stdHeader = { headers: {'content-type': 'text/html;charset=UTF-8'} }

  // Anything that is not a search request is proxied to the static site
  if (!request.url.startsWith(searchurls[0]) && !request.url.startsWith(searchurls[1])) {
    for (let i = 0; i < blogurls.length; i += 2) {
      const blogstr = blogurls[i]
      if (request.url.startsWith(blogstr)) {
        const response = await fetch(blogurls[i+1] + request.url.substring(blogstr.length), { method: 'GET', ...stdHeader });
        const results = await response.text();
        return new Response(results, stdHeader);
      }
    }
  }

  // A search request is forwarded to the "real" PHP search
  const response = await fetch('https://eklausmeier.goip.de/aux/search.php', {
    method: 'POST',
    headers: {"Content-type": "application/x-www-form-urlencoded; charset=UTF-8"},
    body: request.body  //'searchstr=Cloudflare'
  });
  const results = await response.text();
  return new Response(results, stdHeader);
}

addEventListener('fetch', event => { return event.respondWith(handleRequest(event.request)); });

The above JavaScript code is invoked whenever searchproxy.klm.workers.dev is fetched.

4. Alternatives. Vercel and Netlify offer solutions similar to "Cloudflare Workers":

Cloudflare         Vercel           Netlify
Pages + Workers    Edge Functions   Edge Functions