Limit the number of goroutines running at the same time

Recently, I was working on a package that was making network requests inside goroutines and I ran into an issue: the program finished really fast, but the results were awful. This was because too many goroutines were running at the same time. As a result, the network was congested, too many sockets were opened on my laptop and the overall performance degraded: requests were slow or failing.

In order to keep the network healthy while maintaining some concurrency, I wanted to limit the number of goroutines making requests at the same time. Here is a sample main file to illustrate how you can control the maximum number of goroutines that are allowed to run concurrently.

package main

import (
	"flag"
	"fmt"
	"time"
)

// DoWork fakes a long and difficult piece of work.
func DoWork() {
	time.Sleep(500 * time.Millisecond)
}

func main() {
	maxNbConcurrentGoroutines := flag.Int("maxNbConcurrentGoroutines", 5, "the number of goroutines that are allowed to run concurrently")
	nbJobs := flag.Int("nbJobs", 100, "the number of jobs that we need to do")
	flag.Parse()

	// Dummy channel to coordinate the number of concurrent goroutines.
	// This channel should be buffered otherwise we will be immediately blocked
	// when trying to fill it.
	concurrentGoroutines := make(chan struct{}, *maxNbConcurrentGoroutines)
	// Fill the dummy channel with maxNbConcurrentGoroutines empty structs.
	for i := 0; i < *maxNbConcurrentGoroutines; i++ {
		concurrentGoroutines <- struct{}{}
	}

	// The done channel indicates when a single goroutine has
	// finished its job.
	done := make(chan bool)
	// The waitForAllJobs channel allows the main program
	// to wait until we have indeed done all the jobs.
	waitForAllJobs := make(chan bool)

	// Collect all the jobs. Each time a job finishes, we can
	// release another spot for a goroutine.
	go func() {
		for i := 0; i < *nbJobs; i++ {
			<-done
			// Say that another goroutine can now start.
			concurrentGoroutines <- struct{}{}
		}
		// We have collected all the jobs, the program
		// can now terminate
		waitForAllJobs <- true
	}()

	// Try to start nbJobs jobs
	for i := 1; i <= *nbJobs; i++ {
		fmt.Printf("ID: %v: waiting to launch!\n", i)
		// Try to receive from the concurrentGoroutines channel. When we have something,
		// it means an execution spot is free and we can start a new goroutine.
		// Otherwise, it will block the execution until an execution
		// spot is available.
		<-concurrentGoroutines
		fmt.Printf("ID: %v: it's my turn!\n", i)
		go func(id int) {
			DoWork()
			fmt.Printf("ID: %v: all done!\n", id)
			done <- true
		}(i)
	}

	// Wait for all jobs to finish
	<-waitForAllJobs
}

This file is available as a gist on GitHub if you find it more convenient.

Sample runs

For the command time go run concurrent.go -nbJobs 25 -maxNbConcurrentGoroutines 10:

ID: 1: waiting to launch!
ID: 1: it's my turn!
ID: 2: waiting to launch!
ID: 2: it's my turn!
ID: 3: waiting to launch!
ID: 3: it's my turn!
ID: 4: waiting to launch!
ID: 4: it's my turn!
ID: 5: waiting to launch!
ID: 5: it's my turn!
ID: 6: waiting to launch!
ID: 6: it's my turn!
ID: 7: waiting to launch!
ID: 7: it's my turn!
ID: 8: waiting to launch!
ID: 8: it's my turn!
ID: 9: waiting to launch!
ID: 9: it's my turn!
ID: 10: waiting to launch!
ID: 10: it's my turn!
ID: 11: waiting to launch!
ID: 1: all done!
ID: 9: all done!
ID: 11: it's my turn!
ID: 12: waiting to launch!
ID: 12: it's my turn!
ID: 7: all done!
ID: 13: waiting to launch!
ID: 5: all done!
ID: 13: it's my turn!
ID: 14: waiting to launch!
ID: 4: all done!
ID: 14: it's my turn!
ID: 8: all done!
ID: 15: waiting to launch!
ID: 15: it's my turn!
ID: 16: waiting to launch!
ID: 16: it's my turn!
ID: 10: all done!
ID: 17: waiting to launch!
ID: 2: all done!
ID: 17: it's my turn!
ID: 18: waiting to launch!
ID: 18: it's my turn!
ID: 3: all done!
ID: 19: waiting to launch!
ID: 6: all done!
ID: 19: it's my turn!
ID: 20: waiting to launch!
ID: 20: it's my turn!
ID: 21: waiting to launch!
ID: 20: all done!
ID: 16: all done!
ID: 17: all done!
ID: 12: all done!
ID: 21: it's my turn!
ID: 19: all done!
ID: 11: all done!
ID: 14: all done!
ID: 18: all done!
ID: 15: all done!
ID: 13: all done!
ID: 22: waiting to launch!
ID: 22: it's my turn!
ID: 23: waiting to launch!
ID: 23: it's my turn!
ID: 24: waiting to launch!
ID: 24: it's my turn!
ID: 25: waiting to launch!
ID: 25: it's my turn!
ID: 24: all done!
ID: 21: all done!
ID: 22: all done!
ID: 25: all done!
ID: 23: all done!
0,28s user 0,05s system 18% cpu 1,762 total

For the command time go run concurrent.go -nbJobs 10 -maxNbConcurrentGoroutines 1:

ID: 1: waiting to launch!
ID: 1: it's my turn!
ID: 2: waiting to launch!
ID: 1: all done!
ID: 2: it's my turn!
ID: 3: waiting to launch!
ID: 2: all done!
ID: 3: it's my turn!
ID: 4: waiting to launch!
ID: 3: all done!
ID: 4: it's my turn!
ID: 5: waiting to launch!
ID: 4: all done!
ID: 5: it's my turn!
ID: 6: waiting to launch!
ID: 5: all done!
ID: 6: it's my turn!
ID: 7: waiting to launch!
ID: 6: all done!
ID: 7: it's my turn!
ID: 8: waiting to launch!
ID: 7: all done!
ID: 8: it's my turn!
ID: 9: waiting to launch!
ID: 8: all done!
ID: 9: it's my turn!
ID: 10: waiting to launch!
ID: 9: all done!
ID: 10: it's my turn!
ID: 10: all done!
0,32s user 0,03s system 6% cpu 5,274 total
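
Note that this is only one way to structure it. A more compact variant (just a sketch, not the file from the gist above) uses the buffered channel directly as a semaphore together with sync.WaitGroup:

package main

import (
	"fmt"
	"sync"
	"time"
)

func main() {
	const maxConcurrent = 5
	const nbJobs = 20

	// Buffered channel used as a semaphore: sending blocks as soon as
	// maxConcurrent goroutines already hold a slot.
	semaphore := make(chan struct{}, maxConcurrent)
	var wg sync.WaitGroup

	for i := 1; i <= nbJobs; i++ {
		wg.Add(1)
		// Acquire a slot; this blocks when the semaphore is full.
		semaphore <- struct{}{}
		go func(id int) {
			defer wg.Done()
			// Release the slot when the job is done.
			defer func() { <-semaphore }()
			time.Sleep(500 * time.Millisecond)
			fmt.Printf("ID: %v: all done!\n", id)
		}(i)
	}

	// Wait for all jobs to finish.
	wg.Wait()
}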

Questions? Feedback? Hit me on Twitter @AntoineAugusti

Openness for engineering teams

As a student, I quite often look at companies to see what they are doing, to understand the market and discover trends. As an engineering student, I am on the lookout for technical content written by engineers. I recently discovered that I value openness in engineering teams a lot. Being open can take different forms:

  • Having a technical blog. You can understand this in multiple ways. First, you can have a blog where you talk about new features and new releases of your API / SDK. This one is quite common. The second one is really rare and very valuable to me: you talk about your engineering process, your hiring process, and you share reports of outages. If you have open source projects, you write a blog post to let the technical community know about them.
  • Involvement in communities. You can be involved in communities in multiple ways: regularly sending members of your team to local meetups (not just attending if you can avoid it: presenting and volunteering are awesome), being visible at conferences, giving explicit credit to the open source solutions you are using (or giving them money if you can afford to), hosting hackathons or hack days at your office. Be explicit about the causes you care about and defend them.
  • Open source. Whether you contribute to open source projects or open source some of your own, involvement in the community is a great way to gain some exposure, let people know which technologies you are using and give back to the community.

An update to the Joel Test?

Maybe some of these points will be in an updated “Joel Test” in the future, even if some people already say that it is partially antiquated. Personally, I would add the following questions to an updated version of the Joel Test:

  • Do you support developer education by attending conferences, purchasing books (or something equivalent)?
  • Do you have a simple, documented process to adopt new tools your team uses?
  • Do you have an engineering blog where you talk about your processes, ideas, beliefs and failures?

You can’t have it all

Being able to answer “Yes” to every question above seems fairly difficult, and downright impossible for small engineering teams. If your company is 1 year old and you only have 2 engineers, you cannot put all these things in place. But as they say, “practice makes perfect”, so try to keep these goals in mind. Giving an awesome work environment to your engineers will make them productive, happy to work and so much more! Great engineering teams attract great engineers.

Developing and deploying a modulus checking API

Following my latest post about a Go package to validate UK bank account numbers, I wanted to offer a public API to let people check if a UK bank account number is valid or not. I know that offering a Go package is not ideal for everyone because Go is not everywhere in the tech ecosystem yet, and it’s always convenient to have an API you can send requests to, especially in a frontend context. My goal was to offer a JSON API, with authentication through an HTTP header and with rate limits. With this, in the future you could adapt rate limits to specific API keys, if you want to allow a larger number of requests for some clients.

Packages I used

I wanted to give cloudflare/service a go because it lets you quickly build JSON APIs with some default endpoints for heartbeat, version information, statistics and monitoring. I used etcinit/speedbump to provide the rate limiting functionality and it was very easy to use. Note that the rate limiting functionality requires a Redis server to store request counts. Finally, I used the famous codegangsta/negroni to create middlewares that handle API authentication and rate limiting, keeping my only controller relatively clean.
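
To give an idea of how the pieces fit together, here is a minimal sketch of an authentication middleware with negroni. It is not the code from the repository: the isValidKey helper and the /verify handler below are made up for the example.

package main

import (
	"net/http"

	"github.com/codegangsta/negroni"
)

// isValidKey is a made-up helper: a real implementation would look the
// key up in a configuration file or a data store.
func isValidKey(key string) bool {
	return key == "foo"
}

// apiKeyMiddleware rejects requests that do not carry a known Api-Key header
// and lets every other request go through to the next handler.
func apiKeyMiddleware(w http.ResponseWriter, r *http.Request, next http.HandlerFunc) {
	if !isValidKey(r.Header.Get("Api-Key")) {
		w.Header().Set("Content-Type", "application/json")
		w.WriteHeader(http.StatusUnauthorized)
		w.Write([]byte(`{"error": "invalid API key"}`))
		return
	}
	next(w, r)
}

func main() {
	mux := http.NewServeMux()
	// Placeholder for the real controller doing the modulus checking.
	mux.HandleFunc("/verify", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		w.Write([]byte(`{"valid": true}`))
	})

	n := negroni.New()
	n.Use(negroni.HandlerFunc(apiKeyMiddleware))
	n.UseHandler(mux)
	n.Run(":8080")
}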

Deploying behind Nginx

My constraints were the following:

  • The API should only be accessible via HTTPS and HTTP should redirect to HTTPS.
  • The Golang server should run on a port > 1024 and the firewall will block access to everything but ports 22, 80 and 443.
  • The only endpoints that should be exposed to the public are /verify, /version and /heartbeat. Statistics and monitoring should be accessible to administrators on localhost through HTTP.

I ended up with the following Nginx virtual host to suit my needs; I’m not sure whether it could be simpler:

geo $is_localhost {
  default 0;
  127.0.0.1/32 1;
}

server {
    listen 80;
    listen 443 ssl;

    server_name modulus.antoine-augusti.fr localhost.antoine-augusti.fr;

    ssl_certificate /etc/nginx/ssl/modulus.antoine-augusti.fr.crt;
    ssl_certificate_key /etc/nginx/ssl/modulus.antoine-augusti.fr.key;

    if ($is_localhost) {
      set $test A;
    }

    if ($scheme = http) {
      set $test "${test}B";
    }
    
    # Redirect to HTTPS if not connecting from localhost
    if ($test = B) {
      return 301 https://$server_name$request_uri;
    }
    
    # Only the following endpoints are accessible to people not on localhost
    location ~ ^/(verify|heartbeat|version)  {
      include sites-available/includes/dispatch-golang-server;
    }

    # Default case
    location / {
      # Not on localhost? End of game
      if ($is_localhost = 0) {
        return 403;
      }
      # Forward request for people on localhost
      include sites-available/includes/dispatch-golang-server;
    }
}

And for sites-available/includes/dispatch-golang-server:

proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header Host $host;
proxy_pass http://127.0.0.1:8080;

With this, I can still access the reserved endpoints by first opening an SSH tunnel with ssh -L4242:127.0.0.1:80 [email protected] and then going to http://localhost.antoine-augusti.fr:4242/stats.

Note that the Golang server is running on port 8080 and should be monitored by Supervisor or whatever process manager you want to use.

Grabbing the code and a working example

First of all, the API is available on GitHub under the MIT license so that you can deploy and adapt it yourself. If you want to test it first, you can use the API key foo against the base domain https://modulus.antoine-augusti.fr. Here is a cURL call for the sake of the example:

curl -H "Content-Type: application/json" -H "Api-Key: foo" -X POST -d '{"sort_code": "308037", "account_number": "12345678"}' https://modulus.antoine-augusti.fr/verify

Note that this API key is limited to 5 requests per minute. You’ve been warned 🙂 If you are looking for more requests per month or an SLA, drop me a line.

Go challenge: validating UK bank account numbers

As I was reading through the SEPA specification, I found that it was not that simple to check whether a UK bank account number was valid or not. If you’re not familiar with UK banks, they don’t use IBAN to transfer money within the UK, but a combination of a sort code and an account number. A sort code identifies the bank’s branch, and each account has an account number. A sort code is a 6-digit number and an account number can be between 6 and 11 digits, although most of them are 8 digits long.

For example, here is a valid UK bank account:

  • Sort code: 107999
  • Account number: 88837491

Algorithms to check if a UK bank account is valid

A very common way to check if a number (bank account, credit card, parking ticket…) is valid is to apply a modulus algorithm. You perform an operation on each digit (addition, multiplication by a weight, substitution…); when you reach the end, you divide by a specific number and check that the remainder of the division is equal to an expected value. Seems easy, right? Well, it is not that simple for UK bank accounts. If you go through the official specification on the Vocalink website, you will see that they use 2 algorithms, but they also have 15 exceptions to take into account (and some of them are weird or tricky to handle!). You also need to adapt the way you compute the modulus value according to a weight table.
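
To illustrate the general principle only (the real Vocalink rules involve several algorithms, weight tables and exceptions), here is a sketch of a basic weighted modulus 11 check with made-up digits and weights:

package main

import "fmt"

// isValidModulus11 illustrates the general idea of a weighted modulus
// check: multiply each digit by its weight, sum everything up and verify
// that the total is divisible by 11.
func isValidModulus11(digits, weights []int) bool {
	sum := 0
	for i, d := range digits {
		sum += d * weights[i]
	}
	return sum%11 == 0
}

func main() {
	// Dummy sort code + account number digits and a dummy weight table,
	// purely for illustration.
	digits := []int{1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6, 7, 8}
	weights := []int{0, 0, 0, 0, 0, 0, 8, 7, 6, 5, 4, 3, 2, 1}

	// Prints false with these made-up values.
	fmt.Println(isValidModulus11(digits, weights))
}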

From the specification to a package

Reading the specification was interesting, but what really motivated me to code a Go package to solve this problem was the fact that test cases were provided in the specification! What a dream: the specification offers you 34 test cases, and they cover nearly all the exceptions. I jumped on the opportunity: it’s not that often that you are offered a way to check that what you have done is actually right. In fact, I followed a Test Driven Development approach and it really guided me during the development and especially the refactoring.

Getting the code

The code is available on GitHub under the MIT license and should be well documented and tested. As always, pull requests and bug reports are welcome!

Here is an example:

package main

import (
    "fmt"

    "github.com/AntoineAugusti/moduluschecking/models"
    "github.com/AntoineAugusti/moduluschecking/parsers"
    "github.com/AntoineAugusti/moduluschecking/resolvers"
)

func main() {
    // Read the modulus weight table and the sorting
    // code substitution table from the data folder
    parser := parsers.CreateFileParser()

    // The resolver handles the verification of the validity of
    // bank accounts according to the data obtained by the parser
    resolver := resolvers.NewResolver(parser)

    // This helper method handles special cases for
    // bank accounts from:
    // - National Westminster Bank plc (10 or 11 digits with possible presence of dashes, for account numbers)
    // - Co-Operative Bank plc (10 digits for account numbers)
    // - Santander (9 digits for account numbers)
    // - banks with 6 or 7 digits for account numbers
    bankAccount := models.CreateBankAccount("089999", "66374958")

    // Check if the created bank account is valid against the rules
    fmt.Println(resolver.IsValid(bankAccount))
}

Continuous integration and code coverage in Golang

It took me some time to find the right setup and the right tools to achieve something not that complicated: continuous integration and coverage reports for Golang projects hosted on GitHub.

I’m happy to share my configuration with you, hopefully it will save you some time. I’m using Travis CI for the continuous integration platform and Codecov for code coverage reports. Both are free and easy to set up: just log in using your GitHub account and you will be up and running in under 5 minutes.

Here is the Travis file (.travis.yml) I use:

language: go
before_install:
  - go get golang.org/x/tools/cmd/vet
  - go get github.com/modocache/gover
  - go get github.com/vendor/package/...
script:
  # Vet examines Go source code and reports suspicious constructs
  - go vet github.com/vendor/package/...
  # Run the unit tests suite
  - go test -v ./...
  # Collect coverage reports
  - go list -f '{{if len .TestGoFiles}}"go test -coverprofile={{.Dir}}/.coverprofile {{.ImportPath}}"{{end}}' ./... | xargs -i sh -c {}
  # Merge coverage reports
  - gover . coverprofile.txt
after_success:
  # Send coverage reports to Codecov
  - bash <(curl -s https://codecov.io/bash) -f coverprofile.txt

Replace github.com/vendor/package with your GitHub URL and you're good to go! Your package will be protected against mistakes from yourself or from contributors: unit tests will not break and coverage will not decrease. Or at least you will know when it happens!

Bonus: fancy badges

I like to put a few pieces of information at the beginning of every README file:

  • The status of the latest build (green is reassuring)
  • The software license, so that people immediately know if it's okay to use it for their project
  • A link to the GoDoc website, for documentation
  • The percentage of code covered by unit tests

If you want to do the same, here is what you can write at the very top of your README.md file:

# Travis CI for the master branch
[![Travis CI](https://img.shields.io/travis/vendor/package/master.svg?style=flat-square)](https://travis-ci.org/vendor/package)
# Note that this is for the MIT license and it expects a LICENSE.md file
[![Software License](https://img.shields.io/badge/License-MIT-orange.svg?style=flat-square)](https://github.com/vendor/package/blob/master/LICENSE.md)
# Link to GoDoc
[![GoDoc](https://img.shields.io/badge/godoc-reference-blue.svg?style=flat-square)](https://godoc.org/github.com/vendor/package)
# Codecov for the master branch
[![Coverage Status](http://codecov.io/github/vendor/package/coverage.svg?branch=master)](http://codecov.io/github/vendor/package?branch=master)

Once again, don’t forget to replace vendor/package (even in URLs) with your own details and you’re good to go!

Demo

Head to AntoineAugusti/colors to see what it looks like.

Happy coding!

Word segmentation library in Golang

I’ve been into Golang lately, and today I’m glad to announce my second open source project in Golang, following the feature flags API. My second package is all about word segmentation.

What is the word segmentation problem?

Word segmentation is the process of dividing a phrase without spaces back into its constituent parts. For example, consider a phrase like thisisatest. Humans can immediately identify that the correct phrase should be this is a test. But for machines, this is a tricky problem.

An approach to this problem

A basic idea would be to use a dictionary and to split off a word whenever the current chunk of letters is a valid word. But then you run into issues with strings like peanutbutter, which this approach would split as pea nut butter instead of peanut butter.

The idea is to take advantage of word frequencies in a corpus. This is where the concept of an n-gram comes in. In the fields of computational linguistics and probability, an n-gram is a contiguous sequence of n items from a given sequence of text or speech. The items can be phonemes, syllables, letters, words or base pairs, depending on the application.

For example, this is an extract of some unigrams in a corpus composed of 1,024,908,267,229 words distributed by the Linguistic Data Consortium.

used 421438139
go 421086358
b 419765694
work 419483948
last 417601616
most 416210411
music 414028837
buy 410780176
data 406908328
make 405084642
them 403000411
should 402028056

Using unigrams and bigrams, we can score an arrangement of words. This is what is done in the score method for example.
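
To make the idea concrete, here is a toy sketch of such a scoring function. It is not the package’s actual score method: the unigram counts below are made up, and a real implementation would also use bigrams and smarter smoothing for unknown words.

package main

import (
	"fmt"
	"math"
	"strings"
)

// Toy unigram counts; the real corpus contains 1,024,908,267,229 words.
var unigrams = map[string]float64{
	"this": 3400000, "is": 9000000, "a": 20000000, "test": 1200000,
	"peanut": 30000, "butter": 80000, "pea": 25000, "nut": 40000,
}

const corpusSize = 1024908267229

// score gives a log-probability to a candidate segmentation by multiplying
// the unigram probabilities of its words. Unknown words get a heavy penalty,
// so segmentations made of frequent, known words win.
func score(words []string) float64 {
	s := 0.0
	for _, w := range words {
		count, ok := unigrams[w]
		if !ok {
			count = 1e-3 // harsh penalty for unknown words
		}
		s += math.Log(count / corpusSize)
	}
	return s
}

func main() {
	good := []string{"peanut", "butter"}
	bad := []string{"pea", "nut", "butter"}
	// "peanut butter" gets a higher (less negative) score than "pea nut butter".
	fmt.Printf("%s -> %.2f\n", strings.Join(good, " "), score(good))
	fmt.Printf("%s -> %.2f\n", strings.Join(bad, " "), score(bad))
}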

Concurrency and channels

This was also a great opportunity for me to work with channels, because some parts of the program can run in parallel. I’m just starting to work with goroutines and channels, but I really like them!

Take a look at the source code and the documentation on GitHub: github.com/AntoineAugusti/wordsegmentation

Feature flags API in golang

Over the last few months, I’ve been interested in golang (the Go language) but I didn’t know what to build to really try it. Sure, I had done the exercises from the online tutorial and read the awesome website Go by example, but I didn’t have a real use case yet. Until a few days ago, when I decided to build an API related to feature flags!

What are feature flags?

Feature flags let you enable or disable some features of your application, for example when you’re under unexpected traffic or when you want to let some users try a new feature you’ve been working on. They decouple feature release and code deployment, so that you can release features whenever you want, instead of whenever the code happens to ship.

With this package, you can enable access to a feature for:

  • specific user IDs
  • specific groups
  • a percentage of your user base
  • everyone
  • no one

And you can combine things! You can give access to a feature for users in the group dev or admin and for users 1337 and 42 if you want to.
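
To give an idea of what such a check could look like, here is a simplified sketch. It is not the package’s actual model: the Feature struct and the percentage rule below are made up for the example.

package main

import "fmt"

// Feature is a simplified model of a feature flag.
type Feature struct {
	Enabled    bool     // enabled for everyone
	Users      []uint32 // whitelisted user IDs
	Groups     []string // whitelisted groups
	Percentage uint32   // percentage of the user base, from 0 to 100
}

// IsAccessibleTo sketches the access rules described above: a feature is
// accessible when it is enabled for everyone, for the user's ID, for one of
// the user's groups, or for the slice of the user base the user falls into.
func (f Feature) IsAccessibleTo(userID uint32, groups []string) bool {
	if f.Enabled {
		return true
	}
	for _, id := range f.Users {
		if id == userID {
			return true
		}
	}
	for _, featureGroup := range f.Groups {
		for _, userGroup := range groups {
			if featureGroup == userGroup {
				return true
			}
		}
	}
	// Naive percentage rollout based on the user ID.
	return userID%100 < f.Percentage
}

func main() {
	beta := Feature{Groups: []string{"dev", "admin"}, Users: []uint32{1337, 42}}
	fmt.Println(beta.IsAccessibleTo(42, nil))             // true: whitelisted user
	fmt.Println(beta.IsAccessibleTo(7, []string{"dev"}))  // true: in the dev group
	fmt.Println(beta.IsAccessibleTo(7, []string{"user"})) // false
}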

What I’ve learned

I guess it’s a rather complete project because it involves a storage layer (a key-value store, with bolt), some logic around a simple model (what is a feature? How do we control access to a feature?) and an HTTP layer (with the default HTTP server and gorilla/mux). Moreover I’ve tried to write some tests, and it was really interesting to discover the “Go way” to do it!

Anyway, I’ve learned a lot and I’m fairly happy with the codebase, but if you spot anything that can be improved or that is wrong, please do get in touch with me (GitHub issues and tweets are perfect).

Here is the source code: github.com/AntoineAugusti/feature-flags.

Keeping my brain busy in my free time

For the last few months, I’ve sometimes tried to do in my free time what are often called “code katas”.

A code kata is an exercise in programming which helps a programmer hone their skills through practice and repetition.

Basically, you try to solve problems by writing algorithms. If you don’t know what to solve, there are a lot of platforms you can look at to find exercises:
  • CodinGame
  • Prologin (in French)
  • Rosalind (bioinformatics challenges)
  • Kaggle (machine learning competitions)

On most of these platforms, you can “play” alone. You are given a problem with detailed instructions, a dataset and an example of the expected output. Your goal is to solve this problem. You will know whether you have successfully solved it when you submit your solution (usually in the language of your choice), thanks to automated unit tests.

I like to solve unusual problems, and most of the time they are more difficult than what I work on on a daily basis (at least once you’ve done enough exercises). This is a great opportunity to discover advanced algorithms or to learn a new language. Because you do this in your free time, it’s nice to know that these problems can usually be solved in less than 1 hour, and also that you will not need to maintain your implementation in the future! When you’ve successfully solved a problem, you’re done for real 🙂

If you wanna check what I’ve already done, and maybe see how I’ve done it, I’ve got a public GitHub repository dedicated to this: github.com/AntoineAugusti/katas.

Learning English as a French guy

I have been really interested in the English language since I was 14 or 15 years old, when I was reading technical documentation, mostly about web technologies and the MaNGOS project.

It became critical to be able to understand and speak English during my Computer Science engineering studies, because it’s an important part of the job and also because a minimum score on the TOEIC exam is required to obtain the Engineering degree.

Here is a list of things I’ve tried to do over the last 5 years to improve my English level:

  • TV shows. An amazing way to do something enjoyable while learning. You will get used to hearing English and you will also discover everyday words and expressions you cannot learn at school. Begin with subtitles in French, move to English subtitles and then remove subtitles! If you wanna track which series you are watching and when new episodes are available, use Betaseries. Try to watch at least 1 or 2 hours of series every day, but stay focused while watching episodes 😉
  • Read in English about topics you care about. Like English Facebook pages, follow Twitter accounts and subscribe to RSS feeds about topics you care about and read about daily: your work, life tips, sports, music, whatever. Instead of reading it in your native language, the goal is to read it in English. Don’t just say “I am going to read 1 or 2 articles in English every day” / “I am going to spend 45 minutes every day reading articles in English”. No, you should see English articles coming your way. In fact, you should read more articles in English than in your native language if you want to progress.
  • Be known as an English-speaking person. If you’ve got a Twitter account, tweet in English. If your Facebook friends are open-minded, post in English on Facebook too. Write your résumé in English on LinkedIn and have your personal website available in English. You need to write in English often; it should become a habit!
  • Use electronic devices in English. Switch your smartphone, laptop, tablet, watch, fridge, online service accounts, social media accounts, etc. to English. First of all, you will discover new technical words and you will no longer suffer from poor translations into your native language!
  • Search on search engines in English. Trust me, you will find more content in English, and this is another part of your “I am going to write and read everything in English” training.
  • Travel or work in an English-speaking country. This one is obvious, but it is the ultimate thing you can do to improve your English: live in an English-speaking country for a few months. While living there, get out of your comfort zone and interact with people as much as you can. Embrace your differences and discover a new culture!
  • Train for the TOEIC exam. Okay, this test is definitely not perfect, but at least it says that you should be able to work in an English environment. The maximum score is 990; the closer to that, the better! A good score on this test immediately shows on your résumé that you have a decent English level. If you look online, you will find training exercises. Get used to the format of the test so that you will not be surprised when you take it for real, under a lot of stress. Be warned: it is difficult to stay really focused for 1h30, with no breaks.

Decorator pattern and repositories

My use case

Lately, I’ve been using the decorator pattern a lot with repositories on Teen Quotes. My use case is fairly simple: I use a relational database (MySQL), but sometimes I want to cache the results of some queries in a key-value store (Redis or Memcached for instance). With something like that, I don’t need to hit my database for queries that run all the time or are slow to run; I hit my key-value store instead. It reduces the pressure on my database and returns some results faster to the application.

The decorator pattern

If you’re not familiar with the decorator pattern yet, it’s quite simple to use and I’m sure you’ll love it in no time. Basically, the decorator pattern allows behavior to be added to an individual object, either statically or dynamically, without affecting the behavior of other objects from the same class. As Richard Bagshaw said, the idea is that you take an object, and you wrap this object in another object which provides additional functionality, and you keep wrapping extra classes repeatedly for each additional requirement.

If you want to see some real world examples, continue to read this blog post or go directly to Laracasts.

Some code

I’ll show you something I was working on last week: the ability to add tags to quotes. A “tag” is like a category for a “quote” (a post, an article, whatever you want to call it). I’m using Laravel with Eloquent for my relational database. I’ve created an interface called TagRepository; here is its database implementation.

namespace TeenQuotes\Tags\Repositories;

use TeenQuotes\Tags\Models\Tag;
use TeenQuotes\Quotes\Models\Quote;

class DbTagRepository implements TagRepository {

  /**
   * Create a new tag
   *
   * @param  string $name
   * @return \TeenQuotes\Tags\Models\Tag
   */
  public function create($name)
  {
    return Tag::create(compact('name'));
  }

  /**
   * Get a tag thanks to its name
   *
   * @param  string $name
   * @return \TeenQuotes\Tags\Models\Tag|null
   */
  public function getByName($name)
  {
    return Tag::whereName($name)->first();
  }

  /**
   * Add a tag to a quote
   *
   * @param  \TeenQuotes\Quotes\Models\Quote $q
   * @param  \TeenQuotes\Tags\Models\Tag $t
   */
  public function tagQuote(Quote $q, Tag $t)
  {
    $q->tags()->attach($t);
  }

  /**
   * Remove a tag from a quote
   *
   * @param  \TeenQuotes\Quotes\Models\Quote $q
   * @param  \TeenQuotes\Tags\Models\Tag $t
   */
  public function untagQuote(Quote $q, Tag $t)
  {
    $q->tags()->detach($t);
  }

  /**
   * Get a list of tags for a given quote
   *
   * @param  \TeenQuotes\Quotes\Models\Quote $q
   * @return array
   */
  public function tagsForQuote(Quote $q)
  {
    return $q->tags()->lists('name');
  }

  /**
   * Get the total number of quotes having a tag
   *
   * @param  \TeenQuotes\Tags\Models\Tag $t
   * @return int
   */
  public function totalQuotesForTag(Tag $t)
  {
    return $t->quotes()->count();
  }
}

Pretty simple stuff, I’m sure you’ve seen this multiple times. Let’s move on to the interesting part: the caching layer. We will create a new class CachingTagRepository implementing the same TagRepository interface. The key thing is that we require a TagRepository instance to be given in the constructor of this new class. Ultimately, we will pass the DB layer here.

namespace TeenQuotes\Tags\Repositories;

use Cache;
use TeenQuotes\Tags\Models\Tag;
use TeenQuotes\Quotes\Models\Quote;

class CachingTagRepository implements TagRepository {

  /**
   * @var \TeenQuotes\Tags\Repositories\TagRepository
   */
  private $tags;

  public function __construct(TagRepository $tags)
  {
    // The key thing is here: we assume we've already
    // a class that is implementing the interface.
    // We can rely on that!
    $this->tags = $tags;
  }

  /**
   * Create a new tag
   *
   * @param  string $name
   * @return \TeenQuotes\Tags\Models\Tag
   */
  public function create($name)
  {
    return $this->tags->create($name);
  }

  /**
   * Get a tag thanks to its name
   *
   * @param  string $name
   * @return \TeenQuotes\Tags\Models\Tag|null
   */
  public function getByName($name)
  {
    $callback = function() use ($name)
    {
      return $this->tags->getByName($name);
    };

    return Cache::rememberForever('tags.name-'.$name, $callback);
  }

  /**
   * Add a tag to a quote
   *
   * @param  \TeenQuotes\Quotes\Models\Quote $q
   * @param  \TeenQuotes\Tags\Models\Tag $t
   */
  public function tagQuote(Quote $q, Tag $t)
  {
    Cache::forget($this->cacheNameForListTags($q));

    $keyTotal = $this->cacheNameTotalQuotesForTag($t);

    if (Cache::has($keyTotal))
      Cache::increment($keyTotal);

    return $this->tags->tagQuote($q, $t);
  }

  /**
   * Remove a tag from a quote
   *
   * @param  \TeenQuotes\Quotes\Models\Quote $q
   * @param  \TeenQuotes\Tags\Models\Tag $t
   */
  public function untagQuote(Quote $q, Tag $t)
  {
    Cache::forget($this->cacheNameForListTags($q));

    $keyTotal = $this->cacheNameTotalQuotesForTag($t);

    if (Cache::has($keyTotal))
      Cache::decrement($keyTotal);

    return $this->tags->untagQuote($q, $t);
  }

  /**
   * Get a list of tags for a given quote
   *
   * @param  \TeenQuotes\Quotes\Models\Quote $q
   * @return array
   */
  public function tagsForQuote(Quote $q)
  {
    $key = $this->cacheNameForListTags($q);

    $callback = function() use($q)
    {
      return $this->tags->tagsForQuote($q);
    };

    return Cache::remember($key, 10, $callback);
  }

  /**
   * Get the total number of quotes having a tag
   *
   * @param  \TeenQuotes\Tags\Models\Tag $t
   * @return int
   */
  public function totalQuotesForTag(Tag $t)
  {
    $key = $this->cacheNameTotalQuotesForTag($t);

    $callback = function() use ($t)
    {
      return $this->tags->totalQuotesForTag($t);
    };

    return Cache::remember($key, 10, $callback);
  }

  /**
   * Get the key name when we list tags for a quote
   *
   * @param  \TeenQuotes\Quotes\Models\Quote $q
   * @return string
   */
  private function cacheNameForListTags(Quote $q)
  {
    return 'tags.quote-'.$q->id.'.list-name';
  }

  /**
   * Get the key name to have the number of quotes
   * having a tag
   *
   * @param  \TeenQuotes\Tags\Models\Tag $t
   * @return string
   */
  private function cacheNameTotalQuotesForTag(Tag $t)
  {
    return 'tags.tag-'.$t->name.'.total-quotes';
  }
}

You see, we do some things before (or after) calling the initial implementation, to add some functionality (here, a caching layer). Sometimes we directly defer to the initial implementation (see the create method).

Bonus: registering that in the IoC container

Let’s bind our TagRepository interface to the caching layer and the storage layer in a service provider!

namespace TeenQuotes\Tags;

use Illuminate\Support\ServiceProvider;
use TeenQuotes\Tags\Repositories\CachingTagRepository;
use TeenQuotes\Tags\Repositories\DbTagRepository;
use TeenQuotes\Tags\Repositories\TagRepository;

class TagsServiceProvider extends ServiceProvider {

  /**
   * Bootstrap the application events.
   *
   * @return void
   */
  public function boot()
  {
      //
  }

  /**
   * Register the service provider.
   *
   * @return void
   */
  public function register()
  {
      $this->registerBindings();
  }

  private function registerBindings()
  {
      $this->app->bind(TagRepository::class, function()
      {
          return new CachingTagRepository(new DbTagRepository);
      });
  }
}

Et voilà ! Happy coding!