Wednesday, 15 November, 2017 UTC


Summary

According to its 2017 State of the Octoverse report, GitHub now hosts more than 25 million active public repositories. Many of these projects are tools made by developers for developers, and being able to search through the masses and find exactly the right tool for the right problem has become a valuable skill of its own.
CIOs, engineering directors and project managers don’t rely on developers for tool choice only because of their technical expertise. They also understand that developer happiness equates to developer productivity, which, with engineering salaries and talent shortages at historic highs, translates directly into higher-quality products and a lower total cost of ownership for internal applications.
Developer experience, it turns out, is good for business.
Tools with great developer experience are marked by intuitive design, up-to-date documentation, clear error messages, rock-solid stability and access to friendly support. Underpinning all of that, the tool must solve the problem the developer needs it to.
Developers love fast test suites and code that’s easy to follow
Developer Experience in search
Both Algolia and Elasticsearch have risen in popularity because they provide a better developer experience than the search platforms of the past.
Though a critical feature of many applications, search remains a challenging problem for developers to solve. Moreover, building search as fast and relevant as Google and Amazon—companies that set the consumer’s expectation of what search should be—is a great deal harder than building the press-enter-and-wait search of the bygone Web 1.0 era.
To compete, engineering teams need to go faster than ever before, identifying and adopting tools that shorten the developer journey and enhance the developer experience at each stop along the way.
If developer journey is the “where”, developer experience is the “how smoothly”
 
In Part 1 of the Comparing Algolia and Elasticsearch for Consumer-Grade Search series—End-to-end Latency—we saw that search performance requirements are more demanding than for other types of applications. In Part 2—Relevance Isn’t Luck—we dove deep and looked at how search engines accomplish the task of serving results that accurately match user intent, even when intent is scarce, flawed or unexpected.
Here in Part 3, we’ll look at the high-level journey that developers take to build search, using an API like Algolia or an open source tool like Elasticsearch. We’ll also focus on one of the parts of the journey that’s important to consider early on: the crafting of the user experience.
Five aspects of a “SUPIR” search
No matter what search technology a developer is using, there are five aspects that the developer must consider on the way from design to production. These are:
  • Scalability
  • User Experience
  • Performance
  • Indexing
  • Relevance
Remember them easily as a typo-tolerant acronym – SUPIR.
Ordering the steps this way makes a convenient acronym, but it is generally not the order that the developer proceeds in. Nor is the order linear. For example, indexing data is required before testing out any relevance or user experience, and is therefore a key step during an early prototyping phase. But indexing will also be revisited later, when additional signals become available for relevance or when scaling puts stricter demands on record size.

What makes search DX different

The journey to building a great search is not linear. Just as the world that your users live in changes frequently, so must their search. Like performance tuning, relevance tuning is never “done”; it can only be “done for now”.
This is the first differentiator between building search and building other types of applications. Developers don’t visit each SUPIR aspect just once but on a progressive and recurring basis, roughing out and sharpening each one during design and build, and later maintaining and improving them as an ensemble in production.
The second differentiator between search and other relational or NoSQL database-backed applications is the tighter coupling required between each layer of the stack. In traditional CRUD-based apps, the implementation details of the database’s storage engine have little bearing on the types of user experiences that can be built on top of it. This is not true in search: the feasibility of a particular user experience can come down to specific implementation details, like whether or not the search engine uses a trie (prefix tree) under the hood.
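To make that coupling concrete, here is a toy JavaScript sketch (purely illustrative, not how either engine is implemented internally): with a trie, every completion of a prefix can be retrieved in a single walk down the structure, which is exactly the kind of lookup a search-as-you-type experience needs on every keystroke.

```js
// Toy prefix index: each node remembers every indexed word that passes
// through it, so all completions of a prefix are found in a single walk.
class TrieNode {
  constructor() {
    this.children = new Map();
    this.words = [];
  }
}

class Trie {
  constructor() {
    this.root = new TrieNode();
  }

  insert(word) {
    let node = this.root;
    for (const ch of word) {
      if (!node.children.has(ch)) node.children.set(ch, new TrieNode());
      node = node.children.get(ch);
      node.words.push(word);
    }
  }

  // Every indexed word starting with `prefix`, without scanning the whole index.
  startsWith(prefix) {
    let node = this.root;
    for (const ch of prefix) {
      node = node.children.get(ch);
      if (!node) return [];
    }
    return node.words;
  }
}

const trie = new Trie();
['phone', 'phone case', 'photo frame'].forEach((w) => trie.insert(w));
console.log(trie.startsWith('pho')); // [ 'phone', 'phone case', 'photo frame' ]
```

An engine without a prefix-friendly structure like this can still answer the same query, but each keystroke may cost far more work, and that cost surfaces directly in the user experience.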
The iterative nature and tight level of coupling in search can be the source of numerous headaches for developers. Improving one SUPIR aspect can break several others, a frustrating event that feels more like retracing steps than moving forward. Some examples:
  • A relevance improvement for one query makes all queries slower
  • A relevance improvement for one query worsens relevance for others
  • At scale, heavy indexing loads begin to degrade search performance
  • Degraded search performance requires implementing UX workarounds
As we head to the next sections and start to address the individual aspects of SUPIR, we’ll take care to point out what developers using Algolia or Elasticsearch need to do to avoid these pitfalls. First up, we’ll look at the “U” in SUPIR—user experience.
Cryptic error messages cause developers distress
 
User Experience
In search, it’s important to begin with the end in mind. What should happen when the user starts typing in that shiny new search box? The user interface (UI) and user experience (UX) will heavily affect the design of the other four SUPIR aspects, so it’s important to sketch it out early.

Design and best practices

There are many different types of search UX, ranging from a simple keyword-based autosuggest up to a full-page search-as-you-type experience with several ways to facet and filter. Today’s search experiences also often include powerful browsing and navigation capabilities.
To be on par with what we expect from Google or Amazon, the search should feel like it’s a step ahead of us and practically guessing what we want. It should also provide instant feedback so we can correct our mistakes or refine our search, and it should work wherever we happen to be computing from. A satisfying search UX will:
  • Update results immediately as the user types, putting them in control of the experience
  • Handle spelling mistakes, omissions and concatenations
  • Make an obvious connection between search terms and results
  • Work on all web and mobile platforms
In many cases, these additional search UX patterns will be used to enhance the experience:
  • Hierarchical faceting and filtering
  • Browsing with or without keywords, including pagination
  • Searching multiple different data types simultaneously
 
A powerful product search with instant results
 
Once the initial requirements and design for the UI/UX have been decided on, the next step for the developer is to start building it. Ideally, they won’t have to start from scratch.

Frontend API clients

Both Algolia and Elasticsearch have a REST API that makes it possible to interact with them over HTTP, and therefore directly from web browsers and mobile devices. API clients contain the logic to send requests over HTTP, parse responses, and handle network and security concerns. They’re an essential building block for user interfaces and for the user interface libraries built on top of them. They’re also one of the first places a developer looks for guidance before they start coding.
Algolia and Elasticsearch both have a JavaScript API client that works in the browser and in Node.js. Algolia also has official native API clients for iOS and Android.
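As a rough illustration, a basic query with each JavaScript client looks something like the sketch below (written against the 3.x algoliasearch client and the legacy elasticsearch Node.js package; the app ID, API keys, index name and field name are placeholders, and exact call signatures vary by client version).

```js
// Algolia: initialize the client with app credentials, then search an index.
const algoliasearch = require('algoliasearch');
const algoliaClient = algoliasearch('YOUR_APP_ID', 'YOUR_SEARCH_ONLY_API_KEY');
const index = algoliaClient.initIndex('products');

index.search('phone').then((result) => {
  console.log(result.hits);
});

// Elasticsearch: the Node.js client, pointed at a cluster URL,
// sends a query-DSL body to the search endpoint.
const elasticsearch = require('elasticsearch');
const esClient = new elasticsearch.Client({ host: 'localhost:9200' });

esClient
  .search({
    index: 'products',
    body: { query: { match: { name: 'phone' } } },
  })
  .then((resp) => {
    console.log(resp.hits.hits);
  });
```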

API clients, security and performance

API clients must also implement security, including authentication and authorization logic. In search, this goes beyond just passing auth credentials. Just because a user can search an index doesn’t necessarily mean they’re allowed to retrieve all records, or see all of the fields of each record.
When using Elasticsearch, this level of fine-grained security is often implemented by an application server rather than by exposing Elasticsearch to frontend clients directly. Think of the parallel with your primary application database, such as PostgreSQL or MongoDB: these databases are accessed directly by your servers, not by your users’ clients. The advantage of this approach is flexibility, since you can implement any authorization scheme you like; the disadvantage is having to build and maintain it, which adds an extra step to the developer journey. A second disadvantage is performance: every search query must pass through an application server that sits in front of your Elasticsearch instance, and that server will likely make one or more calls to your relational database to authorize the HTTP request. Many precious milliseconds are consumed therein.
When Elasticsearch is accessed this way, the developer does not use a frontend Elasticsearch API client at all, but connects to an existing backend which proxies the request to an Elasticsearch instance.
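A minimal sketch of that proxy pattern might look like the following; the Express route, index name, field names and per-user filter are all hypothetical, and a real implementation would also handle authentication, input validation and error reporting properly.

```js
const express = require('express');
const elasticsearch = require('elasticsearch');

const app = express();
const esClient = new elasticsearch.Client({ host: 'localhost:9200' });

app.get('/api/search', async (req, res) => {
  // Assume auth middleware has already attached the user to the request.
  const userGroup = req.user ? req.user.group : 'public';

  try {
    const resp = await esClient.search({
      index: 'products',
      body: {
        query: {
          bool: {
            must: { match: { name: req.query.q || '' } },
            // Authorization is enforced server-side: only records this
            // user's group is allowed to see are returned.
            filter: { term: { visible_to: userGroup } },
          },
        },
      },
    });
    res.json(resp.hits.hits);
  } catch (err) {
    res.status(500).json({ error: 'search failed' });
  }
});

app.listen(3000);
```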
Algolia is designed to be called directly from the client, with authentication and authorization handled by passing API keys. Algolia secured API keys can encode much more than just an authorizing token, including index and search parameter restrictions, IPv4 source restrictions, rate limits and expiry information. This gives developers flexibility when it comes to security, without slowing them down.
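For example, a backend could mint a secured key along these lines (the filter value, index name and one-hour lifetime are placeholders); the frontend then uses the resulting key exactly like an ordinary search-only key.

```js
const algoliasearch = require('algoliasearch');
const client = algoliasearch('YOUR_APP_ID', 'YOUR_API_KEY');

// Derive a secured key from a search-only parent key plus restrictions.
const securedApiKey = client.generateSecuredApiKey('YOUR_SEARCH_ONLY_API_KEY', {
  filters: 'visible_to:group42',                    // only records this user may see
  restrictIndices: 'products',                      // only this index
  validUntil: Math.floor(Date.now() / 1000) + 3600, // expires in one hour
});

// In the browser, the secured key is used like any other search key:
// const frontendClient = algoliasearch('YOUR_APP_ID', securedApiKey);
```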
The direct connection between the frontend API client and the Algolia host is also fast. The entire search request is fulfilled by an Algolia module running inside NGINX, with no expensive application servers or database calls required. Lastly, the connection is reliable. API clients are aware of all three Algolia hosts in a cluster, as well as any global DSN replicas, and can work around any downed hosts. Additionally, Algolia API clients can fall back to a second DNS provider in order to work around DDoS attacks, like the 30-50+ Gbps attack on NS1 in May 2016.
So while API clients may seem simple at first glance, the details can have a big impact on both performance and reliability — and therefore the quality of the user experience. If developers don’t have to reinvent the wheel just to pass data back and forth between clients and servers, their experience (and productivity) will be vastly improved.

Reusable UI building blocks

Though the API client is fundamental, it’s still only a small piece of the overall frontend story. User-initiated keystrokes and clicks must be handled, results must be displayed and made selectable. That could be via an autocomplete menu or a more advanced page with controls for faceting, filtering and pagination. In either case, this can add up to a lot of custom HTML, JavaScript and CSS.
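To give a sense of that plumbing, here is a rough sketch of the hand-rolled approach; the element IDs, the 100 ms debounce and the searchBackend function are placeholders for whatever client or endpoint actually performs the query.

```js
const input = document.querySelector('#search-box');
const resultsEl = document.querySelector('#hits');
let debounceTimer;

input.addEventListener('input', () => {
  clearTimeout(debounceTimer);
  // Debounce keystrokes slightly, then query and re-render the hit list.
  debounceTimer = setTimeout(async () => {
    const hits = await searchBackend(input.value); // placeholder search call
    resultsEl.innerHTML = hits.map((hit) => `<li>${hit.name}</li>`).join('');
  }, 100);
});
```

Multiply this by faceting, filtering, pagination and highlighting, and the amount of custom frontend code grows quickly.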
In the Elasticsearch ecosystem, community developers have created projects like SearchKit and ElasticUI to help create these interfaces.
In the Algolia ecosystem, an officially-supported family of libraries called InstantSearch was created to encapsulate the various consumer-grade UI/UX search best practices and make them easy to implement. Each InstantSearch library is composed of a set of UI widgets that can be added one by one to a search page. Here are examples of common widgets:
  • Search box
  • Hits (for displaying results)
  • Sort by selector
  • Range slider
  • Refinement list (for faceting)
The framework-agnostic JavaScript InstantSearch library, simply called InstantSearch.js, contains a total of 18 ready-made widgets, and it also exposes factories for creating others. Each widget is connected to the search state and re-renders automatically whenever that state changes. Other members of the InstantSearch family work similarly.
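As an illustration, wiring up a small InstantSearch.js page looks roughly like the sketch below; the container IDs, index name and attribute names are placeholders, and widget option names differ somewhat between InstantSearch.js versions (this follows the 2.x style).

```js
const search = instantsearch({
  appId: 'YOUR_APP_ID',
  apiKey: 'YOUR_SEARCH_ONLY_API_KEY',
  indexName: 'products',
});

// Each widget binds to a container element and re-renders on every state change.
search.addWidget(instantsearch.widgets.searchBox({ container: '#search-box' }));
search.addWidget(
  instantsearch.widgets.hits({
    container: '#hits',
    templates: { item: '<p>{{name}}</p>' },
  })
);
search.addWidget(
  instantsearch.widgets.refinementList({
    container: '#brand-facet',
    attributeName: 'brand',
  })
);

search.start();
```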
Full-featured API clients and component libraries provide a good DX for building the UX, the “U” in our SUPIR search. They also reduce maintenance and total cost of ownership by cutting down on custom code. Component libraries in particular mean that the engineer building the UI doesn’t need to be a JavaScript expert; in most cases a basic working knowledge of JavaScript will do.
Conclusion
In a consumer-grade search implementation, the user experience is critical. Consumers are picky and, if unsatisfied by a site or mobile app’s search, are likely to abandon it in search of an alternative.
Because we rely on developers to build the amazing search user experience that we desire, we have to account for their experience as well. A tired, frustrated development team can’t compete with a team that is happy and productive. A developer mired in the details of applying security patches or debugging deeply-nested JavaScript promises may not have the time—or the energy—to make their product’s ambitious design spec a reality.
With an open source and more open-ended tool like Elasticsearch, the degree to which a developer gets mired or inspired is largely a function of their search-building experience and expertise. With Algolia, many of the most complex moving parts of the search-building journey—including infrastructure, performance and scalability—are handled automatically underneath the API, freeing up the developer to focus on building a great user experience and ultimately a great product.
In Part 3 of this series, we explored the developer journey and developer experience of building search. We focused on user experience first because it impacts every other aspect of the search. In future parts of this series, we will continue to explore the search developer experience through the lens of the other four SUPIR aspects—indexing, relevance, performance, and scalability.