Authoring content in one place and publishing it to multiple locations offers many benefits. Reducing the editorial effort needed to keep content synchronized across multiple locations is often the primary benefit. However, ancillary benefits, such as a consistent authoring interface, a canonical cross-brand asset library, and increased monetization of existing content, can be just as important.
We always get excited when we can partner with our clients to build a new publishing experience or improve an existing one, as we love to use technology to solve problems. However, that excitement can soon be tempered by the reality of potential challenges. Through our experience building syndication systems, we have solved many of those challenges and developed tactical approaches and strategic principles that guide us through the issues that must be considered along the way.
Should we even be doing this and is it really worth it? This should always be the first question. Sometimes the shiny new functionality might not actually be the best idea when all factors are considered. Some questions to ask while making this decision:
- How much content needs to be shared?
- How many places will it be shared?
- How often will the content change?
- How quickly do edits need to be synchronized?
- How will associated media such as images and video be shared?
- What are the cost and time considerations of manually keeping the content in sync?
- What are the costs to build and maintain an automated system and how much complexity does it add for the editorial team and developers?
- Is there an existing solution or will a custom approach be needed?
Depending on the complexity of the system, understanding what content is available and where it has been published is important, but it can quickly become difficult to manage. Planning for this up front is essential to the long-term success of your system, ensuring a great user experience that scales beyond the proof of concept.
Providing contextual data on content edit forms that informs editors where a given piece of content has been syndicated and where the current updates will appear is a great start. Filterable content listing admin pages with built-in batch functionality will also be valuable, allowing editors to easily understand and change where content is published at scale.
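One way to back that contextual data is a per-content record of syndication targets and the revision each one last received. The sketch below is a minimal illustration of that idea; the class and field names are hypothetical, not an existing API.

```python
from dataclasses import dataclass, field

@dataclass
class SyndicationStatus:
    """Hypothetical record of where one piece of content has been published."""
    content_uuid: str
    canonical_site: str
    # site domain -> last revision that site received
    targets: dict = field(default_factory=dict)

    def record_sync(self, site: str, revision: int) -> None:
        """Note that `revision` of this content is now live on `site`."""
        self.targets[site] = revision

    def stale_sites(self, current_revision: int) -> list:
        """Sites whose copy lags behind the latest revision."""
        return [s for s, rev in self.targets.items() if rev < current_revision]

status = SyndicationStatus("a1b2-c3d4", canonical_site="brand-a.example")
status.record_sync("brand-b.example", revision=3)
status.record_sync("brand-c.example", revision=5)
print(status.stale_sites(current_revision=5))  # ['brand-b.example']
```

A record like this can drive both the edit-form messaging ("this update will appear on two sites") and the filterable listing pages described above.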
Understanding the expectations around the syndication process is important for selecting the technology used and ensuring the system meets business expectations. If content needs to be live on consuming sites within 2 seconds, that is a much different requirement than 30 seconds, or even 5 minutes. Every layer between the end user and the syndication platform must be evaluated holistically, with the caching behavior at each layer taken into account.
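That holistic evaluation can be reduced to a simple worst-case latency budget: sum the delay each layer can introduce and compare it to the target. The layer names and numbers below are purely illustrative assumptions.

```python
# Illustrative worst-case budget for an edit reaching an end user.
# Every value here is an assumption for the sake of the example.
layers = {
    "webhook delivery": 1,   # seconds, syndication platform -> consuming site
    "consumer ingest": 2,    # saving and indexing on the consuming site
    "page cache TTL": 30,    # worst case before the cached page expires
    "CDN TTL": 60,           # worst case before the edge cache expires
}

worst_case = sum(layers.values())
sla = 30  # hypothetical business requirement, in seconds
print(f"worst case: {worst_case}s, SLA: {sla}s, met: {worst_case <= sla}")
```

In this sketch the cache TTLs dominate, which is the usual outcome: meeting a tight synchronization target is less about transport speed and more about cache invalidation strategy.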
Google isn’t a big fan of duplicate content, yet that is precisely what syndicating content seeks to do. Understanding this risk and mitigating it with a canonical URL and other measures is vitally important and can’t be an afterthought.
The creation and management of URLs is one of the primary challenges of sharing content and should be carefully considered from the start of a project. Some questions to consider:
- Where are URL aliases generated?
- What data is used to generate URL aliases?
- Will the alias remain identical on all sites?
- If the alias is different, do other sites need to know the alias for cross-site links?
- Will there be cross-site search and what information will it need to build links?
- How will sites determine which site a piece of content canonically lives on, and therefore its canonical domain?
- Where will redirects be created and managed?
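One possible answer to several of the questions above is to generate the alias once on the syndication platform, share the identical alias with every site, and build the canonical URL from the designated canonical site's domain. The helpers below sketch that approach; the function names and URL pattern are assumptions, not a prescribed implementation.

```python
import re

def generate_alias(title: str) -> str:
    """Slugify the title so every site can use the identical alias."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"/articles/{slug}"

def canonical_url(alias: str, canonical_domain: str) -> str:
    """Full URL for the <link rel="canonical"> tag on every syndicated copy."""
    return f"https://{canonical_domain}{alias}"

alias = generate_alias("Sharing Is Caring!")
print(alias)                                    # /articles/sharing-is-caring
print(canonical_url(alias, "brand-a.example"))
```

Keeping alias generation in one place means cross-site links and cross-site search only need to know an item's canonical domain, not re-derive its path.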
This is where things really get fun. There are many challenges and also many opportunities for efficiency, as reducing the editorial workload is likely what started this whole adventure in the first place. Some questions to consider through the process:
- What level of control will be needed over the content on the consuming sites?
- Will additional edits/updates need to be syndicated?
- Are there any per-site customizations? Do those take precedence over updates?
- Where will curated lists of featured content be created and managed? Will that be syndicated as well?
- Do editors need previews of the content on one or many of the client sites from the editing interface within the content syndication platform?
- How will marketing pages built with Gutenberg or Layout Builder be shared and how will designs translate across sites?
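For the per-site customization question above, one common policy is that a consuming site may override specific fields, and those overrides survive subsequent syndicated updates. The merge below is a minimal sketch of that policy with hypothetical field names.

```python
def merge_update(synced: dict, local_overrides: dict) -> dict:
    """Apply a syndicated update while preserving locally overridden fields."""
    merged = dict(synced)
    merged.update(local_overrides)  # local overrides take precedence
    return merged

update = {"title": "New Title", "summary": "Updated summary"}
overrides = {"title": "Brand-B Custom Title"}
print(merge_update(update, overrides))
# {'title': 'Brand-B Custom Title', 'summary': 'Updated summary'}
```

The inverse policy, where updates clobber local edits, is the same one-liner with the arguments swapped; the important thing is deciding the precedence rule explicitly rather than discovering it in production.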
There are myriad technical considerations and approaches depending on the requirements and the technology that is used. One of the first considerations should be to keep things generic and avoid tight couplings between the data models in the syndication platform and the consuming sites. Some field values, such as taxonomy terms and author information, can be mapped to identical fields on the client sites, as that data will likely be needed when querying for lists. However, all other data can often be stored in a JSON field that is decoded and made available to templates.
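A payload following that pattern might look like the sketch below: the handful of fields that lists need to query are mapped to real fields on the consuming site, while everything else travels in a single opaque JSON blob that templates decode at render time. All field names here are assumptions for illustration.

```python
import json

# Hypothetical syndication payload: queryable fields plus an opaque blob.
payload = {
    "title": "Sharing Is Caring",
    "terms": ["syndication", "cms"],   # mapped to real taxonomy fields
    "author": "jane-doe",              # mapped to a real author reference
    "body_json": json.dumps({          # everything else, stored generically
        "body": "<p>Article body</p>",
        "hero_image": "https://cdn.example/hero.jpg",
        "pull_quote": "Only share content if you can care about the details.",
    }),
}

# On the consuming site, decode the blob and hand it straight to the template.
template_vars = json.loads(payload["body_json"])
print(sorted(template_vars))  # ['body', 'hero_image', 'pull_quote']
```

The upside is that adding a field to the blob requires no schema change on any consuming site; the tradeoff is that blob fields cannot be queried or filtered without first being promoted to real fields.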
This all begins to border on a decoupled approach, and it helps set a project up for success if the front end goes fully decoupled later. Or perhaps now is the time to consider going decoupled, while a foundational evaluation of editorial workflows is already in progress.
Sharing is Caring
They say sharing is caring, but only share content if you can take the time to care about all of the details along the way. With thoughtful consideration from the very beginning, from challenging the need for the functionality all the way down to the technical details of setting canonical meta tags, analyzing each step of the process ensures a successful outcome. There is no easy answer, but hopefully this helps you avoid many of the pitfalls. If you have a project with content syndication coming up or have experience with it, drop us a line on Twitter @ChromaticHQ; we would love to hear your thoughts.