
Pulp's Blog

Using Pulp at Microsoft: Coming Full Circle

Hi, I’m David. If you've been using Pulp for long enough, there's a good chance you've encountered a bug I contributed. I worked on Pulp for many years while at Red Hat, from the beginning of Pulp 3, back when it was just a proof of concept. Those early days were full of whiteboarding and discussions around which technologies to use (e.g. futures or asyncio?), whether to create a CLI or Web UI, and how to integrate Pulp 3 with products like Satellite and Ansible Galaxy.

Fast-forward to today: I now work at Microsoft on the Azure Core Linux team. And in a really interesting bit of serendipity, we've spent the past few years using Pulp to overhaul packages.microsoft.com, Microsoft's service for Linux package repositories. I've gone from developing Pulp full-time to running it as a user.

Writing Docs with AI - An Experiment in Workflow

Writing documentation can often be a significant effort. Recently, I needed to document a new feature, the "Alternate Content Source". But where to start? The core information existed in a user conversation on Matrix. How could I turn that scattered information into structured, usable documentation efficiently? I decided to experiment with an AI tool, specifically NotebookLM, to see if it could streamline the process.

The starting point was straightforward. I took the raw Matrix conversation and uploaded it as a source into NotebookLM. I also added the existing pulpproject.org website as another source, hoping it would provide context and stylistic examples. My initial prompt was simple: "write me markdown docs from this user conversation". It produced the content for this pull request.

Checkpoint Support - A Journey Towards Predictable and Consistent Deployments

In the ever-evolving landscape of software development, ensuring predictability and consistency in deployments has always been a challenge. Our team faced these hurdles first-hand, especially when managing historical versions of repositories. We needed a solution that could help us recreate environments from specific points in time, ensure reproducible deployments, and track changes in package behavior over time.

Inspired by the success stories of Ubuntu Snapshots on Azure and the increased security and resiliency of Canonical workloads on Azure, we embarked on a journey to develop a feature that could address these challenges.

Whenever, Whatever-based Release Cycle

"Release fast, release often!" is not a metaphor, it is a mantra.

Pulp used to be a service that mainly ran on-premises as part of products with extremely long release cycles. About two years ago, we heard rumors about adding highly volatile, cloud-service-style installations to the mix. Because releasing a new version of pulpcore or any of its many plugins was a herculean task, the rumors were accompanied by very concerning sentences like: "We will run this service off of the main branches of all the plugins." If that does not make you shiver, you can just stop reading here and go on with your happy life.

Pulp UI Beta out now in Pulp images

Pulp-UI (beta) is now available in the latest pulp/pulp image. This was a huge effort over the last two months to get the beta into the hands of our users. Big thanks to Martin Hradil, Zita Nemeckova and everyone who helped us get here. Check out the demo below:

RPM Content Service Performance and Scale Testing

Goals

  1. Help pulp_rpm users plan the number of pulp-content apps needed to serve a given traffic rate for their object-storage-backed installation (a rough planning sketch follows the lists below).

  2. Determine the architectural scalability limit for pulp_rpm content serving.

To achieve these goals in this blog post, I'll be doing two things:

  • Characterize how the sustained request rate (requests/sec) that can be served without an increase in latency scales with the number of pulp-content apps.

  • Identify the rate-limiting components in various test scenarios.
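
To make the first goal concrete, here is a minimal capacity-planning sketch. It is not taken from the blog's measurements; the per-app rate, headroom, and target numbers are hypothetical placeholders you would replace with results from your own tests.

```python
import math

def content_apps_needed(target_rps: float, per_app_rps: float, headroom: float = 0.2) -> int:
    """Estimate how many pulp-content apps are needed to serve `target_rps`
    requests/sec, keeping `headroom` (a fraction) of spare capacity per app.

    `per_app_rps` is the sustained rate a single pulp-content app served in
    your own tests without a latency increase; the numbers below are made up.
    """
    usable_rps = per_app_rps * (1.0 - headroom)
    return math.ceil(target_rps / usable_rps)

# Hypothetical example: plan for 2000 req/s when one app sustained ~600 req/s.
print(content_apps_needed(target_rps=2000, per_app_rps=600))  # -> 5
```

The point is only the shape of the planning exercise; the per-app numbers to plug in are exactly what the testing described in this post is meant to produce.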