To help you navigate through this guide, it is divided into several large parts. Each part addresses a particular broad topic concerning GStreamer plugin development. The parts of this guide are laid out in the following order:
Building a Plugin - Introduction to the structure of a plugin, using an example audio filter for illustration.
This part covers all the basic steps you generally need to perform to build a plugin, such as registering the element with GStreamer and setting up the basics so it can receive data from and send data to neighbour elements. The discussion begins by giving examples of generating the basic structures and registering an element in Constructing the Boilerplate. Then, you will learn how to write the code to get a basic filter plugin working in Chapter 4, Specifying the pads, Chapter 5, The chain function and Chapter 8, What are states?.
After that, we will cover the GObject concepts used to make an element configurable for applications and to enable application-element interaction, in Adding Properties and Chapter 10, Signals. Next, you will learn to build a quick test application to test all that you've just learned in Chapter 11, Building a Test Application. We will only touch upon the basics here; for full-blown application development, you should look at the Application Development Manual.
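To give you a first taste of what that boilerplate looks like, here is a rough sketch of how a plugin might register a single element. It is not the guide's actual example code: the factory name "myfilter", the GST_TYPE_MY_FILTER macro and the plugin metadata are placeholders that a real plugin would define itself.

```c
#include <gst/gst.h>

/* GST_TYPE_MY_FILTER is assumed to be declared in the element's own
 * header; it stands in for whatever GObject type your element uses. */

static gboolean
plugin_init (GstPlugin * plugin)
{
  /* Make the element known to GStreamer under the factory name "myfilter". */
  return gst_element_register (plugin, "myfilter",
      GST_RANK_NONE, GST_TYPE_MY_FILTER);
}

/* Export the plugin descriptor so the GStreamer registry can pick it up. */
GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR,
    myfilter, "An example audio filter plugin",
    plugin_init, "1.0", "LGPL", "myfilter", "http://example.org/")
```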
Advanced Filter Concepts - Information on advanced features of GStreamer plugin development.
After learning about the basic steps, you should be able to create a functional audio or video filter plugin with some nice features. However, GStreamer offers more for plugin writers. This part of the guide includes chapters on more advanced topics, such as scheduling, media type definitions in GStreamer, clocks, interfaces and tagging. Since these features are purpose-specific, you can read the chapters in any order; most of them don't require knowledge from other sections.
The first chapter, named Different scheduling modes, will explain some of the basics of element scheduling. It is not very in-depth, but serves mostly as an introduction to why other things work the way they do. Read this chapter if you're interested in GStreamer internals. Next, we will apply this knowledge and discuss a type of data transmission other than the one you learned about in Chapter 5, The chain function: Different scheduling modes. Loop-based elements give you more control over the input rate, which is useful when writing, for example, muxers or demuxers.
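To illustrate the difference in a nutshell, the sketch below shows roughly what a chain-based element's data handling looks like: upstream pushes buffers into the element, one call per buffer. A loop-based element, by contrast, runs its own task and pulls the data it wants. The GstMyFilter type here is a made-up placeholder, not code from the guide.

```c
#include <gst/gst.h>

/* Placeholder element type; a real element defines this properly
 * (see "Constructing the Boilerplate"). */
typedef struct {
  GstElement element;
  GstPad *sinkpad, *srcpad;
} GstMyFilter;

#define GST_MY_FILTER(obj) ((GstMyFilter *) (obj))

/* Push-mode (chain-based) data flow: upstream calls this function for
 * every buffer it pushes into our sink pad. */
static GstFlowReturn
gst_my_filter_chain (GstPad * pad, GstObject * parent, GstBuffer * buf)
{
  GstMyFilter *filter = GST_MY_FILTER (parent);

  /* ... inspect or modify the buffer here ... */

  /* Hand the buffer on to the element downstream of our source pad. */
  return gst_pad_push (filter->srcpad, buf);
}
```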
Next, we will discuss media identification in GStreamer in Chapter 16, Types and Properties. You will learn how to define new media types and get to know a list of standard media types defined in GStreamer.
In the next chapter, you will learn about request pads and sometimes pads, which are pads created dynamically, either because the application asked for one (request) or because the media stream requires it (sometimes). This is covered in Chapter 12, Request and Sometimes pads.
The next chapter, Chapter 18, Clocking, will explain the concept of clocks in GStreamer. You need this information when you want to know how elements should achieve audio/video synchronization.
The next few chapters will discuss advanced ways of doing application-element interaction. Previously, we learned about the GObject ways of doing this in Adding Properties and Chapter 10, Signals. We will discuss dynamic parameters, which are a way of defining element behaviour over time in advance, in Chapter 20, Supporting Dynamic Parameters. Next, you will learn about interfaces in Chapter 21, Interfaces. Interfaces are very target-specific ways of application-element interaction, based on GObject's GInterface. Lastly, you will learn how metadata is handled in GStreamer in Chapter 22, Tagging (Metadata and Streaminfo).
The last chapter, Chapter 17, Events: Seeking, Navigation and More, will discuss the concept of events in GStreamer. On the one hand, events are another way of doing application-element interaction; they take care of seeking, for example. On the other hand, they are also a way in which elements interact with each other, such as letting each other know about media stream discontinuities, forwarding tags inside a pipeline and so on.
Creating special element types - Explanation of writing other plugin types.
Because the first two parts of the guide use an audio filter as an example, the concepts introduced there apply to filter plugins. But many of the concepts apply equally to other plugin types, including sources, sinks, and autopluggers. This part of the guide presents the issues that arise when working on these more specialized plugin types. It starts with a special focus on elements that can be written using a base class (Pre-made base classes), and later also goes into writing special types of elements in Writing a Demuxer or Parser, Writing a N-to-1 Element or Muxer and Writing a Manager.
Appendices - Further information for plugin developers.
The appendices contain some information that stubbornly refuses to fit cleanly in other sections of the guide. Most of this section is not yet finished; topics that remain to be covered include:

Creating compound and complex elements by extending from a GstBin. This will allow you to create plugins that have other plugins embedded in them.

Adding new media types to the registry along with typedetect functions. This will allow your plugin to operate on a completely new media type.
The remainder of this introductory part of the guide presents a short overview of the basic concepts involved in GStreamer plugin development. Topics covered include Elements and Plugins, Pads, Data, Buffers and Events, and Types and Properties. If you are already familiar with this information, you can use this short overview to refresh your memory, or you can skip to Building a Plugin.
As you can see, there is a lot to learn, so let's get started!