Merging gst-editing-services

Thibault Saunier 2021-09-24 16:15:25 -03:00
commit 6bb75890d9
327 changed files with 140768 additions and 0 deletions

.arcconfig Normal file

@@ -0,0 +1,6 @@
{
"phabricator.uri" : "https://phabricator.freedesktop.org/",
"repository.callsign" : "GES",
"project": "GStreamer Editing Services",
"default-reviewers": "thiblahute,Mathieu_Du"
}

.gitignore vendored Normal file

@@ -0,0 +1,9 @@
/build/
/b/
/_build/
*~
core.*
core
log

.gitlab-ci.yml Normal file

@@ -0,0 +1 @@
include: "https://gitlab.freedesktop.org/gstreamer/gst-ci/raw/master/gitlab/ci_template.yml"

AUTHORS Normal file

@@ -0,0 +1,2 @@
Edward Hervey <edward.hervey@collabora.co.uk>
Brandon Lewis <brandon.lewis@collabora.co.uk>

COPYING Normal file

@@ -0,0 +1,481 @@
GNU LIBRARY GENERAL PUBLIC LICENSE
Version 2, June 1991
Copyright (C) 1991 Free Software Foundation, Inc.
51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
[This is the first released version of the library GPL. It is
numbered 2 because it goes with version 2 of the ordinary GPL.]
Preamble
The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
Licenses are intended to guarantee your freedom to share and change
free software--to make sure the software is free for all its users.
This license, the Library General Public License, applies to some
specially designated Free Software Foundation software, and to any
other libraries whose authors decide to use it. You can use it for
your libraries, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.
To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if
you distribute copies of the library, or if you modify it.
For example, if you distribute copies of the library, whether gratis
or for a fee, you must give the recipients all the rights that we gave
you. You must make sure that they, too, receive or can get the source
code. If you link a program with the library, you must provide
complete object files to the recipients so that they can relink them
with the library, after making changes to the library and recompiling
it. And you must show them these terms so they know their rights.
Our method of protecting your rights has two steps: (1) copyright
the library, and (2) offer you this license which gives you legal
permission to copy, distribute and/or modify the library.
Also, for each distributor's protection, we want to make certain
that everyone understands that there is no warranty for this free
library. If the library is modified by someone else and passed on, we
want its recipients to know that what they have is not the original
version, so that any problems introduced by others will not reflect on
the original authors' reputations.
Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that companies distributing free
software will individually obtain patent licenses, thus in effect
transforming the program into proprietary software. To prevent this,
we have made it clear that any patent must be licensed for everyone's
free use or not licensed at all.
Most GNU software, including some libraries, is covered by the ordinary
GNU General Public License, which was designed for utility programs. This
license, the GNU Library General Public License, applies to certain
designated libraries. This license is quite different from the ordinary
one; be sure to read it in full, and don't assume that anything in it is
the same as in the ordinary license.
The reason we have a separate public license for some libraries is that
they blur the distinction we usually make between modifying or adding to a
program and simply using it. Linking a program with a library, without
changing the library, is in some sense simply using the library, and is
analogous to running a utility program or application program. However, in
a textual and legal sense, the linked executable is a combined work, a
derivative of the original library, and the ordinary General Public License
treats it as such.
Because of this blurred distinction, using the ordinary General
Public License for libraries did not effectively promote software
sharing, because most developers did not use the libraries. We
concluded that weaker conditions might promote sharing better.
However, unrestricted linking of non-free programs would deprive the
users of those programs of all benefit from the free status of the
libraries themselves. This Library General Public License is intended to
permit developers of non-free programs to use free libraries, while
preserving your freedom as a user of such programs to change the free
libraries that are incorporated in them. (We have not seen how to achieve
this as regards changes in header files, but we have achieved it as regards
changes in the actual functions of the Library.) The hope is that this
will lead to faster development of free libraries.
The precise terms and conditions for copying, distribution and
modification follow. Pay close attention to the difference between a
"work based on the library" and a "work that uses the library". The
former contains code derived from the library, while the latter only
works together with the library.
Note that it is possible for a library to be covered by the ordinary
General Public License rather than by this special one.
GNU LIBRARY GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. This License Agreement applies to any software library which
contains a notice placed by the copyright holder or other authorized
party saying it may be distributed under the terms of this Library
General Public License (also called "this License"). Each licensee is
addressed as "you".
A "library" means a collection of software functions and/or data
prepared so as to be conveniently linked with application programs
(which use some of those functions and data) to form executables.
The "Library", below, refers to any such software library or work
which has been distributed under these terms. A "work based on the
Library" means either the Library or any derivative work under
copyright law: that is to say, a work containing the Library or a
portion of it, either verbatim or with modifications and/or translated
straightforwardly into another language. (Hereinafter, translation is
included without limitation in the term "modification".)
"Source code" for a work means the preferred form of the work for
making modifications to it. For a library, complete source code means
all the source code for all modules it contains, plus any associated
interface definition files, plus the scripts used to control compilation
and installation of the library.
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running a program using the Library is not restricted, and output from
such a program is covered only if its contents constitute a work based
on the Library (independent of the use of the Library in a tool for
writing it). Whether that is true depends on what the Library does
and what the program that uses the Library does.
1. You may copy and distribute verbatim copies of the Library's
complete source code as you receive it, in any medium, provided that
you conspicuously and appropriately publish on each copy an
appropriate copyright notice and disclaimer of warranty; keep intact
all the notices that refer to this License and to the absence of any
warranty; and distribute a copy of this License along with the
Library.
You may charge a fee for the physical act of transferring a copy,
and you may at your option offer warranty protection in exchange for a
fee.
2. You may modify your copy or copies of the Library or any portion
of it, thus forming a work based on the Library, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) The modified work must itself be a software library.
b) You must cause the files modified to carry prominent notices
stating that you changed the files and the date of any change.
c) You must cause the whole of the work to be licensed at no
charge to all third parties under the terms of this License.
d) If a facility in the modified Library refers to a function or a
table of data to be supplied by an application program that uses
the facility, other than as an argument passed when the facility
is invoked, then you must make a good faith effort to ensure that,
in the event an application does not supply such function or
table, the facility still operates, and performs whatever part of
its purpose remains meaningful.
(For example, a function in a library to compute square roots has
a purpose that is entirely well-defined independent of the
application. Therefore, Subsection 2d requires that any
application-supplied function or table used by this function must
be optional: if the application does not supply it, the square
root function must still compute square roots.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Library,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Library, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote
it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Library.
In addition, mere aggregation of another work not based on the Library
with the Library (or with a work based on the Library) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may opt to apply the terms of the ordinary GNU General Public
License instead of this License to a given copy of the Library. To do
this, you must alter all the notices that refer to this License, so
that they refer to the ordinary GNU General Public License, version 2,
instead of to this License. (If a newer version than version 2 of the
ordinary GNU General Public License has appeared, then you can specify
that version instead if you wish.) Do not make any other change in
these notices.
Once this change is made in a given copy, it is irreversible for
that copy, so the ordinary GNU General Public License applies to all
subsequent copies and derivative works made from that copy.
This option is useful when you wish to copy part of the code of
the Library into a program that is not a library.
4. You may copy and distribute the Library (or a portion or
derivative of it, under Section 2) in object code or executable form
under the terms of Sections 1 and 2 above provided that you accompany
it with the complete corresponding machine-readable source code, which
must be distributed under the terms of Sections 1 and 2 above on a
medium customarily used for software interchange.
If distribution of object code is made by offering access to copy
from a designated place, then offering equivalent access to copy the
source code from the same place satisfies the requirement to
distribute the source code, even though third parties are not
compelled to copy the source along with the object code.
5. A program that contains no derivative of any portion of the
Library, but is designed to work with the Library by being compiled or
linked with it, is called a "work that uses the Library". Such a
work, in isolation, is not a derivative work of the Library, and
therefore falls outside the scope of this License.
However, linking a "work that uses the Library" with the Library
creates an executable that is a derivative of the Library (because it
contains portions of the Library), rather than a "work that uses the
library". The executable is therefore covered by this License.
Section 6 states terms for distribution of such executables.
When a "work that uses the Library" uses material from a header file
that is part of the Library, the object code for the work may be a
derivative work of the Library even though the source code is not.
Whether this is true is especially significant if the work can be
linked without the Library, or if the work is itself a library. The
threshold for this to be true is not precisely defined by law.
If such an object file uses only numerical parameters, data
structure layouts and accessors, and small macros and small inline
functions (ten lines or less in length), then the use of the object
file is unrestricted, regardless of whether it is legally a derivative
work. (Executables containing this object code plus portions of the
Library will still fall under Section 6.)
Otherwise, if the work is a derivative of the Library, you may
distribute the object code for the work under the terms of Section 6.
Any executables containing that work also fall under Section 6,
whether or not they are linked directly with the Library itself.
6. As an exception to the Sections above, you may also compile or
link a "work that uses the Library" with the Library to produce a
work containing portions of the Library, and distribute that work
under terms of your choice, provided that the terms permit
modification of the work for the customer's own use and reverse
engineering for debugging such modifications.
You must give prominent notice with each copy of the work that the
Library is used in it and that the Library and its use are covered by
this License. You must supply a copy of this License. If the work
during execution displays copyright notices, you must include the
copyright notice for the Library among them, as well as a reference
directing the user to the copy of this License. Also, you must do one
of these things:
a) Accompany the work with the complete corresponding
machine-readable source code for the Library including whatever
changes were used in the work (which must be distributed under
Sections 1 and 2 above); and, if the work is an executable linked
with the Library, with the complete machine-readable "work that
uses the Library", as object code and/or source code, so that the
user can modify the Library and then relink to produce a modified
executable containing the modified Library. (It is understood
that the user who changes the contents of definitions files in the
Library will not necessarily be able to recompile the application
to use the modified definitions.)
b) Accompany the work with a written offer, valid for at
least three years, to give the same user the materials
specified in Subsection 6a, above, for a charge no more
than the cost of performing this distribution.
c) If distribution of the work is made by offering access to copy
from a designated place, offer equivalent access to copy the above
specified materials from the same place.
d) Verify that the user has already received a copy of these
materials or that you have already sent this user a copy.
For an executable, the required form of the "work that uses the
Library" must include any data and utility programs needed for
reproducing the executable from it. However, as a special exception,
the source code distributed need not include anything that is normally
distributed (in either source or binary form) with the major
components (compiler, kernel, and so on) of the operating system on
which the executable runs, unless that component itself accompanies
the executable.
It may happen that this requirement contradicts the license
restrictions of other proprietary libraries that do not normally
accompany the operating system. Such a contradiction means you cannot
use both them and the Library together in an executable that you
distribute.
7. You may place library facilities that are a work based on the
Library side-by-side in a single library together with other library
facilities not covered by this License, and distribute such a combined
library, provided that the separate distribution of the work based on
the Library and of the other library facilities is otherwise
permitted, and provided that you do these two things:
a) Accompany the combined library with a copy of the same work
based on the Library, uncombined with any other library
facilities. This must be distributed under the terms of the
Sections above.
b) Give prominent notice with the combined library of the fact
that part of it is a work based on the Library, and explaining
where to find the accompanying uncombined form of the same work.
8. You may not copy, modify, sublicense, link with, or distribute
the Library except as expressly provided under this License. Any
attempt otherwise to copy, modify, sublicense, link with, or
distribute the Library is void, and will automatically terminate your
rights under this License. However, parties who have received copies,
or rights, from you under this License will not have their licenses
terminated so long as such parties remain in full compliance.
9. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Library or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Library (or any work based on the
Library), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Library or works based on it.
10. Each time you redistribute the Library (or any work based on the
Library), the recipient automatically receives a license from the
original licensor to copy, distribute, link with or modify the Library
subject to these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.
11. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Library at all. For example, if a patent
license would not permit royalty-free redistribution of the Library by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Library.
If any portion of this section is held invalid or unenforceable under any
particular circumstance, the balance of the section is intended to apply,
and the section as a whole is intended to apply in other circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
12. If the distribution and/or use of the Library is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Library under this License may add
an explicit geographical distribution limitation excluding those countries,
so that distribution is permitted only in or among countries not thus
excluded. In such case, this License incorporates the limitation as if
written in the body of this License.
13. The Free Software Foundation may publish revised and/or new
versions of the Library General Public License from time to time.
Such new versions will be similar in spirit to the present version,
but may differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Library
specifies a version number of this License which applies to it and
"any later version", you have the option of following the terms and
conditions either of that version or of any later version published by
the Free Software Foundation. If the Library does not specify a
license version number, you may choose any version ever published by
the Free Software Foundation.
14. If you wish to incorporate parts of the Library into other free
programs whose distribution conditions are incompatible with these,
write to the author to ask for permission. For software which is
copyrighted by the Free Software Foundation, write to the Free
Software Foundation; we sometimes make exceptions for this. Our
decision will be guided by the two goals of preserving the free status
of all derivatives of our free software and of promoting the sharing
and reuse of software generally.
NO WARRANTY
15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO
WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.
EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR
OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY
KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE
LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME
THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN
WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY
AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU
FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR
CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE
LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING
RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A
FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF
SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH
DAMAGES.
END OF TERMS AND CONDITIONS
Appendix: How to Apply These Terms to Your New Libraries
If you develop a new library, and you want it to be of the greatest
possible use to the public, we recommend making it free software that
everyone can redistribute and change. You can do so by permitting
redistribution under these terms (or, alternatively, under the terms of the
ordinary General Public License).
To apply these terms, attach the following notices to the library. It is
safest to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least the
"copyright" line and a pointer to where the full notice is found.
<one line to give the library's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Library General Public
License as published by the Free Software Foundation; either
version 2 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Library General Public License for more details.
You should have received a copy of the GNU Library General Public
License along with this library; if not, write to the Free
Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA.
Also add information on how to contact you by electronic and paper mail.
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the library, if
necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the
library `Frob' (a library for tweaking knobs) written by James Random Hacker.
<signature of Ty Coon>, 1 April 1990
Ty Coon, President of Vice
That's all there is to it!

COPYING.LIB Normal file

@@ -0,0 +1,481 @@
in the event an application does not supply such function or
table, the facility still operates, and performs whatever part of
its purpose remains meaningful.
(For example, a function in a library to compute square roots has
a purpose that is entirely well-defined independent of the
application. Therefore, Subsection 2d requires that any
application-supplied function or table used by this function must
be optional: if the application does not supply it, the square
root function must still compute square roots.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Library,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Library, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote
it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Library.
In addition, mere aggregation of another work not based on the Library
with the Library (or with a work based on the Library) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may opt to apply the terms of the ordinary GNU General Public
License instead of this License to a given copy of the Library. To do
this, you must alter all the notices that refer to this License, so
that they refer to the ordinary GNU General Public License, version 2,
instead of to this License. (If a newer version than version 2 of the
ordinary GNU General Public License has appeared, then you can specify
that version instead if you wish.) Do not make any other change in
these notices.
Once this change is made in a given copy, it is irreversible for
that copy, so the ordinary GNU General Public License applies to all
subsequent copies and derivative works made from that copy.
This option is useful when you wish to copy part of the code of
the Library into a program that is not a library.
4. You may copy and distribute the Library (or a portion or
derivative of it, under Section 2) in object code or executable form
under the terms of Sections 1 and 2 above provided that you accompany
it with the complete corresponding machine-readable source code, which
must be distributed under the terms of Sections 1 and 2 above on a
medium customarily used for software interchange.
If distribution of object code is made by offering access to copy
from a designated place, then offering equivalent access to copy the
source code from the same place satisfies the requirement to
distribute the source code, even though third parties are not
compelled to copy the source along with the object code.
5. A program that contains no derivative of any portion of the
Library, but is designed to work with the Library by being compiled or
linked with it, is called a "work that uses the Library". Such a
work, in isolation, is not a derivative work of the Library, and
therefore falls outside the scope of this License.
However, linking a "work that uses the Library" with the Library
creates an executable that is a derivative of the Library (because it
contains portions of the Library), rather than a "work that uses the
library". The executable is therefore covered by this License.
Section 6 states terms for distribution of such executables.
When a "work that uses the Library" uses material from a header file
that is part of the Library, the object code for the work may be a
derivative work of the Library even though the source code is not.
Whether this is true is especially significant if the work can be
linked without the Library, or if the work is itself a library. The
threshold for this to be true is not precisely defined by law.
If such an object file uses only numerical parameters, data
structure layouts and accessors, and small macros and small inline
functions (ten lines or less in length), then the use of the object
file is unrestricted, regardless of whether it is legally a derivative
work. (Executables containing this object code plus portions of the
Library will still fall under Section 6.)
Otherwise, if the work is a derivative of the Library, you may
distribute the object code for the work under the terms of Section 6.
Any executables containing that work also fall under Section 6,
whether or not they are linked directly with the Library itself.
6. As an exception to the Sections above, you may also compile or
link a "work that uses the Library" with the Library to produce a
work containing portions of the Library, and distribute that work
under terms of your choice, provided that the terms permit
modification of the work for the customer's own use and reverse
engineering for debugging such modifications.
You must give prominent notice with each copy of the work that the
Library is used in it and that the Library and its use are covered by
this License. You must supply a copy of this License. If the work
during execution displays copyright notices, you must include the
copyright notice for the Library among them, as well as a reference
directing the user to the copy of this License. Also, you must do one
of these things:
a) Accompany the work with the complete corresponding
machine-readable source code for the Library including whatever
changes were used in the work (which must be distributed under
Sections 1 and 2 above); and, if the work is an executable linked
with the Library, with the complete machine-readable "work that
uses the Library", as object code and/or source code, so that the
user can modify the Library and then relink to produce a modified
executable containing the modified Library. (It is understood
that the user who changes the contents of definitions files in the
Library will not necessarily be able to recompile the application
to use the modified definitions.)
b) Accompany the work with a written offer, valid for at
least three years, to give the same user the materials
specified in Subsection 6a, above, for a charge no more
than the cost of performing this distribution.
c) If distribution of the work is made by offering access to copy
from a designated place, offer equivalent access to copy the above
specified materials from the same place.
d) Verify that the user has already received a copy of these
materials or that you have already sent this user a copy.
For an executable, the required form of the "work that uses the
Library" must include any data and utility programs needed for
reproducing the executable from it. However, as a special exception,
the source code distributed need not include anything that is normally
distributed (in either source or binary form) with the major
components (compiler, kernel, and so on) of the operating system on
which the executable runs, unless that component itself accompanies
the executable.
It may happen that this requirement contradicts the license
restrictions of other proprietary libraries that do not normally
accompany the operating system. Such a contradiction means you cannot
use both them and the Library together in an executable that you
distribute.
7. You may place library facilities that are a work based on the
Library side-by-side in a single library together with other library
facilities not covered by this License, and distribute such a combined
library, provided that the separate distribution of the work based on
the Library and of the other library facilities is otherwise
permitted, and provided that you do these two things:
a) Accompany the combined library with a copy of the same work
based on the Library, uncombined with any other library
facilities. This must be distributed under the terms of the
Sections above.
b) Give prominent notice with the combined library of the fact
that part of it is a work based on the Library, and explaining
where to find the accompanying uncombined form of the same work.
8. You may not copy, modify, sublicense, link with, or distribute
the Library except as expressly provided under this License. Any
attempt otherwise to copy, modify, sublicense, link with, or
distribute the Library is void, and will automatically terminate your
rights under this License. However, parties who have received copies,
or rights, from you under this License will not have their licenses
terminated so long as such parties remain in full compliance.
9. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Library or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Library (or any work based on the
Library), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Library or works based on it.
10. Each time you redistribute the Library (or any work based on the
Library), the recipient automatically receives a license from the
original licensor to copy, distribute, link with or modify the Library
subject to these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.
11. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Library at all. For example, if a patent
license would not permit royalty-free redistribution of the Library by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Library.
If any portion of this section is held invalid or unenforceable under any
particular circumstance, the balance of the section is intended to apply,
and the section as a whole is intended to apply in other circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
12. If the distribution and/or use of the Library is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Library under this License may add
an explicit geographical distribution limitation excluding those countries,
so that distribution is permitted only in or among countries not thus
excluded. In such case, this License incorporates the limitation as if
written in the body of this License.
13. The Free Software Foundation may publish revised and/or new
versions of the Library General Public License from time to time.
Such new versions will be similar in spirit to the present version,
but may differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Library
specifies a version number of this License which applies to it and
"any later version", you have the option of following the terms and
conditions either of that version or of any later version published by
the Free Software Foundation. If the Library does not specify a
license version number, you may choose any version ever published by
the Free Software Foundation.
14. If you wish to incorporate parts of the Library into other free
programs whose distribution conditions are incompatible with these,
write to the author to ask for permission. For software which is
copyrighted by the Free Software Foundation, write to the Free
Software Foundation; we sometimes make exceptions for this. Our
decision will be guided by the two goals of preserving the free status
of all derivatives of our free software and of promoting the sharing
and reuse of software generally.
NO WARRANTY
15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO
WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.
EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR
OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY
KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE
LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME
THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN
WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY
AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU
FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR
CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE
LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING
RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A
FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF
SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH
DAMAGES.
END OF TERMS AND CONDITIONS
Appendix: How to Apply These Terms to Your New Libraries
If you develop a new library, and you want it to be of the greatest
possible use to the public, we recommend making it free software that
everyone can redistribute and change. You can do so by permitting
redistribution under these terms (or, alternatively, under the terms of the
ordinary General Public License).
To apply these terms, attach the following notices to the library. It is
safest to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least the
"copyright" line and a pointer to where the full notice is found.
<one line to give the library's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Library General Public
License as published by the Free Software Foundation; either
version 2 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Library General Public License for more details.
You should have received a copy of the GNU Library General Public
License along with this library; if not, write to the Free
Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA.
Also add information on how to contact you by electronic and paper mail.
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the library, if
necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the
library `Frob' (a library for tweaking knobs) written by James Random Hacker.
<signature of Ty Coon>, 1 April 1990
Ty Coon, President of Vice
That's all there is to it!

28736 ChangeLog Normal file
File diff suppressed because it is too large

299 NEWS Normal file
@@ -0,0 +1,299 @@
GStreamer 1.20 Release Notes
GStreamer 1.20 has not been released yet. It is scheduled for release
around October/November 2021.
1.19.x is the unstable development version that is being developed in
the git main branch and which will eventually result in 1.20, and 1.19.2
is the current development release in that series.
It is expected that feature freeze will be in early October 2021,
followed by one or two 1.19.9x pre-releases and the new 1.20 stable
release around October/November 2021.
1.20 will be backwards-compatible to the stable 1.18, 1.16, 1.14, 1.12,
1.10, 1.8, 1.6, 1.4, 1.2 and 1.0 release series.
See https://gstreamer.freedesktop.org/releases/1.20/ for the latest
version of this document.
Last updated: Wednesday 22 September 2021, 18:00 UTC (log)
Introduction
The GStreamer team is proud to announce a new major feature release in
the stable 1.x API series of your favourite cross-platform multimedia
framework!
As always, this release is again packed with many new features, bug
fixes and other improvements.
Highlights
- this section will be completed in due course
Major new features and changes
Noteworthy new features and API
- this section will be filled in in due course
New elements
- this section will be filled in in due course
New element features and additions
- this section will be filled in in due course
Plugin and library moves
- There were no plugin moves or library moves in this cycle.
Plugin removals
The following elements or plugins have been removed:
- this section will be filled in in due course
Miscellaneous API additions
- this section will be filled in in due course
Miscellaneous performance, latency and memory optimisations
- this section will be filled in in due course
Miscellaneous other changes and enhancements
- this section will be filled in in due course
Tracing framework and debugging improvements
- this section will be filled in in due course
Tools
- this section will be filled in in due course
GStreamer RTSP server
- this section will be filled in in due course
GStreamer VAAPI
- this section will be filled in in due course
GStreamer OMX
- this section will be filled in in due course
GStreamer Editing Services and NLE
- this section will be filled in in due course
GStreamer validate
- this section will be filled in in due course
GStreamer Python Bindings
- this section will be filled in in due course
GStreamer C# Bindings
- this section will be filled in in due course
GStreamer Rust Bindings and Rust Plugins
The GStreamer Rust bindings are released separately with a different
release cadence that's tied to gtk-rs, but the latest release has
already been updated for the upcoming new GStreamer 1.20 API.
gst-plugins-rs, the module containing GStreamer plugins written in Rust,
has also seen lots of activity with many new elements and plugins.
What follows is a list of elements and plugins available in
gst-plugins-rs, so people don't miss out on all those potentially useful
elements that have no C equivalent.
- FIXME: add new elements
Rust audio plugins
- audiornnoise: New element for audio denoising which implements the
noise removal algorithm of the Xiph RNNoise library, in Rust
- rsaudioecho: Port of the audioecho element from gst-plugins-good
- rsaudioloudnorm: Live audio loudness normalization element based on
the FFmpeg af_loudnorm filter
- claxondec: FLAC lossless audio codec decoder element based on the
pure-Rust claxon implementation
- csoundfilter: Audio filter that can use any filter defined via the
Csound audio programming language
- lewtondec: Vorbis audio decoder element based on the pure-Rust
lewton implementation
Rust video plugins
- cdgdec/cdgparse: Decoder and parser for the CD+G video codec based
on a pure-Rust CD+G implementation, used for example by karaoke CDs
- cea608overlay: CEA-608 Closed Captions overlay element
- cea608tott: CEA-608 Closed Captions to timed-text (e.g. VTT or SRT
subtitles) converter
- tttocea608: CEA-608 Closed Captions from timed-text converter
- mccenc/mccparse: MacCaption Closed Caption format encoder and parser
- sccenc/sccparse: Scenarist Closed Caption format encoder and parser
- dav1dec: AV1 video decoder based on the dav1d decoder implementation
by the VLC project
- rav1enc: AV1 video encoder based on the fast and pure-Rust rav1e
encoder implementation
- rsflvdemux: Alternative to the flvdemux FLV demuxer element from
gst-plugins-good, not feature-equivalent yet
- rsgifenc/rspngenc: GIF/PNG encoder elements based on the pure-Rust
implementations by the image-rs project
Rust text plugins
- textwrap: Element for line-wrapping timed text (e.g. subtitles) for
better screen-fitting, including hyphenation support for some
languages
Rust network plugins
- reqwesthttpsrc: HTTP(S) source element based on the Rust
reqwest/hyper HTTP implementations and almost feature-equivalent
with the main GStreamer HTTP source souphttpsrc
- s3src/s3sink: Source/sink element for the Amazon S3 cloud storage
- awstranscriber: Live audio to timed text transcription element using
the Amazon AWS Transcribe API
Generic Rust plugins
- sodiumencrypter/sodiumdecrypter: Encryption/decryption element based
on libsodium/NaCl
- togglerecord: Recording element that allows recordings to be
paused/resumed easily and considers keyframe boundaries
- fallbackswitch/fallbacksrc: Elements for handling potentially
failing (network) sources, restarting them on errors/timeout and
showing a fallback stream instead
- threadshare: Set of elements that provide alternatives for various
existing GStreamer elements but can share their streaming threads
with each other to reduce the total number of threads
- rsfilesrc/rsfilesink: File source/sink elements as replacements for
the existing filesrc/filesink elements
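A quick way to see whether any of these Rust plugins are present in a given GStreamer installation is gst-inspect-1.0. A minimal sketch (the element name is taken from the list above; the check degrades gracefully when the tool or plugin is absent):

```shell
# Probe for one of the gst-plugins-rs elements listed above.
# Falls through cleanly when gst-inspect-1.0 or the plugin is missing.
if gst-inspect-1.0 audiornnoise >/dev/null 2>&1; then
    status="audiornnoise available"
else
    status="audiornnoise not installed"
fi
echo "$status"
```

The same pattern works for any of the other element names above.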
Build and Dependencies
- this section will be filled in in due course
gst-build
- this section will be filled in in due course
Cerbero
Cerbero is a meta build system used to build GStreamer plus dependencies
on platforms where dependencies are not readily available, such as
Windows, Android, iOS and macOS.
General improvements
- this section will be filled in in due course
macOS / iOS
- this section will be filled in in due course
Windows
- this section will be filled in in due course
Windows MSI installer
- this section will be filled in in due course
Linux
- this section will be filled in in due course
Android
- this section will be filled in in due course
Platform-specific changes and improvements
Android
- this section will be filled in in due course
macOS and iOS
- this section will be filled in in due course
Windows
- this section will be filled in in due course
Linux
- this section will be filled in in due course
Documentation improvements
- this section will be filled in in due course
Possibly Breaking Changes
- this section will be filled in in due course
- MPEG-TS SCTE-35 API changes (FIXME: flesh out)
- gst_parse_launch() and friends now error out on non-existing
properties on top-level bins where they would silently fail and
ignore those before.
Known Issues
- this section will be filled in in due course
- There are a couple of known WebRTC-related regressions/blockers:
- webrtc: DTLS setup with Chrome is broken
- webrtcbin: First keyframe is usually lost
Contributors
- this section will be filled in in due course
… and many others who have contributed bug reports, translations, sent
suggestions or helped testing.
Stable 1.20 branch
After the 1.20.0 release there will be several 1.20.x bug-fix releases
which will contain bug fixes which have been deemed suitable for a
stable branch, but no new features or intrusive changes will be added to
a bug-fix release usually. The 1.20.x bug-fix releases will be made from
the git 1.20 branch, which will be a stable branch.
1.20.0
1.20.0 is scheduled to be released around October/November 2021.
Schedule for 1.22
Our next major feature release will be 1.22, and 1.21 will be the
unstable development version leading up to the stable 1.22 release. The
development of 1.21/1.22 will happen in the git main branch.
The plan for the 1.22 development cycle is yet to be confirmed.
1.22 will be backwards-compatible to the stable 1.20, 1.18, 1.16, 1.14,
1.12, 1.10, 1.8, 1.6, 1.4, 1.2 and 1.0 release series.
------------------------------------------------------------------------
These release notes have been prepared by Tim-Philipp Müller with
contributions from …
License: CC BY-SA 4.0

18 README Normal file
@@ -0,0 +1,18 @@
GStreamer Editing Services
--------------------------
This is a high-level library for facilitating the creation of audio/video
non-linear editors.
License:
--------
This package and its contents are licensed under the GNU Lesser General
Public License (LGPL).
Dependencies:
-------------
* GStreamer core
* gst-plugins-base

96 RELEASE Normal file
@@ -0,0 +1,96 @@
This is GStreamer gst-editing-services 1.19.2.
GStreamer 1.19 is the development branch leading up to the next major
stable version which will be 1.20.
The 1.19 development series adds new features on top of the 1.18 series and is
part of the API and ABI-stable 1.x release series of the GStreamer multimedia
framework.
Full release notes will one day be found at:
https://gstreamer.freedesktop.org/releases/1.20/
Binaries for Android, iOS, Mac OS X and Windows will usually be provided
shortly after the release.
This module will not be very useful by itself and should be used in conjunction
with other GStreamer modules for a complete multimedia experience.
- gstreamer: provides the core GStreamer libraries and some generic plugins
- gst-plugins-base: a basic set of well-supported plugins and additional
media-specific GStreamer helper libraries for audio,
video, rtsp, rtp, tags, OpenGL, etc.
- gst-plugins-good: a set of well-supported plugins under our preferred
license
- gst-plugins-ugly: a set of well-supported plugins which might pose
problems for distributors
- gst-plugins-bad: a set of plugins of varying quality that have not made
their way into one of core/base/good/ugly yet, for one
reason or another. Many of these are production quality
elements, but may still be missing documentation or unit
tests; others haven't passed the rigorous quality testing
we expect yet.
- gst-libav: a set of codec plugins based on the FFmpeg library. This is
where you can find audio and video decoders and encoders
for a wide variety of formats including H.264, AAC, etc.
- gstreamer-vaapi: hardware-accelerated video decoding and encoding using
VA-API on Linux. Primarily for Intel graphics hardware.
- gst-omx: hardware-accelerated video decoding and encoding, primarily for
embedded Linux systems that provide an OpenMax
implementation layer such as the Raspberry Pi.
- gst-rtsp-server: library to serve files or streaming pipelines via RTSP
- gst-editing-services: library and plugins for non-linear editing
==== Download ====
You can find source releases of gstreamer in the download
directory: https://gstreamer.freedesktop.org/src/gstreamer/
The git repository and details how to clone it can be found at
https://gitlab.freedesktop.org/gstreamer/
==== Homepage ====
The project's website is https://gstreamer.freedesktop.org/
==== Support and Bugs ====
We have recently moved from GNOME Bugzilla to GitLab on freedesktop.org
for bug reports and feature requests:
https://gitlab.freedesktop.org/gstreamer
Please submit patches via GitLab as well, in form of Merge Requests. See
https://gstreamer.freedesktop.org/documentation/contribute/
for more details.
For help and support, please subscribe to and send questions to the
gstreamer-devel mailing list (see below for details).
There is also a #gstreamer IRC channel on the Freenode IRC network.
==== Developers ====
GStreamer source code repositories can be found on GitLab on freedesktop.org:
https://gitlab.freedesktop.org/gstreamer
and can also be cloned from there and this is also where you can submit
Merge Requests or file issues for bugs or feature requests.
Interested developers of the core library, plugins, and applications should
subscribe to the gstreamer-devel list:
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

@@ -0,0 +1,71 @@
# GStreamer
#
# Copyright (C) 2013 Thibault Saunier <tsaunier@gnome.org>
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Library General Public License for more details.
#
# You should have received a copy of the GNU Library General Public
# License along with this library; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Suite 500,
# Boston, MA 02110-1335, USA.
import os
from gi.repository import Gst, GES, GLib
class Simple:
def __init__(self, uri):
timeline = GES.Timeline.new_audio_video()
self.project = timeline.get_asset()
self.project.connect("asset-added", self._asset_added_cb)
self.project.connect("error-loading-asset", self._error_loading_asset_cb)
self.project.create_asset(uri, GES.UriClip)
self.layer = timeline.append_layer()
self._create_pipeline(timeline)
self.loop = GLib.MainLoop()
def _create_pipeline(self, timeline):
self.pipeline = GES.Pipeline()
self.pipeline.set_timeline(timeline)
bus = self.pipeline.get_bus()
bus.add_signal_watch()
bus.connect("message", self.bus_message_cb)
def bus_message_cb(self, unused_bus, message):
if message.type == Gst.MessageType.EOS:
            print("eos")
self.loop.quit()
elif message.type == Gst.MessageType.ERROR:
error = message.parse_error()
            print("error %s" % error[1])
self.loop.quit()
def start(self):
self.loop.run()
def _asset_added_cb(self, project, asset):
self.layer.add_asset(asset, 0, 0, Gst.SECOND * 5, GES.TrackType.UNKNOWN)
self.pipeline.set_state(Gst.State.PLAYING)
def _error_loading_asset_cb(self, project, error, asset_id, type):
        print("Could not load asset %s: %s" % (asset_id, error))
self.loop.quit()
if __name__ == "__main__":
if len(os.sys.argv) != 2:
        print("You must specify a file URI")
exit(-1)
Gst.init(None)
GES.init()
simple = Simple(os.sys.argv[1])
simple.start()


@ -0,0 +1,98 @@
# -*- Mode: Python; py-indent-offset: 4 -*-
# vim: tabstop=4 shiftwidth=4 expandtab
#
# GES.py
#
# Copyright (C) 2012 Thibault Saunier <thibault.saunier@collabora.com>
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place - Suite 330,
# Boston, MA 02111-1307, USA.
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 3, or (at your option)
# any later version.
import sys
from ..overrides import override
from ..importer import modules
from gi.repository import GObject
if sys.version_info >= (3, 0):
_basestring = str
_callable = lambda c: hasattr(c, '__call__')
else:
_basestring = basestring
_callable = callable
GES = modules['GES']._introspection_module
__all__ = []
if GES._version == '0.10':
import warnings
warn_msg = "You have imported the GES 0.10 module. Because GES 0.10 \
was not designed for use with introspection some of the \
interfaces and API will fail. As such this is not supported \
by the GStreamer development team and we encourage you to \
port your app to GES 1 or greater. The static python bindings are the \
recommended way to use GES 0.10"
warnings.warn(warn_msg, RuntimeWarning)
def __timeline_element__repr__(self):
return "%s [%s (%s) %s]" % (
self.props.name,
Gst.TIME_ARGS(self.props.start),
Gst.TIME_ARGS(self.props.in_point),
Gst.TIME_ARGS(self.props.duration),
)
__prev_set_child_property = GES.TimelineElement.set_child_property
def __timeline_element_set_child_property(self, prop_name, prop_value):
res, _, pspec = GES.TimelineElement.lookup_child(self, prop_name)
if not res:
return res
v = GObject.Value()
v.init(pspec.value_type)
v.set_value(prop_value)
return __prev_set_child_property(self, prop_name, v)
GES.TimelineElement.__repr__ = __timeline_element__repr__
GES.TimelineElement.set_child_property = __timeline_element_set_child_property
GES.TrackElement.set_child_property = GES.TimelineElement.set_child_property
GES.Container.edit = GES.TimelineElement.edit
__prev_asset_repr = GES.Asset.__repr__
def __asset__repr__(self):
return "%s(%s)" % (__prev_asset_repr(self), self.props.id)
GES.Asset.__repr__ = __asset__repr__
def __timeline_iter_clips(self):
"""Iterate all clips in a timeline"""
for layer in self.get_layers():
for clip in layer.get_clips():
yield clip
GES.Timeline.iter_clips = __timeline_iter_clips
try:
from gi.repository import Gst
Gst
except ImportError:
    raise RuntimeError("Gst could not be imported, make sure you have gst-python installed")


@ -0,0 +1,28 @@
#!/usr/bin/env python
#
# __init__.py
#
# Copyright (C) 2012 Thibault Saunier <thibaul.saunier@collabora.com>
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place - Suite 330,
# Boston, MA 02111-1307, USA.
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 3, or (at your option)
# any later version.
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)


@ -0,0 +1 @@
install_data(['gi/overrides/GES.py'], install_dir: pygi_override_dir)


@ -0,0 +1,235 @@
# GStreamer
# Copyright (C) 2015 Mathieu Duponchelle <mathieu.duponchelle@opencreed.com>
#
# bash/zsh completion support for ges-launch
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Library General Public License for more details.
#
# You should have received a copy of the GNU Library General Public
# License along with this library; if not, write to the
# Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
# Boston, MA 02110-1301, USA.
HELPERDIR="${BASH_SOURCE[0]%/*}/../helpers"
if [[ ! -d "$HELPERDIR" ]]; then
HELPERDIR="$(pkg-config --variable=bashhelpersdir gstreamer-1.0)"
else
HELPERDIR=`cd "$HELPERDIR"; pwd`
fi
# Common definitions
. "$HELPERDIR"/gst
HELPER="$_GST_HELPER"
_list_commands ()
{
ges-launch-1.0 help | grep '^ +' | cut -d' ' -f3
}
_ges___inspect_action_type ()
{
COMPREPLY=( $(compgen -W "$(ges-launch-1.0 --inspect-action-type | grep '^[^ ]' | cut -d':' -f2)" -- $cur) )
}
_ges___track_types ()
{
COMPREPLY=( $(compgen -W "audio video audio+video" -- $cur) )
}
_ges___set_scenario () {
COMPREPLY=( $(compgen -W "*.scenario $(gst-validate-1.0 -l | awk '$0=$2' FS=[ RS=])" -- $cur) )
}
_ges___load () {
COMPREPLY=( $(compgen -W "*.xges" -- $cur) )
}
_ges___outputuri () {
COMPREPLY=( $(compgen -W "file://" -- $cur) )
}
_ges___audiosink () {
COMPREPLY=( $(compgen -W "$($HELPER --klass=Sink --sinkcaps='audio/x-raw')" -- $cur) )
}
_ges___videosink () {
COMPREPLY=( $(compgen -W "$($HELPER --klass=Sink --sinkcaps='video/x-raw')" -- $cur) )
}
_ges_clip () {
if [[ "$prev" == "$command" ]];
then
_gst_mandatory_argument
else
COMPREPLY=( $(compgen -W "duration= inpoint= start= layer= $(_list_commands)" -- $cur) )
fi
}
_ges_test_clip () {
if [[ "$prev" == "$command" ]];
then
_gst_mandatory_argument
else
COMPREPLY=( $(compgen -W "duration= inpoint= start= layer= $(_list_commands)" -- $cur) )
fi
}
_ges_effect () {
if [[ "$prev" == "$command" ]];
then
_gst_mandatory_argument
else
COMPREPLY=( $(compgen -W "duration= start= layer= $(_list_commands)" -- $cur) )
fi
}
_ges_list_options () {
_gst_all_arguments ges-launch-1.0
}
_ges_list_commands () {
COMPREPLY=( $(compgen -W "$(_list_commands)" -- $cur) )
}
_ges_list_properties () {
local props
if [[ "$real_command" == "" ]]
then
_gst_mandatory_argument
elif [[ "$real_command" == "+clip" ]]
then
COMPREPLY=( $(compgen -W "set-alpha set-posx set-posy set-width set-height set-volume set-mute" -- $cur) )
elif [[ "$real_command" == "+test-clip" ]]
then
COMPREPLY=( $(compgen -W "set-alpha set-posx set-posy set-width set-height set-volume set-mute" -- $cur) )
elif [[ "$real_command" == "+effect" ]]
then
COMPREPLY=()
effect_bin_description="${effect_bin_description//\"/ }"
array=(${effect_bin_description//!/ })
for i in "${array[@]}"; do
props=("$($HELPER --element-properties $i)")
for j in $props; do
j="${j//=/ }"
COMPREPLY+=( $(compgen -W "set-$j" -- $cur) )
done
done
else
_gst_mandatory_argument
fi
}
_ges___exclude_ () { _gst_mandatory_argument; }
_ges___encoding_profile () { _gst_mandatory_argument; }
_ges___ges_sample_path () { _gst_mandatory_argument; }
_ges___ges_sample_path_recurse () { _gst_mandatory_argument; }
_ges___thumbnail () { _gst_mandatory_argument; }
_ges___repeat () { _gst_mandatory_argument; }
_ges___save () { _gst_mandatory_argument; }
containsElement () {
local e
for e in "${@:2}";
do
[[ "$e" == "$1" ]] && return 0;
done
return 1
}
__ges_main ()
{
local i=1 c=1 command function_exists completion_func commands real_command effect_bin_description
commands=($(_list_commands))
real_command=""
effect_bin_description=""
if [[ "$cur" == "=" ]]; then
_gst_mandatory_argument
return
fi
while [[ $i -ne $COMP_CWORD ]];
do
local var
var="${COMP_WORDS[i]}"
if [[ "$var" == "--"* ]]
then
command="$var"
elif containsElement "$var" "${commands[@]}";
then
real_command="$var"
command="$var"
if [[ "$var" == "+effect" ]]
then
effect_bin_description="${COMP_WORDS[i+1]}"
fi
fi
        i=$((i+1))
done
if [[ "$command" == "--gst"* ]]; then
completion_func="_${command//-/_}"
else
completion_func="_ges_${command//-/_}"
completion_func="${completion_func//+/}"
fi
declare -f $completion_func >/dev/null 2>&1
function_exists=$?
if [[ "$cur" == "-"* ]]; then
_ges_list_options
elif [[ "$cur" == "+"* ]]; then
_ges_list_commands
elif [[ "$cur" == "="* ]]
then
_gst_mandatory_argument
elif [[ "$cur" == "set-"* ]]
then
_ges_list_properties
elif [ $function_exists -eq 0 ]
then
$completion_func
else
_ges_list_commands
fi
}
__ges_func_wrap ()
{
local cur prev
cur="${COMP_WORDS[COMP_CWORD]}"
prev="${COMP_WORDS[COMP_CWORD-1]}"
$1
}
# Setup completion for certain functions defined above by setting common
# variables and workarounds.
# This is NOT a public function; use at your own risk.
__ges_complete ()
{
local wrapper="__ges_wrap${2}"
eval "$wrapper () { __ges_func_wrap $2 ; }"
complete -o bashdefault -o default -o nospace -F $wrapper $1 2>/dev/null \
|| complete -o default -o nospace -F $wrapper $1
}
_ges ()
{
__ges_wrap__ges_main
}
__ges_complete ges-launch-1.0 __ges_main

docs/base-classes.md Normal file

@ -0,0 +1 @@
# Base classes

docs/deprecated.md Normal file

@ -0,0 +1,3 @@
# Deprecated APIs
These APIs have been deprecated and shouldn't be used in newly written code.

docs/design/asset.txt Normal file

@ -0,0 +1,321 @@
Assets
~~~~~~~~~
This draft document describes a possible design for asset objects.
Assets are used to instantiate objects of different types.
Terminology: an asset is an object from which other objects can be extracted.
Summary
~~~~~~~~~
1. Basic ideas
2. Problems
3. Propositions to solve those problems
4. Use-cases
5. API draft
A. Asset API draft
B. Source asset API draft
C. Project asset API draft
D. Extractable/Asset Interface API draft
E. Methods that should be added to other classes
1. Basic ideas
~~~~~~~~~~~~~~~
Basically, an asset is a way of avoiding duplicated data between objects, and
of avoiding redundant processing when the same work would otherwise be done
several times for different objects of the same type.
* There will be a listing of available, ready-to-use assets
* Assets allow creating particular types of objects that implement the GESExtractable
interface
* Assets will hold metadata
* Assets can be created either by the user or by GES itself at initialization
time; there should be a way to disable that feature on demand.
Some ideas about assets (especially for TimelineSource objects) can be found in docs/random/design.
2. Problems (Not in any particular order)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
1) We must avoid loading the same file from the system several times
2) We must be able to query assets by some criteria
a. By type of TimelineObject that it can produce
b. By type of supported tracks
c. Should we have filters for specific properties of source assets
- like duration, width, height, etc.?
3) We must be able to get a reference to the origin asset of any extracted object
4) We need a way to describe projects
5) GESAssets can be instantiated asynchronously
6) The instantiation of an asset can fail
7) Users need to get information about instantiation failures
8) User should be able to cancel the creation of a GESAsset (especially
in case of asynchronous Asset creation)
3. Propositions to solve those problems
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
1) We should have an interface that needs to be implemented by classes that need to be extractable.
We can call it GESExtractable. It should be responsible for:
* letting the user get the Asset from which an extracted object comes from
* Making it possible to instantiate a GESAsset only from a GType which means that:
- It needs to contain a reference to a GES_TYPE_ASSET (or subclass) so the proper GESAsset type will be instantiated.
- It needs to contain some mapping between the ID (string) of the asset, and the property of the object that is used as its ID.
For a property to be usable as an ID for its asset, every object extracted from the same asset must have the same value for that property
Examples:
GESTimelineFileSource -> URI
GESTrackParseLaunchEffect -> bin_description
GESProject -> project name / uri of the stored serialized
2) A list of all available, ready-to-use assets should be cached and
reused whenever possible.
Basically it will look like:
GESAsset.id -> asset
(the ID is computed thanks to the mapping)
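The proposed cache can be sketched in plain Python (this is an illustration of the idea, not the actual GES code): assets are keyed by their computed ID, so requesting the same ID twice yields the same object.

```python
# Illustrative sketch of the proposed asset cache: assets are stored in a
# dictionary keyed by their computed ID; a request either reuses a cached
# asset or creates and caches a new one via the supplied factory.
class AssetCache:
    def __init__(self):
        self._assets = {}  # id -> asset

    def request(self, asset_id, factory):
        asset = self._assets.get(asset_id)
        if asset is None:
            asset = factory(asset_id)
            self._assets[asset_id] = asset
        return asset
```

Requesting `"file:///a.ogg"` twice then returns the very same asset object both times.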
4) To allow users to implement some sort of library (media, effects, transitions...)
we must be able to query assets by using some criteria,
e.g. GType of the extractable object, URI, supported track types, etc...
5) We can instantiate a GESAsset from just a GType; the appropriate checks need
to be done, and it can return subclasses of GESAsset thanks to the
information included in the GESExtractable interface.
6) Instantiation can happen asynchronously in some cases, for example for an
asset that needs to discover a file before it can be properly filled in.
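The asynchronous flow can be sketched as a callback that receives either the asset or an error (plain Python; the names are illustrative and do not match the GES API):

```python
# Callback-style sketch of asynchronous asset creation: the caller gets
# either an asset or an error through a single callback. A real
# implementation would start an asynchronous discovery here; the success
# and failure paths are simulated synchronously for clarity.
def request_asset(asset_id, callback):
    if asset_id:
        callback({"id": asset_id}, None)
    else:
        callback(None, "invalid (empty) asset id")
```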
4. Use cases
~~~~~~~~~~~~~
UC-1. Define media files and discover them
UC-2. Define project - reference all assets
UC-3. Define titles
UC-4. Define operations
- Transitions - 1 asset per transition type
- Effects - 1 asset per effects type
- TextOverlay
UC-5. Handle metadata
UC-6. Add operations (only effects?) to a GESTimelineObject
UC-7. The user wants to 'invent' a new operation; we need to be able
to let them define it
UC-8. The user wants to make an object from a GESAsset
5. API Draft
~~~~~~~~~~~~
A. GESExtractable API
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
GESExtractable is responsible for telling what GESAsset subclass needs to
be instantiated.
/**
* ges_extractable_object_get_asset:
* @object: Target object
 * Method to get the asset which was used to instantiate the specified object
* Returns: origin asset
*/
GESAsset *
ges_extractable_get_asset(GESExtractable *extractable);
/**
* ges_extractable_object_set_asset:
* @object: Target object
* @asset: (transfer none): The #GESAsset to set
*
 * Method to set the asset which was used to instantiate the specified object
*/
void
ges_extractable_set_asset (GESExtractable * self, GESAsset * asset)
/**
* ges_extractable_get_asset_type:
* @class: Get the #GType of the GESAsset that should be used to extract
* the object that implements that #GESExtractable interface
*
 * Lets the user know the type of GESAsset that should be used to extract
 * objects that implement that interface.
*/
GType
ges_extractable_get_asset_type (GESExtractableClass *class)
/**
* ges_extractable_get_id:
* @self: The #GESExtractable
*
* Returns: The #id of the associated #GESAsset
*/
const gchar *
ges_extractable_get_id (GESExtractable * self)
/**
* ges_extractable_type_get_parameters_for_id:
* @type: The #GType implementing #GESExtractable
* @id: The ID of the Extractable
* @n_params: (out): Return location for the returned array
*
* Returns: (transfer full) (array length=n_params): an array of #GParameter
* needed to extract the #GESExtractable from a #GESAsset of @id
*/
GParameter *
ges_extractable_type_get_parameters_from_id (GType type, const gchar * id,
guint * n_params)
/**
* ges_extractable_type_get_asset_type:
* @type: The #GType implementing #GESExtractable
*
 * Get the #GType, subclass of #GES_TYPE_ASSET, to instantiate
* to be able to extract a @type
*
 * Returns: the #GType to use to create an asset to extract @type
*/
GType
ges_extractable_type_get_asset_type (GType type)
/**
* ges_extractable_type_check_id:
* @type: The #GType implementing #GESExtractable
* @id: The ID to check
*
* Check if @id is valid for @type
*
* Returns: Return %TRUE if @id is valid, %FALSE otherwise
*/
gchar *
ges_extractable_type_check_id (GType type, const gchar * id)
A. Asset And subclasses API draft
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
a) GESAsset
| ~~~~~~~~~~~
|
| Will implement GESMetadata
|
| Virtual method type:
| -------------------
|
| /**
| * GESAssetCreatedCallback:
| * @asset: the newly created #GESAsset or %NULL if something went wrong
| * @error: The #GError filled if previously provided in the constructor or %NULL
| * @user_data: The user data pointer
| *
| * A function that will be called when a #GESAsset is ready to be used.
| */
| typedef void (*GESAssetCreatedCallback)(GESAsset *asset, GError *error, gpointer user_data);
|
|
| Methods prototypes:
| -------------------
| /**
| * ges_asset_request:
| * @extractable_type: The #GType of the object that can be extracted from the new asset.
| * The class must implement the #GESExtractable interface.
| * @callback: a #GESAssetCreatedCallback to call when the initialization is finished
| * @id: The Identifier of the asset we want to create. This identifier depends on the extractable
| * type you want. By default it is the name of the class itself (or %NULL), but for example for a
| * GESTrackParseLaunchEffect, it will be the pipeline description, for a GESTimelineFileSource it
| * will be the name of the file, etc... You should refer to the documentation of the #GESExtractable
| * type you want to create a #GESAsset for.
| *
| * Creates a new #GESAsset asynchronously, @callback will be called when the asset is loaded
| *
| * Returns: %TRUE if the asset loading could be started, %FALSE otherwise
| */
| gboolean
| ges_asset_request (GType extractable_type, GESAssetCreatedCallback callback,
| gpointer user_data, const gchar *id);
|
|->b) GESAssetTimelineObject
| | ~~~~~~~~~~~~~~~~~~~~~~~~~
| | /**
| | * ges_asset_timeline_object_get_track_types:
| | * @asset: a #GESAssetTimelineObject
| | *
| | * Method that returns track types that are supported by given asset
| | *
| | * Returns: Track types that are supported by asset
| | */
| | GESTrackType
| | ges_asset_timeline_object_get_track_types (GESAssetTimelineObject *asset);
| |
| |
| |-> c) GESAssetFileSource
| ~~~~~~~~~~~~~~~~~~~~~
| /**
| * ges_asset_file_source_get_stream_info:
| * @asset: a #GESAsset of extractable_type GES_TIMELINE_FILE_SOURCE
| * Method that returns discoverer data of specified asset so user could work with
| * it directly
| * Returns: discover info of asset
| */
| GstDiscovererStreamInfo *
| ges_asset_file_source_get_stream_info (GESAssetFileSource *asset);
|
|
|-> d) GESProjectAsset asset API
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
A project is a GESAsset that has GES_TYPE_TIMELINE or subclasses as extractable_type
FIXME: This is a special case that should be thought through thoroughly.
/**
* ges_asset_project_list_assets:
* @asset: Project asset
* @type: Type of asset to list
* Method for listing assets of specified type that are available in
* particular project.
*
* Returns: list of available assets of given type in project
*/
ges_asset_project_list_assets (GESAsset *project,
GType type)
E. Methods that should be added to other classes
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/**
* ges_timeline_layer_add_asset:
*
* Creates TimelineObject from asset, adds it to layer and
* returns reference to it.
*
* Returns: Created #GESTimelineObject
*/
GESTimelineObject * ges_timeline_layer_add_asset (GESTimelineLayer *layer,
GESAssetTimelineObject *asset,
GstClockTime start,
GstClockTime inpoint,
GstClockTime duration);
/**
* ges_timeline_remove_extracted_from_asset:
* @timeline: A #GESTimeline from which to remove objects
* @asset: The #GESAssetTimelineObject to remove from @timeline
*
 * Removes all objects in @timeline that have been extracted from @asset
 *
 * Returns: %TRUE if everything could be done properly, %FALSE otherwise
 */
gboolean ges_timeline_remove_extracted_from_asset (GESTimeline *timeline, GESAsset *asset);
/**
* ges_timeline_object_add_asset:
* @object: Target #GESTimelineObject
* @asset: a #GESAsset that must have a GES_TYPE_TRACK_OPERATION as extractable_type
* @priority: The priority of the new #GESTrackObject
*
* Adds an operation (GESTrackObject(s)) to a GESTimelineObject
*
* Returns: (transfer full):The newly created #GESTrackObject.
*/
GESTrackObject *
ges_timeline_object_add_asset (GESTimelineObject *object,
GESAsset *asset,
guint32 priority);

docs/design/effects.txt Normal file

@ -0,0 +1,370 @@
Effects
-------
Summary
-------
1. Basic ideas
2. Problems
3. Propositions to solve those problems
A. The registry
B. Effects configurability
C. Keyframes
4. Use-cases
5. API draft
The goal of this proposal is to design a simple way to handle effects through an
API which would allow developers to handle any use-cases.
1. Basic ideas
----------------
* GESTrackEffects are subclasses of GESTrackOperation
* You can add effects on any clip or layer
* You can add effects over several clips and control them as a unique effect.
* Effects are configurable and those properties can change during time
* We must be able to handle third-party effect providers, like the
gnome-video-effects standard.
* We must be able to implement complex effects. This means effects that are
more than adding GstElement-s to the timeline. It can also mean effects
that apply both video and audio changes.
2. Problems
----------
* We must be able to provide a list of effects available on the system at
runtime.
* We must be able to configure effects through an API in GES
without having to access the GstElement properties directly.
* We should also expose the GstElement-s contained in an effect so
it is possible for people to control their properties as they wish.
* We must be able to implement and handle complex effects directly in GES
* We must be able to configure effects through time -> Keyframes without
duplicating code from GStreamer
3. Propositions to solve those problems
---------------------------------------
A. The registry => Still to design
We could implement a GESRegistry which would retrieve elements (effects)
from the GstRegistry and any other means, such as gnome-video-effects, to
let us get all the effects that are present on the system.
This way the developers could have the list of all the effects
that are installed on the system pretty easily.
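Such a registry could aggregate effect listings from several providers; a plain-Python sketch of the idea (not actual GES code; the provider and effect names are illustrative):

```python
# Sketch of registry aggregation: effects are gathered from several
# providers (e.g. the GstRegistry and gnome-video-effects) and
# de-duplicated by name; the first provider to report a name wins.
def aggregate_effects(*providers):
    seen = {}
    for provider in providers:
        for effect in provider():
            seen.setdefault(effect["name"], effect)
    return sorted(seen.values(), key=lambda e: e["name"])
```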
B. Effects configurability
The idea is to let users configure effects through a simple API in GES by
adding an API to GESTrackObject to access the GstObject properties that the
user would like to configure.
We would also have a method to set those properties easily.
We should also find a way to handle that in the case of systems such as
gnome-effects
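The child-property idea above can be sketched in plain Python (an illustration of the proposed API shape, not the GES implementation): a proxy maps a flat property name onto whichever contained element owns it.

```python
# Sketch of the proposed child-property API: an effect is a set of
# contained elements, each with its own properties; the proxy lets the
# caller list, set and get those properties by flat name.
class EffectProxy:
    def __init__(self, elements):
        # elements: dict of element-name -> dict of property-name -> value
        self._elements = elements

    def list_children_properties(self):
        return [(el, prop) for el, props in self._elements.items()
                for prop in props]

    def set_child_property(self, name, value):
        for props in self._elements.values():
            if name in props:
                props[name] = value
                return True
        return False

    def get_child_property(self, name):
        for props in self._elements.values():
            if name in props:
                return props[name]
        return None
```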
C. Keyframes
We may want to handle this use-case directly in GES and for any kind of
time related configuration? FIXME
=> Special specifications for that?
4. Use-cases
-----------
UC-1. The user wants to add an effect to an entire clip => GESTimelineObject
new API
UC-2. The developer wants to allow users to configure effects => New
GESTrackOperation API
UC-3. The user wants to add an effect on a specific portion of a clip, we
should allow him to specify a portion of the clip where the effect should be
applied.
UC-4. We want to implement an effect which isn't only composed of a bin, but
is more complex than that (ex: "effect '24'") => we have the
GESTrackOperation which is the base class (abstract) for this kind of
implementation. This class should implement vmethods to get/set configurable
properties.
UC-5. A developer wants to implement an effect which handles audio and video
at the same time. Would the solution be to implement a GESTimelineEffect
to handle this special use case? FIXME
UC-6. The developer wants to configure each element of an effect the way
they want, with full control over it.
UC-7. Developers want to expose all effects present on the system to the
end-user
5. API draft
------------
A. GESTrackObject new API
signals:
-------
* deep-notify: emitted when a useful property of a GstElement
contained in the GESTrackObject changes
=> DONE
/**
* ges_track_object_list_children_properties:
*
* @object: The origin #GESTrackObject
*
 * A convenience method that lists all the useful configurable properties
* of the GstElement-s contained in @object.
*
* Returns: an array of GParamSpec of the configurable properties of the
* GstElement-s contained in @object or %NULL if a problem occurred.
*/
GParamSpec **
ges_track_object_list_children_properties (GESTrackObject *object);
-> Usecases: Let the user know all the properties they can configure.
=> Waiting for GESMaterial
/**
* ges_track_object_set_child_property:
*
* @object: The origin #GESTrackObject
* @property_name: The name of the property
* @value: the value
*
* Sets a property of a GstElement contained in @object.
*
*/
void ges_track_object_set_child_property (GESTrackObject *object,
const gchar *property_name,
GValue * value);
-> Usecases:
+ Let user configure effects easily (UC-3)
=> DONE
/**
* ges_track_object_get_child_property:
*
* @object: The origin #GESTrackObject
* @property_name: The name of the property
* @value: return location for the property value
*
* Gets a property of a GstElement contained in @object.
*/
void ges_track_object_get_child_property (GESTrackObject *object,
const gchar *property_name,
GValue * value);
=> DONE
/**
* ges_track_object_get_material:
*
* @object: The origin #GESTrackObject
*
* This is a convenience method to get the #GESMaterial
* from which @object has been made.
*
* Returns: The material from which @object has been made or %NULL
* if @object has been made by another mean
*/
GESMaterial *ges_track_object_get_material (GESTrackObject *object);
=> Waiting for GESMaterial
B. GESTimelineObject new API
signals:
-------
* effect-added: emitted when an effect is added
* effect-removed: emitted when an effect is removed
=> DONE
/**
* ges_timeline_object_add_effect:
*
* @object: The origin #GESTimelineObject
* @effect_material: The #GESEffect from which to create the effect
* @position: The top position you want to give to the effect,
* -1 if you want it to be added at the end of effects.
*
* Adds a new effect corresponding to @effect_material to the
* #GESTimelineObject
*
* Returns: The newly created #GESTrackEffect, or %NULL if there was an
* error.
*/
GESTrackEffect *ges_timeline_object_add_effect (GESTimelineObject *object,
GESEffect *effect_material,
gint position);
=> Waiting for GESMaterial
/**
* ges_timeline_object_get_effects:
*
* @object: The origin #GESTimelineObject
*
* Returns: a #GList of the #GESTrackEffect that are applied on
 * @object, ordered by ascending priority.
* The refcount of the objects will be increased. The user will have to
* unref each #GESTrackOperation and free the #GList.
*/
GList *
ges_timeline_object_get_effects (GESTimelineObject *object);
-> Usecases:
+ First step to allow the configuration of effects (UC-3)
=> DONE
/**
* ges_timeline_object_set_top_effect_position:
*
* @object: The origin #GESTimelineObject
* @effect: The #GESTrackEffect to move
* @newposition: the new position at which to move the @effect
*
 * Returns: %TRUE if @effect was successfully moved, %FALSE otherwise.
*/
gboolean
ges_timeline_object_set_top_effect_position (GESTimelineObject *object,
GESTrackEffect *effect, guint newposition);
=> DONE
/**
* ges_timeline_object_get_top_effect_position:
*
* @object: The origin #GESTimelineObject
* @effect: The #GESTrackEffect we want to get the top position from
*
* Gets the top position of an effect.
*
* Returns: The top position of the effect, -1 if something went wrong.
*/
gint
ges_timeline_object_get_top_effect_position (GESTimelineObject *object,
GESTrackEffect *effect);
=> DONE
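The set/get top-effect-position pair can be sketched with a plain Python list standing in for the priority-ordered effect stack (an illustration, not the GES implementation):

```python
# Sketch of the top-effect position API: effects are kept in a list
# ordered by priority; moving one is a remove followed by an insert at
# the new index, and the position query is a list index lookup.
def set_top_effect_position(effects, effect, newposition):
    if effect not in effects:
        return False
    effects.remove(effect)
    effects.insert(newposition, effect)
    return True

def get_top_effect_position(effects, effect):
    # Mirrors the draft above: -1 if something went wrong.
    return effects.index(effect) if effect in effects else -1
```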
C - The GESTrackEffect API:
-> This is an empty abstract class
=> DONE
D - The GESTrackParseLaunchEffect API:
This is a parse-launch based implementation of TrackEffect.
/**
* ges_track_parse_launch_effect_new:
*
 * @bin_desc: The gst-launch like bin description of the effect
*
* Creates a new #GESTrackEffect from the description of the bin. This is
 * a convenience method for testing purposes.
*
* Returns: a newly created #GESTrackEffect, or %NULL if something went
* wrong.
*/
GESTrackEffect *ges_track_parse_launch_effect_new (GESTrackEffect *effect,
const gchar *bin_desc);
=> DONE
E - The GESTrackMaterialEffect API:
/**
* ges_track_material_effect:
*
* @effect_material: The #GESEffect from which to create this
* #GESTrackEffect
*
* Creates a new #GESTrackEffect from a #GESEffect
*
* Returns: a newly created #GESTrackEffect, or %NULL if something went
* wrong.
*/
GESTrackEffect *ges_track_material_effect_new (GESTrackEffect *effect,
GESEffect *effect_material);
=> Waiting for GESMaterial
F - The GESTimelineEffect API:
-> This is an empty abstract class
=> DONE
-> Usecases: The user wants to control multiple effects in sync. The user
wants to add an effect to the whole timeline. The user wants
to add an effect to a segment of the timeline without caring
about which clip it is applied on.
G - The GESTimelineParseLaunchEffect API:
This is a parse-launch based implementation of TimelineEffect.
/**
* ges_timeline_parse_launch_effect_new_from_bin_desc:
* @video_bin_description: The gst-launch like bin description of the effect
* @audio_bin_description: The gst-launch like bin description of the effect
*
* Creates a new #GESTimelineParseLaunchEffect from the description of the bin.
*
* Returns: a newly created #GESTimelineParseLaunchEffect, or %NULL if something went
* wrong.
*/
GESTimelineParseLaunchEffect *
ges_timeline_parse_launch_effect_new (const gchar * video_bin_description,
const gchar * audio_bin_description)
=> DONE
H - The GESEffect:
The GESEffect class is a subclass of GESMaterial; it is used to describe
effects independently of how they are used in the timeline.
A GESEffect can specify a GESTrackOperation class to use in a
TimelineObject.
All important properties are inherited from GESMaterial such as:
* Name
* Description
* Tags
* ...
We should also be able to list properties of the effect from the GESMaterial.
=> Waiting for GESMaterial
=================
TODO GESRegistry API:
This should be a singleton since we don't want an app to instantiate more
than one registry. It must be able to get effects from various sources.
We should also make sure any custom effect is detected.
/**
* ges_registry_get_default:
*
 * Returns: a newly created #GESEffectRegistry, or the existing one with
 * its refcount increased
*/
GESEffectRegistry *
ges_registry_get_default (void);
-> Usecases:
+ Have a registry of all effects that are on the system (UC-8)
/**
* ges_effect_registry_get_effect_list:
*
* @self: The origin #GESEffectRegistry
*
 * Returns: a #GList of #GESEffectDescriptors.
*/
GList *
ges_registry_get_effect_list (GESEffectRegistry *self);
-> Usecases:
+ Get all effects descriptors that are on the system (UC-8)
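A sketch of how an application could use this draft API to enumerate the
available effects (illustrative only, since the API above is still a TODO):

```c
GESEffectRegistry *registry;
GList *effects, *tmp;

registry = ges_registry_get_default ();
effects = ges_registry_get_effect_list (registry);

for (tmp = effects; tmp; tmp = tmp->next) {
  /* Each item would be a GESEffectDescriptor describing one
   * effect available on the system. */
}
```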
GStreamer: Research into encoding and muxing
--------------------------------------------
Use Cases
---------
This is a list of various use-cases where encoding/muxing is being
used.
* Transcoding
The goal is to convert with as minimal loss of quality any input
file for a target use.
A specific variant of this is transmuxing (see below).
Example applications: Arista, Transmageddon
* Rendering timelines
The incoming streams are a collection of various segments that need
to be rendered.
Those segments can vary in nature (i.e. the video width/height can
change).
This requires the use of identity with the single-segment property
activated to transform the incoming collection of segments to a
single continuous segment.
Example applications: Pitivi, Jokosher
* Encoding of live sources
The major risk to take into account is the encoder not encoding the
incoming stream fast enough. This is outside of the scope of
encodebin, and should be solved by using queues between the sources
and encodebin, as well as implementing QoS in encoders and sources
(the encoders emitting QoS events, and the upstream elements
adapting themselves accordingly).
Example applications: camerabin, cheese
* Screencasting applications
This is similar to encoding of live sources.
The difference being that due to the nature of the source (size and
amount/frequency of updates) one might want to do the encoding in
two parts:
* The actual live capture is encoded with an 'almost-lossless' codec
(such as huffyuv)
* Once the capture is done, the file created in the first step is
then rendered to the desired target format.
Fixing sources to only emit region-updates and having encoders
capable of encoding those streams would fix the need for the first
step but is outside of the scope of encodebin.
Example applications: Istanbul, gnome-shell, recordmydesktop
* Live transcoding
This is the case of an incoming live stream which will be
broadcasted/transmitted live.
One issue to take into account is to reduce the encoding latency to
a minimum. This should mostly be done by picking low-latency
encoders.
Example applications: Rygel, Coherence
* Transmuxing
Given a certain file, the aim is to remux the contents WITHOUT
decoding into either a different container format or the same
container format.
Remuxing into the same container format is useful when the file was
not created properly (for example, the index is missing).
Whenever available, parsers should be applied on the encoded streams
to validate and/or fix the streams before muxing them.
Metadata from the original file must be kept in the newly created
file.
Example applications: Arista, Transmageddon
* Loss-less cutting
Given a certain file, the aim is to extract a certain part of the
file without going through the process of decoding and re-encoding
that file.
This is similar to the transmuxing use-case.
Example applications: Pitivi, Transmageddon, Arista, ...
* Multi-pass encoding
Some encoders allow doing a multi-pass encoding.
The initial pass(es) are only used to collect encoding estimates and
are not actually muxed and outputted.
The final pass uses previously collected information, and the output
is then muxed and outputted.
* Archiving and intermediary format
The requirement is to have lossless encoding.
* CD ripping
Example applications: Sound-juicer
* DVD ripping
Example application: Thoggen
docs/design/encoding.txt
Encoding and Muxing
-------------------
Summary
-------
A. Problems
B. Goals
1. EncodeBin
2. Encoding Profile System
3. Helper Library for Profiles
A. Problems this proposal attempts to solve
-------------------------------------------
* Duplication of pipeline code for gstreamer-based applications
wishing to encode and/or mux streams, leading to subtle differences
and inconsistencies across those applications.
* No unified system for describing encoding targets for applications
in a user-friendly way.
* No unified system for creating encoding targets for applications,
resulting in duplication of code across all applications,
differences and inconsistencies that come with that duplication,
and applications hardcoding element names and settings resulting in
poor portability.
B. Goals
--------
1. Convenience encoding element
Create a convenience GstBin for encoding and muxing several streams,
hereafter called 'EncodeBin'.
This element will only contain one single property, which is a
profile.
2. Define an encoding profile system
3. Encoding profile helper library
Create a helper library to:
* create EncodeBin instances based on profiles, and
* help applications to create/load/save/browse those profiles.
1. EncodeBin
------------
1.1 Proposed API
----------------
EncodeBin is a GstBin subclass.
It implements the GstTagSetter interface, by which it will proxy the
calls to the muxer.
Only two introspectable properties (i.e. usable without extra API):
* A GstEncodingProfile*
* The name of the profile to use
When a profile is selected, encodebin will:
* Add REQUEST sinkpads for all the GstStreamProfile
* Create the muxer and expose the source pad
Whenever a request pad is created, encodebin will:
* Create the chain of elements for that pad
* Ghost the sink pad
* Return that ghost pad
This allows reducing the code to the minimum for applications
wishing to encode a source for a given profile:
...
encbin = gst_element_factory_make("encodebin", NULL);
g_object_set (encbin, "profile", "N900/H264 HQ", NULL);
gst_element_link (encbin, filesink);
...
vsrcpad = gst_element_get_static_pad(source, "src1");
vsinkpad = gst_element_request_pad_simple (encbin, "video_%d");
gst_pad_link(vsrcpad, vsinkpad);
...
1.2 Explanation of the Various stages in EncodeBin
--------------------------------------------------
This describes the various stages which can happen in order to end
up with a multiplexed stream that can then be stored or streamed.
1.2.1 Incoming streams
The streams fed to EncodeBin can be of various types:
* Video
* Uncompressed (but maybe subsampled)
* Compressed
* Audio
* Uncompressed (audio/x-raw-{int|float})
* Compressed
* Timed text
* Private streams
1.2.2 Steps involved for raw video encoding
(0) Incoming Stream
(1) Transform raw video feed (optional)
Here we modify the various fundamental properties of a raw video
stream to be compatible with the intersection of:
* The encoder GstCaps and
* The specified "Stream Restriction" of the profile/target
The fundamental properties that can be modified are:
* width/height
This is done with a video scaler.
The DAR (Display Aspect Ratio) MUST be respected.
If needed, black borders can be added to comply with the target DAR.
* framerate
* format/colorspace/depth
All of this is done with a colorspace converter
(2) Actual encoding (optional for raw streams)
An encoder (with some optional settings) is used.
(3) Muxing
A muxer (with some optional settings) is used.
(4) Outgoing encoded and muxed stream
1.2.3 Steps involved for raw audio encoding
This is roughly the same as for raw video, except for (1)
(1) Transform raw audio feed (optional)
We modify the various fundamental properties of a raw audio stream to
be compatible with the intersection of:
* The encoder GstCaps and
* The specified "Stream Restriction" of the profile/target
The fundamental properties that can be modified are:
* Number of channels
* Type of raw audio (integer or floating point)
* Depth (number of bits required to encode one sample)
1.2.4 Steps involved for encoded audio/video streams
Steps (1) and (2) are replaced by a parser if a parser is available
for the given format.
1.2.5 Steps involved for other streams
Other streams will just be forwarded as-is to the muxer, provided the
muxer accepts the stream type.
2. Encoding Profile System
--------------------------
This work is based on:
* The existing GstPreset system for elements [0]
* The gnome-media GConf audio profile system [1]
* The investigation done into device profiles by Arista and
Transmageddon [2 and 3]
2.2 Terminology
---------------
* Encoding Target Category
A Target Category is a classification of devices/systems/use-cases
for encoding.
Such a classification is required in order for:
* Applications with a very-specific use-case to limit the number of
profiles they can offer the user. A screencasting application has
no use with the online services targets for example.
* Offering the user some initial classification in the case of a
more generic encoding application (like a video editor or a
transcoder).
Ex:
Consumer devices
Online service
Intermediate Editing Format
Screencast
Capture
Computer
* Encoding Profile Target
A Profile Target describes a specific entity for which we wish to
encode.
A Profile Target must belong to at least one Target Category.
It will define at least one Encoding Profile.
Ex (with category):
Nokia N900 (Consumer device)
Sony PlayStation 3 (Consumer device)
Youtube (Online service)
DNxHD (Intermediate editing format)
HuffYUV (Screencast)
Theora (Computer)
* Encoding Profile
A specific combination of muxer, encoders, presets and limitations.
Ex:
Nokia N900/H264 HQ
Ipod/High Quality
DVD/Pal
Youtube/High Quality
HTML5/Low Bandwidth
DNxHD
2.3 Encoding Profile
--------------------
An encoding profile requires the following information:
* Name
This string is not translatable and must be unique.
A recommendation to guarantee uniqueness of the naming could be:
<target>/<name>
* Description
This is a translatable string describing the profile
* Muxing format
This is a string containing the GStreamer media-type of the
container format.
* Muxing preset
This is an optional string describing the preset(s) to use on the
muxer.
* Multipass setting
This is a boolean describing whether the profile requires several
passes.
* List of Stream Profile
2.3.1 Stream Profiles
A Stream Profile consists of:
* Type
The type of stream profile (audio, video, text, private-data)
* Encoding Format
This is a string containing the GStreamer media-type of the encoding
format to be used. If encoding is not to be applied, the raw audio
media type will be used.
* Encoding preset
This is an optional string describing the preset(s) to use on the
encoder.
* Restriction
This is an optional GstCaps containing the restriction of the
stream that can be fed to the encoder.
This will generally contain restrictions on video
width/height/framerate or audio depth.
* presence
This is an integer specifying how many streams can be used in the
containing profile. 0 means that any number of streams can be
used.
* pass
This is an integer which is only meaningful if the multipass flag
has been set in the profile. If it has been set it indicates which
pass this Stream Profile corresponds to.
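The presence rule can be expressed as a one-line check (a sketch with
illustrative names, not part of the proposed API):

```c
#include <stdbool.h>

/* Whether one more stream may be created for a stream profile with the
 * given presence value, given how many streams already exist.
 * presence == 0 means any number of streams is allowed. */
static bool
stream_profile_can_add (unsigned int presence, unsigned int current_streams)
{
  return presence == 0 || current_streams < presence;
}
```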
2.4 Example profile
-------------------
The representation used here is XML only as an example. No decision is
made as to which formatting to use for storing targets and profiles.
<gst-encoding-target>
<name>Nokia N900</name>
<category>Consumer Device</category>
<profiles>
<profile>Nokia N900/H264 HQ</profile>
<profile>Nokia N900/MP3</profile>
<profile>Nokia N900/AAC</profile>
</profiles>
</gst-encoding-target>
<gst-encoding-profile>
<name>Nokia N900/H264 HQ</name>
<description>
High Quality H264/AAC for the Nokia N900
</description>
<format>video/quicktime,variant=iso</format>
<streams>
<stream-profile>
<type>audio</type>
<format>audio/mpeg,mpegversion=4</format>
<preset>Quality High/Main</preset>
<restriction>audio/x-raw-int,channels=[1,2]</restriction>
<presence>1</presence>
</stream-profile>
<stream-profile>
<type>video</type>
<format>video/x-h264</format>
<preset>Profile Baseline/Quality High</preset>
<restriction>
video/x-raw-yuv,width=[16, 800],\
height=[16, 480],framerate=[1/1, 30000/1001]
</restriction>
<presence>1</presence>
</stream-profile>
</streams>
</gst-encoding-profile>
2.5 API
-------
A proposed C API is contained in the gstprofile.h file in this directory.
2.6 Modifications required in the existing GstPreset system
-----------------------------------------------------------
2.6.1. Temporary preset.
Currently a preset needs to be saved on disk in order to be
used.
This makes it impossible to have temporary presets (that exist only
during the lifetime of a process), which might be required in the
new proposed profile system.
2.6.2 Categorisation of presets.
Currently presets are just aliases for a group of property/value
pairs, without any meaning or explanation as to how they exclude
each other.
Take for example the H264 encoder. It can have presets for:
* passes (1,2 or 3 passes)
* profiles (Baseline, Main, ...)
* quality (Low, medium, High)
In order to programmatically know which presets exclude each other,
we here propose the categorisation of these presets.
This can be done in one of two ways:
1. in the name (by making the name be [<category>:]<name>)
This would give for example: "Quality:High", "Profile:Baseline"
2. by adding a new _meta key
This would give for example: _meta/category:quality
2.6.3 Aggregation of presets.
There can be more than one choice of presets to be done for an
element (quality, profile, pass).
This means that one can not currently describe the full
configuration of an element with a single string but with many.
The proposal here is to extend the GstPreset API to be able to set
all presets using one string and a well-known separator ('/').
This change only requires changes in the core preset handling code.
This would allow doing the following:
gst_preset_load_preset (h264enc,
"pass:1/profile:baseline/quality:high");
2.7 Points to be determined
---------------------------
This document has not yet determined how to solve the following
problems:
2.7.1 Storage of profiles
One proposal for storage would be to use a system wide directory
(like $prefix/share/gstreamer-0.10/profiles) and store an XML file for
every individual profile.
Users could then add their own profiles in ~/.gstreamer-0.10/profiles
This poses some limitations as to what to do if some applications
want to have some profiles limited to their own usage.
3. Helper library for profiles
------------------------------
These helper methods could also be added to existing libraries (like
GstPreset, GstPbUtils, ...).
The various API proposed are in the accompanying gstprofile.h file.
3.1 Getting user-readable names for formats
This is already provided by GstPbUtils.
3.2 Hierarchy of profiles
The goal is for applications to be able to present to the user a list
of combo-boxes for choosing their output profile:
[ Category ] # optional, depends on the application
[ Device/Site/.. ] # optional, depends on the application
[ Profile ]
Convenience methods are offered to easily get lists of categories,
devices, and profiles.
3.3 Creating Profiles
The goal is for applications to be able to easily create profiles.
The application needs to have a fast/efficient way to:
* select a container format and see all compatible stream formats that
can be used with it.
* select a codec format and see which container formats can be used
with it.
The remaining parts concern the restrictions to encoder
input.
3.4 Ensuring availability of plugins for Profiles
When an application wishes to use a Profile, it should be able to
query whether it has all the needed plugins to use it.
This part will use GstPbUtils to query and, if needed, install the
missing plugins through the installed distribution's plugin installer.
* Research links
Some of these are still active documents, some other not
[0] GstPreset API documentation
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstPreset.html
[1] gnome-media GConf profiles
http://www.gnome.org/~bmsmith/gconf-docs/C/gnome-media.html
[2] Research on a Device Profile API
http://gstreamer.freedesktop.org/wiki/DeviceProfile
[3] Research on defining presets usage
http://gstreamer.freedesktop.org/wiki/PresetDesign
/* GStreamer encoding bin
* Copyright (C) 2009 Edward Hervey <edward.hervey@collabora.co.uk>
* (C) 2009 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#ifndef __GST_ENCODEBIN_H__
#define __GST_ENCODEBIN_H__
#include <gst/gst.h>
#include <gst/gstprofile.h>
#define GST_TYPE_ENCODE_BIN (gst_encode_bin_get_type())
#define GST_TYPE_ENCODE_BIN (gst_encode_bin_get_type())
#define GST_ENCODE_BIN(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_ENCODE_BIN,GstEncodeBin))
#define GST_ENCODE_BIN_CLASS(klass) (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_ENCODE_BIN,GstEncodeBinClass))
#define GST_IS_ENCODE_BIN(obj) (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_ENCODE_BIN))
#define GST_IS_ENCODE_BIN_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_ENCODE_BIN))
typedef struct _GstEncodeBin GstEncodeBin;
typedef struct _GstEncodeBinClass GstEncodeBinClass;
struct _GstEncodeBin {
  GstBin parent;
  GstProfile *profile;
};
struct _GstEncodeBinClass {
  GstBinClass parent_class;
};
GType gst_encode_bin_get_type(void);
GstElement *gst_encode_bin_new (GstProfile *profile, gchar *name);
gboolean gst_encode_bin_set_profile (GstEncodeBin *ebin, GstProfile *profile);
#endif /* __GST_ENCODEBIN_H__ */
docs/design/gstprofile.h
/* GStreamer encoding profiles library
* Copyright (C) 2009 Edward Hervey <edward.hervey@collabora.co.uk>
* (C) 2009 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#ifndef __GST_PROFILE_H__
#define __GST_PROFILE_H__
#include <gst/gst.h>
typedef enum {
GST_ENCODING_PROFILE_UNKNOWN,
GST_ENCODING_PROFILE_VIDEO,
GST_ENCODING_PROFILE_AUDIO,
GST_ENCODING_PROFILE_TEXT
/* Room for extension */
} GstEncodingProfileType;
typedef struct _GstEncodingTarget GstEncodingTarget;
typedef struct _GstEncodingProfile GstEncodingProfile;
typedef struct _GstStreamEncodingProfile GstStreamEncodingProfile;
typedef struct _GstVideoEncodingProfile GstVideoEncodingProfile;
/* FIXME/UNKNOWNS
*
* Should encoding categories be well-known strings/quarks ?
*
*/
/**
* GstEncodingTarget:
* @name: The name of the target profile.
* @category: The target category (device, service, use-case).
* @profiles: A list of #GstEncodingProfile this target supports.
*
*/
struct _GstEncodingTarget {
gchar *name;
gchar *category;
GList *profiles;
};
/**
* GstEncodingProfile:
* @name: The name of the profile
* @format: The GStreamer mime type corresponding to the muxing format.
* @preset: The name of the #GstPreset(s) to be used on the muxer. This is optional.
* @multipass: Whether this profile is a multi-pass profile or not.
* @encodingprofiles: A list of #GstStreamEncodingProfile for the various streams.
*
*/
struct _GstEncodingProfile {
gchar *name;
gchar *format;
gchar *preset;
gboolean multipass;
GList *encodingprofiles;
};
/**
* GstStreamEncodingProfile:
* @type: Type of profile
* @format: The GStreamer mime type corresponding to the encoding format.
* @preset: The name of the #GstPreset to be used on the encoder. This is optional.
* @restriction: The #GstCaps restricting the input. This is optional.
* @presence: The number of streams that can be created. 0 => any.
*/
struct _GstStreamEncodingProfile {
GstEncodingProfileType type;
gchar *format;
gchar *preset;
GstCaps *restriction;
guint presence;
};
/**
* GstVideoEncodingProfile:
* @profile: common #GstEncodingProfile part.
* @pass: The pass number if this is part of a multi-pass profile. Starts at 1
* for multi-pass. Set to 0 if this is not part of a multi-pass profile.
* @variable_framerate: Do not enforce framerate on incoming raw stream. Default
* is FALSE.
*/
struct _GstVideoEncodingProfile {
GstStreamEncodingProfile profile;
guint pass;
gboolean variable_framerate;
};
/* Generic helper API */
/**
* gst_encoding_category_list_target:
* @category: a profile target category name. Can be NULL.
*
* Returns the list of all available #GstEncodingTarget for the given @category.
* If @category is %NULL, then all available #GstEncodingTarget are returned.
*/
GList *gst_encoding_category_list_target (gchar *category);
/**
* Lists the available profile target categories.
*/
GList *gst_profile_list_target_categories (void);
gboolean gst_profile_target_save (GstProfileTarget *target);
/**
* gst_profile_get_input_caps:
* @profile: a #GstEncodingProfile
*
* Returns: the list of all caps the profile can accept. The caller must call
* gst_caps_unref on all unwanted caps once it is done with the list.
*/
GList * gst_profile_get_input_caps (GstEncodingProfile *profile);
/*
* Application convenience methods (possibly to be added in gst-pb-utils)
*/
/**
* gst_pb_utils_create_encoder:
* @caps: The #GstCaps corresponding to a codec format
* @preset: The name of a preset
* @name: The name to give to the returned instance, can be %NULL.
*
* Creates an encoder which can output the given @caps. If several encoders can
* output the given @caps, then the one with the highest rank will be picked.
* If a @preset is specified, it will be applied to the created encoder before
* returning it.
* If a @preset is specified, then the highest-ranked encoder that can accept
* the given preset will be returned.
*
* Returns: The encoder instance with the preset applied if it is available,
* or %NULL if no encoder is available.
*/
GstElement *gst_pb_utils_create_encoder(GstCaps *caps, gchar *preset, gchar *name);
/**
* gst_pb_utils_create_encoder_format:
*
* Convenience version of @gst_pb_utils_create_encoder except one does not need
* to create a #GstCaps.
*/
GstElement *gst_pb_utils_create_encoder_format(gchar *format, gchar *preset,
gchar *name);
/**
* gst_pb_utils_create_muxer:
* @caps: The #GstCaps corresponding to a codec format
* @preset: The name of a preset
*
* Creates a muxer which can output the given @caps. If several muxers can
* output the given @caps, then the one with the highest rank will be picked.
* If a @preset is specified, it will be applied to the created muxer before
* returning it.
* If a @preset is specified, then the highest-ranked muxer that can accept
* the given preset will be returned.
*
* Returns: The muxer instance with the preset applied if it is available,
* or %NULL if no muxer is available.
*/
GstElement *gst_pb_utils_create_muxer(GstCaps *caps, gchar *preset);
/**
* gst_pb_utils_create_muxer_format:
*
* Convenience version of @gst_pb_utils_create_muxer except one does not need
* to create a #GstCaps.
*/
GstElement *gst_pb_utils_create_muxer_format(gchar *format, gchar *preset,
gchar *name);
/**
* gst_pb_utils_encoders_compatible_with_muxer:
* @muxer: a muxer instance
*
* Finds a list of available encoders whose output can be fed to the given
* @muxer.
*
* Returns: A list of compatible encoders, or %NULL if none can be found.
*/
GList *gst_pb_utils_encoders_compatible_with_muxer(GstElement *muxer);
GList *gst_pb_utils_muxers_compatible_with_encoder(GstElement *encoder);
/*
* GstPreset modifications
*/
/**
* gst_preset_create:
* @preset: The #GstPreset on which to create the preset
* @name: A name for the preset
* @properties: The properties
*
* Creates a new preset with the given properties. This preset will only
* exist during the lifetime of the process.
* If you wish to use it beyond the lifetime of the process, you must call
* @gst_preset_save_preset.
*
* Returns: %TRUE if the preset could be created, else %FALSE.
*/
gboolean gst_preset_create (GstPreset *preset, gchar *name,
GstStructure *properties);
/**
* gst_preset_reset:
* @preset: a #GstPreset
*
* Sets all the properties of the element back to their default values.
*/
/* FIXME : This could actually be put at the GstObject level, or maybe even
* at the GObject level. */
void gst_preset_reset (GstPreset *preset);
#endif /* __GST_PROFILE_H__ */
docs/design/metadata.txt
Metadata
~~~~~~~~
Summary
~~~~~~~
1. Basic ideas
2. Problems
3. Ways of solving problems
4. Use-cases
5. API draft
1. Basic ideas
~~~~~~~~~~~~~~
If we look at entities that are present in GES we can see that almost all of
them need some sort of metadata:
* GESTimeline
* GESTimelineLayer
* GESTimelineObject
* GESTrackObject
* Yet to be implemented GESProject
For all those classes to be able to contain metadata and to avoid code
duplication as much as possible, we should have an interface to handle it.
Let's call the interface GESMetaContainer for now (name to be defined).
2. Problems
~~~~~~~~~~~
1) We must be able to discover all metadata items that are
attached to an object
2) We must be able to hold metadata of any type the user wants
3) Some metadata items are read-only, others are writable
4) The user should be able to query metadata easily using various criteria
5) Metadata should be serializable
6) The user should be able to define read-only metadata items with a default value
7) The user should be able to define metadata items with a specific type that cannot
be changed when setting a new value
3. Possible solution
~~~~~~~~~~~~~~~~~~~~~
1) To implement metadata, GstStructure will be used. It allows getting the
list of all available items by calling "gst_structure_foreach".
2) We will have methods to register metas.
4. Use-cases
~~~~~~~~~~~~
UC-1. Hold tag information about file source asset.
- TS: I think some of them are TrackObject specific... so we should be
able to get them from the 2 types of objects
UC-2. Hold descriptions of operations
UC-3. Hold information about projects (title, author, description)
UC-4. Hold user comments about any of TimelineLayer/Timeline/Project/TimelineObjects
UC-5. Hold application specific settings (e.g. layer height, folding state
in Pitivi)
UC-6. Serialize a timeline or project and keep its metadata
5. API
~~~~~~
We have a GESMetaContainer interface that controls metadata.
gboolean
ges_meta_container_set_boolean (GESMetaContainer *container,
const gchar* meta_item,
gboolean value);
gboolean
ges_meta_container_set_int (GESMetaContainer *container,
const gchar* meta_item,
gint value);
gboolean
ges_meta_container_set_uint (GESMetaContainer *container,
const gchar* meta_item,
guint value);
gboolean
ges_meta_container_set_int64 (GESMetaContainer *container,
const gchar* meta_item,
gint64 value);
gboolean
ges_meta_container_set_uint64 (GESMetaContainer *container,
const gchar* meta_item,
guint64 value);
gboolean
ges_meta_container_set_float (GESMetaContainer *container,
const gchar* meta_item,
gfloat value);
gboolean
ges_meta_container_set_double (GESMetaContainer *container,
const gchar* meta_item,
gdouble value);
gboolean
ges_meta_container_set_date (GESMetaContainer *container,
const gchar* meta_item,
const GDate* value);
gboolean
ges_meta_container_set_date_time (GESMetaContainer *container,
const gchar* meta_item,
const GstDateTime* value);
gboolean
ges_meta_container_set_string (GESMetaContainer *container,
const gchar* meta_item,
const gchar* value);
gboolean
ges_meta_container_set_meta (GESMetaContainer * container,
const gchar* meta_item,
const GValue *value);
gboolean
ges_meta_container_register_meta_boolean (GESMetaContainer *container,
GESMetaFlag flags,
const gchar* meta_item,
gboolean value);
gboolean
ges_meta_container_register_meta_int (GESMetaContainer *container,
GESMetaFlag flags,
const gchar* meta_item,
gint value);
gboolean
ges_meta_container_register_meta_uint (GESMetaContainer *container,
GESMetaFlag flags,
const gchar* meta_item,
guint value);
gboolean
ges_meta_container_register_meta_int64 (GESMetaContainer *container,
GESMetaFlag flags,
const gchar* meta_item,
gint64 value);
gboolean
ges_meta_container_register_meta_uint64 (GESMetaContainer *container,
GESMetaFlag flags,
const gchar* meta_item,
guint64 value);
gboolean
ges_meta_container_register_meta_float (GESMetaContainer *container,
GESMetaFlag flags,
const gchar* meta_item,
gfloat value);
gboolean
ges_meta_container_register_meta_double (GESMetaContainer *container,
GESMetaFlag flags,
const gchar* meta_item,
gdouble value);
gboolean
ges_meta_container_register_meta_date (GESMetaContainer *container,
GESMetaFlag flags,
const gchar* meta_item,
const GDate* value);
gboolean
ges_meta_container_register_meta_date_time (GESMetaContainer *container,
GESMetaFlag flags,
const gchar* meta_item,
const GstDateTime* value);
gboolean
ges_meta_container_register_meta_string (GESMetaContainer *container,
GESMetaFlag flags,
const gchar* meta_item,
const gchar* value);
gboolean
ges_meta_container_register_meta (GESMetaContainer *container,
GESMetaFlag flags,
const gchar* meta_item,
const GValue * value);
gboolean
ges_meta_container_check_meta_registered (GESMetaContainer *container,
const gchar * meta_item,
GESMetaFlag * flags,
GType * type);
gboolean
ges_meta_container_get_boolean (GESMetaContainer *container,
const gchar* meta_item,
gboolean* dest);
gboolean
ges_meta_container_get_int (GESMetaContainer *container,
const gchar* meta_item,
gint* dest);
gboolean
ges_meta_container_get_uint (GESMetaContainer *container,
const gchar* meta_item,
guint* dest);
gboolean
ges_meta_container_get_int64 (GESMetaContainer *container,
const gchar* meta_item,
gint64* dest);
gboolean
ges_meta_container_get_uint64 (GESMetaContainer *container,
const gchar* meta_item,
guint64* dest);
gboolean
ges_meta_container_get_float (GESMetaContainer *container,
const gchar* meta_item,
gfloat* dest);
gboolean
ges_meta_container_get_double (GESMetaContainer *container,
const gchar* meta_item,
gdouble* dest);
gboolean
ges_meta_container_get_date (GESMetaContainer *container,
const gchar* meta_item,
GDate** dest);
gboolean
ges_meta_container_get_date_time (GESMetaContainer *container,
const gchar* meta_item,
GstDateTime** dest);
const gchar *
ges_meta_container_get_string (GESMetaContainer * container,
const gchar * meta_item);
const GValue *
ges_meta_container_get_meta (GESMetaContainer * container,
const gchar * key);
typedef void
(*GESMetaForeachFunc) (const GESMetaContainer *container,
const gchar *key,
const GValue *value,
gpointer user_data);
void
ges_meta_container_foreach (GESMetaContainer *container,
GESMetaForeachFunc func,
gpointer user_data);
gchar *
ges_meta_container_metas_to_string (GESMetaContainer *container);
gboolean
ges_meta_container_add_metas_from_string (GESMetaContainer *container,
const gchar *str);
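As an illustration of the proposed API (the flag names below are an
assumption, since the flags are not spelled out in this draft),
registering and reading back a writable string meta could look like:

```c
/* "container" is any object implementing GESMetaContainer.
 * GES_META_READABLE | GES_META_WRITABLE is an assumed flag combination. */
ges_meta_container_register_meta_string (container,
    GES_META_READABLE | GES_META_WRITABLE,
    "description", "My summer holidays");

/* Later, read it back. */
const gchar *desc =
    ges_meta_container_get_string (container, "description");
```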
docs/design/time_notes.md
# Time Notes
Some notes on time coordinates and time effects in GES.
## Time Coordinate Definitions
A timeline will have a single time coordinate, which runs from `0` onwards in `GstClockTime`. Each track in the timeline will share the same time.
For a given track, at any given timeline time `time`, we have a stack of `GESTrackElement`s whose interval `[start, start + duration]` contains `time`. The elements are linked in order of priority. Each element will have four time coordinates for *each* unique stack it is part of:
+ external sink coordinates: the coordinates used at the boundary between the upstream element and itself. This is the external source coordinates of the upstream element minus `(downstream-start - upstream-start)`. If it has no upstream element, these coordinates do not exist.
+ external source coordinates: the coordinates used at the boundary between the downstream element and itself. This is the external sink coordinates of the downstream element minus `(upstream-start - downstream-start)`. If it has no downstream element, these coordinates can be translated to the timeline coordinates by adding the `start` of the element.
+ internal sink coordinates: the coordinates used for the sink of the first internal `GstElement`. This is the external sink coordinates plus `in-point`.
+ internal source coordinates: the coordinates used at the source of the last internal `GstElement`. This will differ from the internal sink coordinates if one of the `GstElement`s applies a rate-changing effect. This is the external source coordinates plus `in-point`. Note that an element that changes the consumption rate should always have its in-point set to `0`. This is because nleghostpad is not able to 'undo' this shift by `in-point` at the opposite pad.
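A minimal sketch of how these four coordinates relate for one element that applies no rate change (Python used for illustration; the `start` and `in-point` values are hypothetical, not taken from GES):

```python
# Sketch: relation between the per-stack coordinates of an element that
# sits downstream of another element (so its external sink coordinates
# exist). Values are hypothetical.

def external_sink(t_upstream_ext_src, upstream_start, downstream_start):
    # external sink coords = upstream external source coords
    #                        - (downstream-start - upstream-start)
    return t_upstream_ext_src - (downstream_start - upstream_start)

def internal_from_external(t_external, in_point):
    # internal coords = external coords + in-point (holds on both the
    # sink and source side when the element applies no rate change)
    return t_external + in_point

# A time of 15 in the upstream element's external source coordinates:
t = external_sink(15, upstream_start=20, downstream_start=30)
assert t == 5                                         # external sink
assert internal_from_external(t, in_point=5) == 10    # internal sink
```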
The following diagram shows where these coordinates are used, and how they are transformed. Below we have a `GESSource`, followed by a `GESOperation` that does not perform any rate-changing effect, followed by a `GESEffect` that does apply a rate-changing effect (and so its `in-point` is `0`).
```
time coordinate coordinate
coords object transformation transformation
used upwards downwards
______________________________________________________________________________________
| source (1) |
int. src |---------------------------|
| | + in-point-1 - in-point-1
ex. src '==========================='
- start-1 + start-2 + start-1 - start-2
ex. sink .===========================.
| | - in-point-2 + in-point-2
int. sink |---------------------------|
| operation (2) | identity () identity ()
int. src |---------------------------|
| | + in-point-2 - in-point-2
ex. src '==========================='
- start-2 + start-3 + start-2 - start-3
ex. sink .===========================.
| | - 0 + 0
int. sink |---------------------------|
| time effect (3) | f () f^-1 ()
int. src |---------------------------|
| | + 0 - 0
ex. src '==========================='
- start-3 + start-3
timeline +++++++++++++++++++++++++++++
timeline
```
The given function `f` summarises how a seek will transform as it goes from the source to the sink of the internal `GstElement`, and `f^-1` summarises how a segment stream time will transform.
In particular, `f` will be a function
```
f: [0, MAX] -> [0, G_MAXUINT64 - 1]
```
where `MAX` is some `guint64`. For what follows, we will only fully support time effects whose function `f`:
+ is monotonically increasing. This would exclude time effects that play some later content, and then jump back to earlier content.
+ is 'continuously reversible'. We define `T_d` for the time `t` as the set of all the times `t'` such that `|f (t') - t| <= d`. This property requires that, for any `t` between `f`'s minimum and maximum values, we can choose a small `d` such that `T_d` is not empty, is small and has no gaps. The word "small" refers to an unnoticeable difference (the times are in nanoseconds). This means that `f` can be approximately reversed at all points between its minimum and maximum, which means that `f^-1` can act as a close inverse of `f`. For a monotonically increasing function, this means that `f` is *steadily* increasing.
For example, if `f` simply doubles the time, then for time `t = 501`, we can choose `d=1`, and `T_d` would be `{250, 251}`.
This would exclude a time effect which has a large jump, because there would be a time `t` between this jump, whose `T_d` would be empty for all small `d`.
This would also exclude a time effect that creates a freeze-frame effect by always seeking to the same spot, because at the time `t` of this freeze-frame, `T_d` would be large for all `d`.
+ obeys `f (0) = 0`. This would exclude a time effect that introduces an initial shift in the requested source time.
+ has a `MAX` that is large enough. For example, 24 hours would be fine for a timeline. This would exclude a rate effect with a very large speedup.
+ does not depend on any property outside of the effect element, or on the data it receives. This would exclude a time effect that, say, goes faster if there is more red in the image.
In what follows, a time effect that breaks one of these can still be used, but not all the features will work.
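Because `f` is monotonically increasing and 'continuously reversible', an approximate inverse can be computed by bisection. The following is a plain sketch of that idea, not GES's actual implementation:

```python
def approximate_inverse(f, t, max_input):
    """Find the smallest t' with f(t') >= t, assuming f is monotonically
    increasing with f(0) = 0. Plain bisection; illustrative only."""
    lo, hi = 0, max_input
    while lo < hi:
        mid = (lo + hi) // 2
        if f(mid) < t:
            lo = mid + 1
        else:
            hi = mid
    return lo

# f simply doubles the time: for t = 501 the result 251 maps to 502,
# within d = 1 of the requested time, as in the example above.
double = lambda t: 2 * t
assert approximate_inverse(double, 501, 10**9) == 251
```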
### Translations Handled by `nleobject` Pads
An `nleobject` source pad will translate outgoing times by applying
```
time-out = time-in + start - in-point
```
This will translate from the internal source coordinates to the timeline coordinates *if* it is the most downstream element. Similarly, an `nleobject` sink pad will translate incoming times by applying
```
time-out = time-in - start + in-point
```
If we have two `nleobject`s, `object-up` and `object-down`, that have their pads linked, then a time `time-up` from `object-up`'s internal `GstElement` would be translated at the link to
```
time-down
= time-up + start-up - in-point-up - start-down + in-point-down
```
So the pads will overall translate from the internal source coordinates of the upstream element to the internal sink coordinates of the downstream element.
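The two pad translations and their composition at a link can be sketched as follows (illustrative only; the real logic lives in the `nleobject` pad probes):

```python
# Sketch of the nleobject pad translations described above.

def src_pad_out(t, start, in_point):
    # source pad: internal source coords -> outgoing (downstream) time
    return t + start - in_point

def sink_pad_in(t, start, in_point):
    # sink pad: incoming (upstream) time -> internal sink coords
    return t - start + in_point

# Two linked nleobjects: composing both translations maps the upstream
# element's internal source coordinates to the downstream element's
# internal sink coordinates. Values are hypothetical.
t_up = 100
t_down = sink_pad_in(src_pad_out(t_up, start=40, in_point=10),
                     start=25, in_point=5)
assert t_down == t_up + 40 - 10 - 25 + 5   # == 110
```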
### Undefined Translations
Note that the coordinate transformation from the timeline time to an upstream time may be undefined depending on the configuration of elements in the timeline. For example, consider the earlier example stack, with the operation starting later than the time effect, such that
```
d = (start-2 - start-3) > 0
```
And we choose the `time`
```
time = start-2
= start-3 + d
```
Then, when `time` is transformed to the external source coordinates of the operation, we have
```
operation-source-time = f (time - start-3) - start-2 + start-3
= f (d) - d
```
If the time effect slows down the consumption rate, then `f (d) < d`, which would make the time undefined in the external source coordinates (we can not have a negative `GstClockTime`). Basically, the effect is trying to access content that is before the operation.
We can similarly have an effect that tries to access content that is later than the operation, but this wouldn't lead to an underflow of the time. It can however lead to a request for data that is outside the internal content of the operation.
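Since `GstClockTime` is unsigned, the transformation above can be checked for underflow before it is applied. A hypothetical sketch (the helper name and representation are not from GES):

```python
def operation_source_time(time, start_op, start_effect, f):
    """Transform a timeline `time` into the external source coordinates
    of an operation upstream of a time effect, as derived above:
    f(time - start_effect) - start_op + start_effect.
    Returns None when the result would be a negative (undefined)
    clock time. Hypothetical helper, not a GES function."""
    shifted = f(time - start_effect)
    offset = start_op - start_effect
    if shifted < offset:
        return None   # effect requests content from before the operation
    return shifted - offset

# A time effect that slows down consumption: f(d) < d.
half_speed = lambda t: t // 2
# time = start_op: the translated time would be negative, so undefined.
assert operation_source_time(20, start_op=20, start_effect=10,
                             f=half_speed) is None
assert operation_source_time(30, start_op=20, start_effect=10,
                             f=half_speed) == 0
```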
### Mismatched Coordinates
The coordinates of an element are only defined relative to the stack that they are in. However, if we have no time effects, these coordinates will line up. Consider the following source and operation configuration.
```
| source (1) |
|---------------------------|
| |
'==========================='
.===========================.
| |
|---------------------------|
| operation (2) |
|---------------------------|
| |
'==========================='
+++++++++++++++++++++++++++++++++++++++++
timeline
```
This gives us three stacks
```
| (1) | | (1) |
|---------------| |-----------|
| | | |
'===============. '==========='
0 (s1-s2+d1) (s1-s2+d1) d2
0 (s2-s1) (s2-s1) d1
.===========. .===============.
| | | |
|-----------| |---------------|
| (2) | | (2) |
|-----------| |---------------|
| | | |
'===========' '==============='
0 (s2-s1) (s2-s1) d1
+++++++++++++ +++++++++++++++++ +++++++++++++
s1 s2 s2 (s1+d1) (s1+d1) (s2+d2)
```
where we have written the times in the external coordinates of the elements, where `s1` and `d1` are the `start` and `duration` of the source, and similarly `s2` and `d2` for the operation. We can see that the edge times of all coordinates match up with their neighbours. Therefore, for both elements, their coordinates across each stack can be combined into a single coordinate system.
Consider that instead of the operation we have a time effect, then we would have
```
| (1) | | (1) |
|---------------| |-----------|
| | | |
'===============. '==========='
f(s2-s1) f(d1) (s1-s2+d1) d2
-(s2-s1) -(s2-s1)
f(0) f(s2-s1) f(s2-s1) f(d1)
.===========. .===============.
| | | |
|-----------| |---------------|
| (2) | | (2) |
|-----------| |---------------|
| | | |
'===========' '==============='
0 (s2-s1) (s2-s1) d1
+++++++++++++ +++++++++++++++++ +++++++++++++
s1 s2 s2 (s1+d1) (s1+d1) (s2+d2)
```
We can see that the coordinates of the source now start at `f(s2-s1) - (s2-s1)`, rather than `0`. We can also see that the external source coordinates of the source jump by `(d1 - f(d1))` when the time effect ends. Therefore, most time effects will prevent the coordinates from different stacks from being combined. This can lead to counter-intuitive behaviour.
A further example would be a rate effect with `rate=3` that covers two sources that are side by side. The rate effect will **not** treat this as playing the sources concatenated, at triple speed. Instead, it would play the first source at triple speed, and once it reaches the starting timeline time of the second source, it will start playing the second source instead, but starting from the internal source coordinates
```
3 * (source-start - rate-start) - (source-start - rate-start) + source-in-point
= 2 * (source-start - rate-start) + source-in-point
```
Note that if this was a slowed down rate, this would have been an undefined (negative) time, as we mentioned earlier.
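The internal time at which the second source starts playing can be computed directly from the formula above. A numeric sketch with hypothetical `start` values:

```python
def second_source_internal_time(rate, rate_start, source_start, in_point):
    # The rate effect maps a timeline offset to rate * offset; the nle
    # pads then subtract the plain offset and add the source's in-point:
    #   rate*(source_start - rate_start)
    #     - (source_start - rate_start) + in_point
    offset = source_start - rate_start
    return rate * offset - offset + in_point

# rate=3 effect starting at 10, second source starting at 14, in-point 0:
# the second source starts playing from 2 * (14 - 10) == 8.
assert second_source_internal_time(3, 10, 14, 0) == 8
```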
Therefore, in general, time effects should only be placed at a higher priority than elements that share the same `start` and `duration` as it. Note that it is fine to place an operation with a higher priority on top of a time effect with a different `start` or `duration` because this will not lead to a change in the coordinates.
This is why only a `GESSourceClip` can have time effects added to it.
There is a general exception to this: if a time effect obeys `f (0) = 0`, then it will not introduce mismatched coordinates downstream if it has a later `start` than all the elements it has a higher priority than, **and** its end timeline time matches all of theirs. Note that this is because the effect would only exist in a single stack, and starts by applying no change to the times it receives.
## GESTimelineElement times
The `start` and `duration` of an element use the timeline time coordinates. `in-point` and `max-duration` use the internal source coordinates. These last two should be `0` and `GST_CLOCK_TIME_NONE` respectively for time effects.
## How to Translate Between Time Coordinates of a Clip
Consider a `GESTrackElement` `element` in a `GESClip` `clip` in a timeline. It has `n` `active` elements with higher priority in the same `clip` and track, labelled by `i=1,...,n`, where element 1 has a higher priority than element 2, and so on. Each element has an associated function `f_i` that translates from its external source coordinates to its external sink coordinates. Note that for elements that apply no time effect, this will be an identity, regardless of their `in-point`. We can define the function `F`, such that
```
F(t) = f_n (f_n-1 ( ... f_1 (t)...))
```
Note that if each `f_i` has the desired properties, then so will `F`, with the exception that the maximum value it can translate may have become too small. For example, if several rate effects accumulate into a very large speedup.
Given such an `F`, we can translate from the timeline time `t` to the internal source coordinate time of `element` using
```
F (t - start) + in-point
```
This is what is done in `ges_clip_get_internal_time_from_timeline_time`.
Note that this works because all the elements in `clip` share the same `start`. Note that this would not work if there existed an overlapping higher priority time effect outside of the clip because the highest priority clip element would **not** be receiving a timeline time at its source pads. This is not a problem if there are non-time effects at higher priority because they will pass through a timeline time unchanged.
If `F` has the desired properties, it will have a well defined inverse `F^-1`, based on the inverses of `f_i`, which we can use to reverse this translation:
```
F^-1 (t - in-point) + start
```
This is what is done in `ges_clip_get_timeline_time_from_internal_time`.
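A sketch of the two translations above, for a clip whose stacked time effects compose into `F` (with inverse `F^-1`); this mirrors what `ges_clip_get_internal_time_from_timeline_time` and `ges_clip_get_timeline_time_from_internal_time` compute, but is not their actual code:

```python
def internal_from_timeline(t, start, in_point, F):
    # F(t - start) + in-point
    return F(t - start) + in_point

def timeline_from_internal(t, start, in_point, F_inv):
    # F^-1(t - in-point) + start
    return F_inv(t - in_point) + start

# A single rate=3 effect: F triples the time, F^-1 divides by three.
triple = lambda t: 3 * t
third = lambda t: t // 3

internal = internal_from_timeline(25, start=10, in_point=7, F=triple)
assert internal == 3 * 15 + 7   # == 52
assert timeline_from_internal(internal, 10, 7, third) == 25
```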
## `duration-limit`
The `duration-limit` is meant to be the largest value we can set the clip's `duration` to.
It would be given by the minimum
```
ges_clip_get_timeline_time_from_internal_time (clip, child, child-max-duration) - start
```
we calculate amongst all its children that have a `max-duration`. Note that the implementation of `_calculate_duration_limit` does not use this method directly, but it should give the same result.
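A minimal sketch of this minimum, assuming a hypothetical `(in_point, max_duration, F_inv)` representation of the children (not GES's actual data structures):

```python
def duration_limit(children):
    # For each child with a max-duration,
    #   timeline_time_from_internal(max_duration) - start
    # reduces to F_inv(max_duration - in_point), since the `start`
    # term cancels. The duration-limit is the minimum over these.
    limits = [F_inv(max_dur - in_point)
              for (in_point, max_dur, F_inv) in children
              if max_dur is not None]
    return min(limits) if limits else None

identity = lambda t: t
half = lambda t: t // 2   # inverse of a rate=2 (double-speed) effect

children = [(5, 105, identity),   # plain source: limit 100
            (0, 60, half)]        # behind a rate=2 effect: limit 30
assert duration_limit(children) == 30
```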
Note that this would fail if `max-duration` is not reachable through a seek. E.g. if the corresponding function `F` of the time effects acted like
```
F (t) = t + max-duration + 1
```
then `F^-1 (t)` will be undefined for `t=max-duration` because its domain will be `[max-duration + 1, inf)`. Note that this function `F` does not obey `F (0)=0`, so is not supported in GES.
Note that `duration-limit` may not be *exactly* the largest end time possible. If the corresponding function `F` is monotonically increasing, then there is no source time below `max-duration` that could give a larger value, but there may be some times beyond the computed limit that would correspond to the *same* source time. However, these extra times will only differ from the computed limit by a small amount if `F` is 'continuously reversible', and so the limit would be close enough. Otherwise, we would not have a simple way to know which is the actual largest `duration`.
## Trimming a clip
Normally, trimming is meant to keep the internal content in the same position relative to the timeline. If we are applying a non-constant rate effect, it may not be possible to keep all the internal content appearing in the timeline at the same time whilst changing the `start` and `duration`. However, we can keep the start or end frames/samples in the same timeline position.
#### Trimming the start of a clip to a later time
When trimming the start edge of a clip from timeline time `old-start` to `new-start`, where `old-start < new-start <= (old-start + duration)`, we set the `in-point` of the clip's children such that the internal content that appeared at `new-start` before the trim, still appears at `new-start` afterwards.
This would require
```
new-in-point = old-in-point + F (new-start - old-start)
```
because this is the internal source time corresponding to `new-start`.
Note that, after we have finished trimming, *assuming* the corresponding `F` has not changed and `F (0) = 0`,
```
ges_clip_get_internal_time_from_timeline_time (clip, child, new-start)
= F (new-start - new-start) + new-in-point
= new-in-point
```
So after trimming, `new-start` will correspond to the same source position as before. Note that this would not work if the time effects changed depending on the data they receive (such as a "go faster if we have more red" time effect) because the corresponding `F` would have changed after setting the `in-point`. However, we already stated earlier that these are not supported in GES.
#### Trimming the start of a clip to an earlier time
When trimming the start edge of a clip from timeline time `old-start` to `new-start`, where `new-start < old-start`, we set the `in-point` of the clip's children such that the internal content that appeared at `old-start` before the trim, still appears at `old-start` afterwards.
```
new-in-point = old-in-point - F (old-start - new-start)
```
Note that this will fail if the second argument is too big, which indicates that it would be before there is any internal content.
Note that, after we have finished trimming, *assuming* the corresponding `F` has not changed,
```
ges_clip_get_internal_time_from_timeline_time (clip, child, old-start)
= F (old-start - new-start) + new-in-point
= F (old-start - new-start) + old-in-point - F (old-start - new-start)
= old-in-point
```
So after trimming, `old-start` will correspond to the same source time as before.
Note that `ges_clip_get_internal_time_from_timeline_time` will perform this same calculation if it receives a timeline time before the `start` of the clip. So timeline-tree is simply able to call `ges_clip_get_internal_time_from_timeline_time (clip, child, new_start, error)` in both cases.
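The in-point computation for both trim directions can be sketched together (hypothetical helper; unsigned underflow is reported as an error, as in GES, rather than wrapping):

```python
def trimmed_in_point(old_start, new_start, old_in_point, F):
    """New in-point when trimming the start edge of a clip, as derived
    above. Returns None when trimming earlier would reach before any
    internal content. Hypothetical sketch, not GES code."""
    if new_start >= old_start:
        # trimming to a later time
        return old_in_point + F(new_start - old_start)
    # trimming to an earlier time
    delta = F(old_start - new_start)
    if delta > old_in_point:
        return None
    return old_in_point - delta

double = lambda t: 2 * t   # a rate=2 time effect
assert trimmed_in_point(10, 14, 6, double) == 6 + double(4)   # later
assert trimmed_in_point(10, 8, 6, double) == 2                # earlier
assert trimmed_in_point(10, 4, 6, double) is None             # too early
```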
#### Trimming the end of a clip
This is as simple as changing the `duration` of the clip since everything will stay at the same timeline position anyway (assuming `F` does not change, as required by GES). It just cannot go above the clip's `duration-limit`.
## Splitting a clip
The `in-point` of the new clip is chosen to match the new `out-point` of the split clip. This won't work well if different core children of the clip end up with very different `out-point`s. But if these differences are within half a source frame, GES will not complain. The same can happen when trimming a clip, since all the core children must share the same `in-point`.
## Buffer Timestamps
NOTE: As of 21 May, the recommended changes are not implemented in GES. This delves into why translations will be needed for non-linear time effects.
Currently, the `nleobject` pads will leave the buffer times unchanged. Which means that an `nlesource` will send out buffers timestamped using its *internal* source coordinates.
This is in contrast to a `pitch` within an `nleoperation`, which would translate the buffer times from its internal sink coordinates to its internal source coordinates.
Since the internal source coordinates of the `nlesource` do **not** match the internal sink coordinates of the `nleoperation` (they will differ by `in-point`), this will result in buffer times that are not in **any** coordinates.
This will make it difficult to use control bindings which are to be given in stream time, which is linked to the buffer timestamps.
We can explore this in more detail. [According to the GStreamer design docs](https://gstreamer.freedesktop.org/documentation/additional/design/synchronisation.html#stream-time), the stream time is used for
+ reporting the POSITION query in the pipeline
+ the position used in seek events/queries
+ the position used to synchronize controller values
Therefore, in our case, we can say that the stream time at a given position in an nle stack should match the corresponding seek time.
If we have no applied rate, which should be the case for normal uses of a timeline, the `stream-time` is given by
```
stream-time = buffer.timestamp - seg.start + seg.time
```
Thus, the stream time is basically the internal source or sink coordinates. In `GESTrackElement` control sources are meant to be given in the internal source coordinates.
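The stream-time relation above, as a one-line sketch:

```python
def stream_time(buffer_timestamp, seg_start, seg_time):
    # stream-time = buffer.timestamp - seg.start + seg.time
    # (valid when there is no applied rate)
    return buffer_timestamp - seg_start + seg_time

# At an nlesource's internal source pad, seg.time and seg.start are both
# the in-point i (see the table below), so the stream time equals the
# buffer timestamp T, i.e. the internal source coordinates.
i, T = 5, 42
assert stream_time(T, seg_start=i, seg_time=i) == T
```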
We will now look at what these time values **currently** are set to in a stack of an `nlesource` and an `nleoperation` that share the same `start` and `duration`.
Currently, the `nleobject` pads will only change the `seg.time` of the segments it receives by adding or subtracting `(start - in-point)`.
We will assume that the `GstElement` that the `nleoperation` wraps is applying its time effect to `seg.time`, `seg.start` and `buffer.timestamp`, which is given by the function `g`. Note that this is what `pitch` currently does. It's not clear to me what `videorate` does to the `buffer.timestamp`, but it does transform `seg.start` the same way as `seg.time`.
The following is a table of what the `seg.time`, `seg.start` and `buffer.timestamp` values are when *leaving* a pad. The "internal src pad" refers to the source pad of the internal `GstElement`. `s` is the `start` of the objects, and `i` is the `in-point` of the `nlesource`. Following these is what the corresponding stream time would be using these values. The final row is what the corresponding seek position would be coming *into* the pad, if we were seeking to the same media time `T`.
```
nlesrc nlesrc nleop nleop nleop
internal external external internal external
src pad src pad sink pad src pad src pad
seg.time i s 0 g (0) g (0) + s
seg.start i i i g (i) g (i)
buffer. T T T g (T) g (T)
timestamp
------------------------------------------------------------------------
stream T T T g (T) g (T)
time - i - i - g (i) - g (i)
+ s + g (0) + g (0)
+ s
------------------------------------------------------------------------
seek T T T g (T - i) g (T - i)
time - i - i + s
+ s
```
We can see that after the `nleoperation`, the seek time and stream time will generally be out of sync.
Note that if `g` corresponds to a constant rate effect, then
```
g (t) = r * t
```
for some rate `r`. Then, at the `nleoperation`'s external source pad:
```
stream-time = r * T - r * i + r * 0 + s
= g (T - i) + s
= seek-time
```
so the two will match up under the current behaviour for this special case. However, if the rate varies, this will break down.
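A numeric check of this constant-rate special case, following the table values above (hypothetical numbers):

```python
def g(t, r):
    # a constant rate effect: g(t) = r * t
    return r * t

def stream_time_at_op_src(T, i, s, r):
    # from the table: buffer = g(T), seg.start = g(i), seg.time = g(0) + s
    return g(T, r) - g(i, r) + g(0, r) + s

def seek_time_at_op_src(T, i, s, r):
    return g(T - i, r) + s

# With a constant rate the two agree, as derived above.
T, i, s, r = 100, 10, 7, 4
assert stream_time_at_op_src(T, i, s, r) == seek_time_at_op_src(T, i, s, r)
```

With a non-constant `g` the two expressions generally differ, which is the breakdown described in the text.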
If, instead, we translate `seg.start` and `buffer.timestamp` in the same way as `seg.time` on the `nleobject` pads, by adding or subtracting `(start - in-point)`, then we will always have
```
seg.start = seg.time
```
which means that we would also have
```
seek-time = stream-time = buffer.timestamp
```
Finally, it would be good if the convention for a time effect was to use the *output* stream time in `gst_object_sync_values`, rather than the *input* stream time. This would make them compatible with GES's rule that control sources are given in the internal source coordinates. Luckily, it seems that `pitch` already uses the output stream time. `videorate` doesn't currently use `gst_object_sync_values`.

473
docs/gst_plugins_cache.json Normal file
View file

@ -0,0 +1,473 @@
{
"ges": {
"description": "GStreamer Editing Services Plugin",
"elements": {
"gesdemux": {
"author": "Thibault Saunier <tsaunier@igalia.com",
"description": "Demuxer for complex timeline file formats using GES.",
"hierarchy": [
"GESDemux",
"GESBaseBin",
"GstBin",
"GstElement",
"GstObject",
"GInitiallyUnowned",
"GObject"
],
"interfaces": [
"GstChildProxy"
],
"klass": "Codec/Demux/Editing",
"long-name": "GStreamer Editing Services based 'demuxer'",
"pad-templates": {
"audio_src": {
"caps": "audio/x-raw(ANY):\n",
"direction": "src",
"presence": "sometimes"
},
"sink": {
"caps": "application/xges:\ntext/x-xptv:\napplication/vnd.pixar.opentimelineio+json:\napplication/vnd.apple-xmeml+xml:\napplication/vnd.apple-fcp+xml:\n",
"direction": "sink",
"presence": "always"
},
"video_src": {
"caps": "video/x-raw(ANY):\n",
"direction": "src",
"presence": "sometimes"
}
},
"properties": {},
"rank": "primary",
"signals": {}
},
"gessrc": {
"author": "Thibault Saunier <tsaunier@igalia.com",
"description": "Source for GESTimeline.",
"hierarchy": [
"GESSrc",
"GESBaseBin",
"GstBin",
"GstElement",
"GstObject",
"GInitiallyUnowned",
"GObject"
],
"interfaces": [
"GstChildProxy",
"GstURIHandler"
],
"klass": "Codec/Source/Editing",
"long-name": "GStreamer Editing Services based 'source'",
"pad-templates": {
"audio_src": {
"caps": "audio/x-raw(ANY):\n",
"direction": "src",
"presence": "sometimes"
},
"video_src": {
"caps": "video/x-raw(ANY):\n",
"direction": "src",
"presence": "sometimes"
}
},
"properties": {},
"rank": "none",
"signals": {}
}
},
"filename": "gstges",
"license": "LGPL",
"other-types": {
"GESBaseBin": {
"hierarchy": [
"GESBaseBin",
"GstBin",
"GstElement",
"GstObject",
"GInitiallyUnowned",
"GObject"
],
"interfaces": [
"GstChildProxy"
],
"kind": "object",
"properties": {
"timeline": {
"blurb": "Timeline to use in this src.",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"mutable": "null",
"readable": true,
"type": "GESTimeline",
"writable": true
}
}
}
},
"package": "GStreamer Editing Services",
"source": "gst-editing-services",
"tracers": {},
"url": "Unknown package origin"
},
"nle": {
"description": "GStreamer Non Linear Engine",
"elements": {
"nlecomposition": {
"author": "Wim Taymans <wim.taymans@gmail.com>, Edward Hervey <bilboed@bilboed.com>, Mathieu Duponchelle <mathieu.duponchelle@opencreed.com>, Thibault Saunier <tsaunier@gnome.org>",
"description": "Combines NLE objects",
"hierarchy": [
"NleComposition",
"NleObject",
"GstBin",
"GstElement",
"GstObject",
"GInitiallyUnowned",
"GObject"
],
"interfaces": [
"GstChildProxy"
],
"klass": "Filter/Editor",
"long-name": "GNonLin Composition",
"pad-templates": {
"src": {
"caps": "ANY",
"direction": "src",
"presence": "always"
}
},
"properties": {
"drop-tags": {
"blurb": "Whether the composition should drop tags from its children",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "true",
"mutable": "playing",
"readable": true,
"type": "gboolean",
"writable": true
},
"id": {
"blurb": "The stream-id of the composition",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "NULL",
"mutable": "null",
"readable": true,
"type": "gchararray",
"writable": true
}
},
"rank": "none",
"signals": {
"commited": {
"args": [
{
"name": "arg0",
"type": "gboolean"
}
],
"return-type": "void",
"when": "first"
}
}
},
"nleoperation": {
"author": "Wim Taymans <wim.taymans@gmail.com>, Edward Hervey <bilboed@bilboed.com>",
"description": "Encapsulates filters/effects for use with NLE Objects",
"hierarchy": [
"NleOperation",
"NleObject",
"GstBin",
"GstElement",
"GstObject",
"GInitiallyUnowned",
"GObject"
],
"interfaces": [
"GstChildProxy"
],
"klass": "Filter/Editor",
"long-name": "GNonLin Operation",
"pad-templates": {
"sink%%d": {
"caps": "ANY",
"direction": "sink",
"presence": "request"
},
"src": {
"caps": "ANY",
"direction": "src",
"presence": "always"
}
},
"properties": {
"sinks": {
"blurb": "Number of input sinks (-1 for automatic handling)",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "1",
"max": "2147483647",
"min": "-1",
"mutable": "null",
"readable": true,
"type": "gint",
"writable": true
}
},
"rank": "none",
"signals": {
"input-priority-changed": {
"args": [
{
"name": "arg0",
"type": "GstPad"
},
{
"name": "arg1",
"type": "guint"
}
],
"return-type": "void",
"when": "last"
}
}
},
"nlesource": {
"author": "Wim Taymans <wim.taymans@gmail.com>, Edward Hervey <bilboed@bilboed.com>",
"description": "Manages source elements",
"hierarchy": [
"NleSource",
"NleObject",
"GstBin",
"GstElement",
"GstObject",
"GInitiallyUnowned",
"GObject"
],
"interfaces": [
"GstChildProxy"
],
"klass": "Filter/Editor",
"long-name": "GNonLin Source",
"pad-templates": {
"src": {
"caps": "ANY",
"direction": "src",
"presence": "always"
}
},
"properties": {},
"rank": "none",
"signals": {}
},
"nleurisource": {
"author": "Edward Hervey <bilboed@bilboed.com>",
"description": "High-level URI Source element",
"hierarchy": [
"NleURISource",
"NleSource",
"NleObject",
"GstBin",
"GstElement",
"GstObject",
"GInitiallyUnowned",
"GObject"
],
"interfaces": [
"GstChildProxy"
],
"klass": "Filter/Editor",
"long-name": "GNonLin URI Source",
"pad-templates": {
"src": {
"caps": "ANY",
"direction": "src",
"presence": "sometimes"
}
},
"properties": {
"uri": {
"blurb": "Uri of the file to use",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "NULL",
"mutable": "null",
"readable": true,
"type": "gchararray",
"writable": true
}
},
"rank": "none",
"signals": {}
}
},
"filename": "gstnle",
"license": "LGPL",
"other-types": {
"NleObject": {
"hierarchy": [
"NleObject",
"GstBin",
"GstElement",
"GstObject",
"GInitiallyUnowned",
"GObject"
],
"interfaces": [
"GstChildProxy"
],
"kind": "object",
"properties": {
"active": {
"blurb": "Use this object in the NleComposition",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "true",
"mutable": "null",
"readable": true,
"type": "gboolean",
"writable": true
},
"caps": {
"blurb": "Caps used to filter/choose the output stream",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "ANY",
"mutable": "null",
"readable": true,
"type": "GstCaps",
"writable": true
},
"duration": {
"blurb": "Outgoing duration (in nanoseconds)",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "0",
"max": "9223372036854775807",
"min": "0",
"mutable": "null",
"readable": true,
"type": "gint64",
"writable": true
},
"expandable": {
"blurb": "Expand to the full duration of the container composition",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "false",
"mutable": "null",
"readable": true,
"type": "gboolean",
"writable": true
},
"inpoint": {
"blurb": "The media start position (in nanoseconds)",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "18446744073709551615",
"max": "18446744073709551615",
"min": "0",
"mutable": "null",
"readable": true,
"type": "guint64",
"writable": true
},
"media-duration-factor": {
"blurb": "The relative rate caused by this object",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "1",
"max": "1.79769e+308",
"min": "0.01",
"mutable": "null",
"readable": true,
"type": "gdouble",
"writable": true
},
"priority": {
"blurb": "The priority of the object (0 = highest priority)",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "0",
"max": "-1",
"min": "0",
"mutable": "null",
"readable": true,
"type": "guint",
"writable": true
},
"start": {
"blurb": "The start position relative to the parent (in nanoseconds)",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "0",
"max": "18446744073709551615",
"min": "0",
"mutable": "null",
"readable": true,
"type": "guint64",
"writable": true
},
"stop": {
"blurb": "The stop position relative to the parent (in nanoseconds)",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "0",
"max": "18446744073709551615",
"min": "0",
"mutable": "null",
"readable": true,
"type": "guint64",
"writable": false
}
},
"signals": {
"commit": {
"action": true,
"args": [
{
"name": "arg0",
"type": "gboolean"
}
],
"return-type": "gboolean",
"when": "last"
}
}
}
},
"package": "GStreamer Editing Services",
"source": "gst-editing-services",
"tracers": {},
"url": "Unknown package origin"
}
}

Binary file not shown.

After

Width:  |  Height:  |  Size: 27 KiB

68
docs/index.md Normal file
View file

@ -0,0 +1,68 @@
---
short-description: GStreamer Editing Services API reference.
...
# GStreamer Editing Services
The "GStreamer Editing Services" is a library to simplify the creation
of multimedia editing applications. Based on the GStreamer multimedia framework
and the GNonLin set of plugins, its goals are to suit all types of editing-related
applications.
The GStreamer Editing Services are cross-platform and work on most UNIX-like
platforms as well as Windows. They are released under the GNU Library General Public
License (GNU LGPL).
## Goals of GStreamer Editing Services
The GStreamer multimedia framework and the accompanying GNonLin set of
plugins for non-linear editing offer all the building blocks for:
- Decoding and encoding to a wide variety of formats, through all the
available GStreamer plugins.
- Easily choosing segments of streams and arranging them in time
through the GNonLin set of plugins.
But all those building blocks only offer stream-level access, which
means developers who want to write non-linear editors must write a
considerable amount of code to reach the level of *non-linear editing*
notions, which are closer and more meaningful to the end-user (and
therefore the application).
The GStreamer Editing Services (hereafter GES) aim to fill the gap
between GStreamer/GNonLin and the application developer by offering a
series of classes to simplify the creation of many kinds of
editing-related applications.
## Architecture
### Timeline and TimelinePipeline
The most top-level object encapsulating every other object is the
[GESTimeline](GESTimeline). It is the central object for any editing project.
The `GESTimeline` is a `GstElement`. It can therefore be used in any
GStreamer pipeline like any other object.
### Tracks and Layers
The GESTimeline can contain two types of objects (seen in
"Layers and Tracks"):
- Layers - Corresponds to the user-visible arrangement of clips, and
what you primarily interact with as an application developer. A
minimalistic timeline would only have one layer, but a more complex
editing application could use as many as needed.
- Tracks - Corresponds to the output streams in GStreamer. A typical
GESTimeline, aimed at a video editing application, would have an
audio track and a video track. A GESTimeline for an audio editing
application would only require an audio track. Multiple layers can
be related to each track.
![Layers and Tracks](images/layer_track_overview.png)
To further reduce the amount of GStreamer interaction the
application developer has to deal with, a convenience GstPipeline is
made available specifically for timelines: [GESPipeline](GESPipeline).


@ -0,0 +1,32 @@
#### `freq`
Frequency of test signal. The sample rate needs to be at least 2 times higher.
Value type: #gdouble
See #audiotestsrc:freq
#### `mute`
mute channel
Value type: #gboolean
See #volume:mute
#### `volume`
volume factor, 1.0=100%
Value type: #gdouble
See #volume:volume
#### `volume`
Volume of test signal
Value type: #gdouble
See #audiotestsrc:volume


@ -0,0 +1,16 @@
#### `mute`
mute channel
Value type: #gboolean
See #volume:mute
#### `volume`
volume factor, 1.0=100%
Value type: #gdouble
See #volume:volume


@ -0,0 +1,201 @@
#### `alpha`
alpha of the stream
Value type: #gdouble
#### `background-color`
Background color to use (big-endian ARGB)
Value type: #guint
See #videotestsrc:background-color
#### `font-desc`
Pango font description of font to be used for rendering. See documentation of
pango_font_description_from_string for syntax.
Value type: #gchararray
See #GstBaseTextOverlay:font-desc
#### `foreground-color`
Foreground color to use (big-endian ARGB)
Value type: #guint
See #videotestsrc:foreground-color
#### `freq`
Frequency of test signal. The sample rate needs to be at least 2 times higher.
Value type: #gdouble
See #audiotestsrc:freq
#### `halignment`
Horizontal alignment of the text
Valid values:
- **left** (0) left
- **center** (1) center
- **right** (2) right
- **position** (4) Absolute position clamped to canvas
- **absolute** (5) Absolute position
See #GstBaseTextOverlay:halignment
#### `height`
height of the source
Value type: #gint
#### `mute`
mute channel
Value type: #gboolean
See #volume:mute
#### `pattern`
Type of test pattern to generate
Valid values:
- **SMPTE 100% color bars** (0) smpte
- **Random (television snow)** (1) snow
- **100% Black** (2) black
- **100% White** (3) white
- **Red** (4) red
- **Green** (5) green
- **Blue** (6) blue
- **Checkers 1px** (7) checkers-1
- **Checkers 2px** (8) checkers-2
- **Checkers 4px** (9) checkers-4
- **Checkers 8px** (10) checkers-8
- **Circular** (11) circular
- **Blink** (12) blink
- **SMPTE 75% color bars** (13) smpte75
- **Zone plate** (14) zone-plate
- **Gamut checkers** (15) gamut
- **Chroma zone plate** (16) chroma-zone-plate
- **Solid color** (17) solid-color
- **Moving ball** (18) ball
- **SMPTE 100% color bars** (19) smpte100
- **Bar** (20) bar
- **Pinwheel** (21) pinwheel
- **Spokes** (22) spokes
- **Gradient** (23) gradient
- **Colors** (24) colors
See #videotestsrc:pattern
#### `posx`
x position of the stream
Value type: #gint
#### `posy`
y position of the stream
Value type: #gint
#### `text-width`
Resulting width of font rendering
Value type: #guint
See #GstBaseTextOverlay:text-width
#### `text-x`
Resulting X position of font rendering.
Value type: #gint
See #GstBaseTextOverlay:text-x
#### `text-y`
Resulting Y position of font rendering.
Value type: #gint
See #GstBaseTextOverlay:text-y
#### `time-mode`
What time to show
Valid values:
- **buffer-time** (0) buffer-time
- **stream-time** (1) stream-time
- **running-time** (2) running-time
- **time-code** (3) time-code
See #timeoverlay:time-mode
#### `valignment`
Vertical alignment of the text
Valid values:
- **baseline** (0) baseline
- **bottom** (1) bottom
- **top** (2) top
- **position** (3) Absolute position clamped to canvas
- **center** (4) center
- **absolute** (5) Absolute position
See #GstBaseTextOverlay:valignment
#### `video-direction`
Video direction: rotation and flipping
Valid values:
- **GST_VIDEO_ORIENTATION_IDENTITY** (0) identity
- **GST_VIDEO_ORIENTATION_90R** (1) 90r
- **GST_VIDEO_ORIENTATION_180** (2) 180
- **GST_VIDEO_ORIENTATION_90L** (3) 90l
- **GST_VIDEO_ORIENTATION_HORIZ** (4) horiz
- **GST_VIDEO_ORIENTATION_VERT** (5) vert
- **GST_VIDEO_ORIENTATION_UL_LR** (6) ul-lr
- **GST_VIDEO_ORIENTATION_UR_LL** (7) ur-ll
- **GST_VIDEO_ORIENTATION_AUTO** (8) auto
- **GST_VIDEO_ORIENTATION_CUSTOM** (9) custom
See #GstVideoDirection:video-direction
#### `volume`
Volume of test signal
Value type: #gdouble
See #audiotestsrc:volume
#### `volume`
volume factor, 1.0=100%
Value type: #gdouble
See #volume:volume
#### `width`
width of the source
Value type: #gint


@ -0,0 +1,221 @@
#### `alpha`
alpha of the stream
Value type: #gdouble
#### `color`
Color to use for text (big-endian ARGB).
Value type: #guint
See #GstBaseTextOverlay:color
#### `font-desc`
Pango font description of font to be used for rendering. See documentation of
pango_font_description_from_string for syntax.
Value type: #gchararray
See #GstBaseTextOverlay:font-desc
#### `foreground-color`
Foreground color to use (big-endian ARGB)
Value type: #guint
See #videotestsrc:foreground-color
#### `halignment`
Horizontal alignment of the text
Valid values:
- **left** (0) left
- **center** (1) center
- **right** (2) right
- **position** (4) Absolute position clamped to canvas
- **absolute** (5) Absolute position
See #GstBaseTextOverlay:halignment
#### `height`
height of the source
Value type: #gint
#### `outline-color`
Color to use for outline the text (big-endian ARGB).
Value type: #guint
See #GstBaseTextOverlay:outline-color
#### `pattern`
Type of test pattern to generate
Valid values:
- **SMPTE 100% color bars** (0) smpte
- **Random (television snow)** (1) snow
- **100% Black** (2) black
- **100% White** (3) white
- **Red** (4) red
- **Green** (5) green
- **Blue** (6) blue
- **Checkers 1px** (7) checkers-1
- **Checkers 2px** (8) checkers-2
- **Checkers 4px** (9) checkers-4
- **Checkers 8px** (10) checkers-8
- **Circular** (11) circular
- **Blink** (12) blink
- **SMPTE 75% color bars** (13) smpte75
- **Zone plate** (14) zone-plate
- **Gamut checkers** (15) gamut
- **Chroma zone plate** (16) chroma-zone-plate
- **Solid color** (17) solid-color
- **Moving ball** (18) ball
- **SMPTE 100% color bars** (19) smpte100
- **Bar** (20) bar
- **Pinwheel** (21) pinwheel
- **Spokes** (22) spokes
- **Gradient** (23) gradient
- **Colors** (24) colors
See #videotestsrc:pattern
#### `posx`
x position of the stream
Value type: #gint
#### `posy`
y position of the stream
Value type: #gint
#### `shaded-background`
Whether to shade the background under the text area
Value type: #gboolean
See #GstBaseTextOverlay:shaded-background
#### `text`
Text to be displayed.
Value type: #gchararray
See #GstBaseTextOverlay:text
#### `text-height`
Resulting height of font rendering
Value type: #guint
See #GstBaseTextOverlay:text-height
#### `text-width`
Resulting width of font rendering
Value type: #guint
See #GstBaseTextOverlay:text-width
#### `text-x`
Resulting X position of font rendering.
Value type: #gint
See #GstBaseTextOverlay:text-x
#### `text-y`
Resulting Y position of font rendering.
Value type: #gint
See #GstBaseTextOverlay:text-y
#### `valignment`
Vertical alignment of the text
Valid values:
- **baseline** (0) baseline
- **bottom** (1) bottom
- **top** (2) top
- **position** (3) Absolute position clamped to canvas
- **center** (4) center
- **absolute** (5) Absolute position
See #GstBaseTextOverlay:valignment
#### `video-direction`
Video direction: rotation and flipping
Valid values:
- **GST_VIDEO_ORIENTATION_IDENTITY** (0) identity
- **GST_VIDEO_ORIENTATION_90R** (1) 90r
- **GST_VIDEO_ORIENTATION_180** (2) 180
- **GST_VIDEO_ORIENTATION_90L** (3) 90l
- **GST_VIDEO_ORIENTATION_HORIZ** (4) horiz
- **GST_VIDEO_ORIENTATION_VERT** (5) vert
- **GST_VIDEO_ORIENTATION_UL_LR** (6) ul-lr
- **GST_VIDEO_ORIENTATION_UR_LL** (7) ur-ll
- **GST_VIDEO_ORIENTATION_AUTO** (8) auto
- **GST_VIDEO_ORIENTATION_CUSTOM** (9) custom
See #GstVideoDirection:video-direction
#### `width`
width of the source
Value type: #gint
#### `x-absolute`
Horizontal position when using absolute alignment
Value type: #gdouble
See #GstBaseTextOverlay:x-absolute
#### `xpos`
Horizontal position when using clamped position alignment
Value type: #gdouble
See #GstBaseTextOverlay:xpos
#### `y-absolute`
Vertical position when using absolute alignment
Value type: #gdouble
See #GstBaseTextOverlay:y-absolute
#### `ypos`
Vertical position when using clamped position alignment
Value type: #gdouble
See #GstBaseTextOverlay:ypos


@ -0,0 +1,16 @@
#### `border`
The border width
Value type: #guint
See #GESVideoTransition:border
#### `invert`
Whether the transition is inverted
Value type: #gboolean
See #GESVideoTransition:invert


@ -0,0 +1,97 @@
#### `alpha`
alpha of the stream
Value type: #gdouble
#### `background-color`
Background color to use (big-endian ARGB)
Value type: #guint
See #videotestsrc:background-color
#### `foreground-color`
Foreground color to use (big-endian ARGB)
Value type: #guint
See #videotestsrc:foreground-color
#### `height`
height of the source
Value type: #gint
#### `pattern`
Type of test pattern to generate
Valid values:
- **SMPTE 100% color bars** (0) smpte
- **Random (television snow)** (1) snow
- **100% Black** (2) black
- **100% White** (3) white
- **Red** (4) red
- **Green** (5) green
- **Blue** (6) blue
- **Checkers 1px** (7) checkers-1
- **Checkers 2px** (8) checkers-2
- **Checkers 4px** (9) checkers-4
- **Checkers 8px** (10) checkers-8
- **Circular** (11) circular
- **Blink** (12) blink
- **SMPTE 75% color bars** (13) smpte75
- **Zone plate** (14) zone-plate
- **Gamut checkers** (15) gamut
- **Chroma zone plate** (16) chroma-zone-plate
- **Solid color** (17) solid-color
- **Moving ball** (18) ball
- **SMPTE 100% color bars** (19) smpte100
- **Bar** (20) bar
- **Pinwheel** (21) pinwheel
- **Spokes** (22) spokes
- **Gradient** (23) gradient
- **Colors** (24) colors
See #videotestsrc:pattern
#### `posx`
x position of the stream
Value type: #gint
#### `posy`
y position of the stream
Value type: #gint
#### `video-direction`
Video direction: rotation and flipping
Valid values:
- **GST_VIDEO_ORIENTATION_IDENTITY** (0) identity
- **GST_VIDEO_ORIENTATION_90R** (1) 90r
- **GST_VIDEO_ORIENTATION_180** (2) 180
- **GST_VIDEO_ORIENTATION_90L** (3) 90l
- **GST_VIDEO_ORIENTATION_HORIZ** (4) horiz
- **GST_VIDEO_ORIENTATION_VERT** (5) vert
- **GST_VIDEO_ORIENTATION_UL_LR** (6) ul-lr
- **GST_VIDEO_ORIENTATION_UR_LL** (7) ur-ll
- **GST_VIDEO_ORIENTATION_AUTO** (8) auto
- **GST_VIDEO_ORIENTATION_CUSTOM** (9) custom
See #GstVideoDirection:video-direction
#### `width`
width of the source
Value type: #gint


@ -0,0 +1,83 @@
#### `alpha`
alpha of the stream
Value type: #gdouble
#### `fields`
Fields to use for deinterlacing
Valid values:
- **All fields** (0) all
- **Top fields only** (1) top
- **Bottom fields only** (2) bottom
- **Automatically detect** (3) auto
See #deinterlace:fields
#### `height`
height of the source
Value type: #gint
#### `mode`
Deinterlace Mode
Valid values:
- **Auto detection (best effort)** (0) auto
- **Force deinterlacing** (1) interlaced
- **Run in passthrough mode** (2) disabled
- **Auto detection (strict)** (3) auto-strict
See #deinterlace:mode
#### `posx`
x position of the stream
Value type: #gint
#### `posy`
y position of the stream
Value type: #gint
#### `tff`
Deinterlace top field first
Valid values:
- **Auto detection** (0) auto
- **Top field first** (1) tff
- **Bottom field first** (2) bff
See #deinterlace:tff
#### `video-direction`
Video direction: rotation and flipping
Valid values:
- **GST_VIDEO_ORIENTATION_IDENTITY** (0) identity
- **GST_VIDEO_ORIENTATION_90R** (1) 90r
- **GST_VIDEO_ORIENTATION_180** (2) 180
- **GST_VIDEO_ORIENTATION_90L** (3) 90l
- **GST_VIDEO_ORIENTATION_HORIZ** (4) horiz
- **GST_VIDEO_ORIENTATION_VERT** (5) vert
- **GST_VIDEO_ORIENTATION_UL_LR** (6) ul-lr
- **GST_VIDEO_ORIENTATION_UR_LL** (7) ur-ll
- **GST_VIDEO_ORIENTATION_AUTO** (8) auto
- **GST_VIDEO_ORIENTATION_CUSTOM** (9) custom
See #GstVideoDirection:video-direction
#### `width`
width of the source
Value type: #gint


@ -0,0 +1,90 @@
#!/usr/bin/env python3
"""
Simple script to update the children properties information for
GESTrackElement-s that add children properties all the time
"""
import gi
import os
import sys
import textwrap

gi.require_version("Gst", "1.0")
gi.require_version("GObject", "2.0")
gi.require_version("GES", "1.0")

from gi.repository import Gst, GES, GObject

overrides = {
    "GstFramePositioner": False,
    "GstBaseTextOverlay": "timeoverlay",
    "GstVideoDirection": "videoflip",
    "GESVideoTestSource": "GESVideoTestSource",
    "GESVideoTransition": "GESVideoTransition",
}

if __name__ == "__main__":
    Gst.init(None)
    GES.init()
    os.chdir(os.path.realpath(os.path.dirname(__file__)))

    tl = GES.Timeline.new_audio_video()
    layer = tl.append_layer()
    elements = []

    def add_clip(c, add=True, override_name=None):
        c.props.duration = Gst.SECOND
        c.props.start = layer.get_duration()
        layer.add_clip(c)
        if add:
            elements.extend(c.children)
        else:
            if override_name:
                elements.append((c, override_name))
            else:
                elements.append(c)

    add_clip(GES.UriClipAsset.request_sync(Gst.filename_to_uri(os.path.join("../../", "tests/check/assets/audio_video.ogg"))).extract())
    add_clip(GES.TestClip.new())
    add_clip(GES.TitleClip.new())
    add_clip(GES.SourceClip.new_time_overlay(), False, "GESTimeOverlaySourceClip")
    add_clip(GES.TransitionClip.new_for_nick("crossfade"), False)

    for element in elements:
        if isinstance(element, tuple):
            element, gtype = element
        else:
            gtype = element.__gtype__.name
        print(gtype)
        with open(gtype + '-children-props.md', 'w') as f:
            for prop in GES.TimelineElement.list_children_properties(element):
                prefix = '#### `%s`\n\n' % (prop.name)
                prefix_len = len(prefix)
                lines = textwrap.wrap(prop.blurb, width=80)
                doc = prefix + lines[0]
                if GObject.type_is_a(prop, GObject.ParamSpecEnum.__gtype__):
                    lines += ["", "Valid values:"]
                    for value in prop.enum_class.__enum_values__.values():
                        lines.append(" - **%s** (%d) %s" % (value.value_name,
                                     int(value), value.value_nick))
                else:
                    lines += ["", "Value type: #" + prop.value_type.name]
                typename = overrides.get(prop.owner_type.name, None)
                if typename is not False:
                    if typename is None:
                        if GObject.type_is_a(prop.owner_type, Gst.Element):
                            typename = GObject.new(prop.owner_type).get_factory().get_name()
                    lines += ["", "See #%s:%s" % (typename, prop.name)]
                if len(lines) > 1:
                    doc += '\n'
                    doc += '\n'.join(lines[1:])
                print(doc + "\n", file=f)

5
docs/low_level.md Normal file

@ -0,0 +1,5 @@
# Low level APIs
These APIs should usually not be used unless you know
what you are doing; check other parts of the documentation
before deciding to use one of them.

131
docs/meson.build Normal file

@ -0,0 +1,131 @@
build_hotdoc = false
if meson.is_cross_build()
if get_option('doc').enabled()
error('Documentation enabled but building the doc while cross building is not supported yet.')
endif
message('Documentation not built as building it while cross building is not supported yet.')
subdir_done()
endif
required_hotdoc_extensions = ['gi-extension', 'gst-extension']
if gst_dep.type_name() == 'internal'
gst_proj = subproject('gstreamer')
plugins_cache_generator = gst_proj.get_variable('plugins_cache_generator')
else
plugins_cache_generator = find_program(join_paths(gst_dep.get_pkgconfig_variable('libexecdir'), 'gstreamer-' + apiversion, 'gst-plugins-doc-cache-generator'), required: false)
endif
plugins_cache = join_paths(meson.current_source_dir(), 'gst_plugins_cache.json')
if plugins_cache_generator.found()
plugins_doc_dep = custom_target('editing-services-doc-cache',
command: [plugins_cache_generator, plugins_cache, '@OUTPUT@', '@INPUT@'],
input: plugins,
output: 'gst_plugins_cache.json',
build_always_stale: true,
)
else
warning('GStreamer plugin inspector for documentation not found, can\'t update the cache')
endif
hotdoc_p = find_program('hotdoc', required: get_option('doc'))
if not hotdoc_p.found()
message('Hotdoc not found, not building the documentation')
subdir_done()
endif
hotdoc_req = '>= 0.11.0'
hotdoc_version = run_command(hotdoc_p, '--version').stdout()
if not hotdoc_version.version_compare(hotdoc_req)
if get_option('doc').enabled()
error('Hotdoc version @0@ not found, got @1@'.format(hotdoc_req, hotdoc_version))
else
message('Hotdoc version @0@ not found, got @1@, not building documentation'.format(hotdoc_req, hotdoc_version))
subdir_done()
endif
endif
hotdoc = import('hotdoc')
foreach extension: required_hotdoc_extensions
if not hotdoc.has_extensions(extension)
if get_option('doc').enabled()
error('Documentation enabled but @0@ missing'.format(extension))
endif
message('@0@ extensions not found, not building documentation requiring it'.format(extension))
endif
endforeach
if not build_gir
if get_option('doc').enabled()
error('Documentation enabled but introspection not built.')
endif
message('Introspection not built, can\'t build the documentation')
subdir_done()
endif
build_hotdoc = true
ges_excludes = []
foreach f: ['gesmarshal.*',
'ges-internal.*',
'ges-auto-transition.*',
'ges-structured-interface.*',
'ges-structure-parser.*',
'ges-version.h',
'ges-smart-*',
'ges-command-line-formatter.*',
'ges-base-xml-formatter.h',
'gstframepositioner.*',
'lex.priv_ges_parse_yy.c',
'ges-parse-lex.[c]']
ges_excludes += [join_paths(meson.current_source_dir(), '..', '..', 'ges', f)]
endforeach
libs_doc = [hotdoc.generate_doc('gst-editing-services',
project_version: apiversion,
extra_assets: [join_paths(meson.current_source_dir(), 'images')],
gi_c_sources: ges_sources + ges_headers,
gi_c_source_roots: [join_paths(meson.current_source_dir(), '../ges/')],
gi_sources: [ges_gir[0].full_path()],
gi_c_source_filters: ges_excludes,
sitemap: 'sitemap.txt',
index: 'index.md',
gi_index: 'index.md',
gi_smart_index: true,
gi_order_generated_subpages: true,
dependencies: [ges_dep],
disable_incremental_build: true,
)]
plugins_doc = []
list_plugin_res = run_command(python3, '-c',
'''
import sys
import json
with open("@0@") as f:
    print(':'.join(json.load(f).keys()), end='')
'''.format(plugins_cache))
assert(list_plugin_res.returncode() == 0,
'Could not list plugins from @0@\n@1@\n@2@'.format(plugins_cache, list_plugin_res.stdout(), list_plugin_res.stderr()))
foreach plugin_name: list_plugin_res.stdout().split(':')
plugins_doc += [hotdoc.generate_doc(plugin_name,
project_version: apiversion,
sitemap: 'plugins/sitemap.txt',
index: 'plugins/index.md',
gst_index: 'plugins/index.md',
gst_smart_index: true,
gst_c_sources: ['../plugins/*/*.[ch]',],
dependencies: [gst_dep, plugins],
gst_order_generated_subpages: true,
gst_cache_file: plugins_cache,
gst_plugin_name: plugin_name,
)]
endforeach

0
docs/plugins/index.md Normal file

7
docs/plugins/nle.md Normal file

@ -0,0 +1,7 @@
---
short-description: Non Linear Engine
...
# Non Linear Engine
NLE is a set of elements to implement Non Linear multimedia editing.

1
docs/plugins/sitemap.txt Normal file

@ -0,0 +1 @@
gst-index

440
docs/random/design Normal file

@ -0,0 +1,440 @@
GStreamer Editing Services
--------------------------
This is a list of features and goals for the GStreamer Editing
Services.
Some features are already implemented, and some others not. When the
status is not specified, this means it is still not implemented but
might be investigated.
FUNDAMENTAL GOALS:
1) API must be easy to use for simple use-cases. Use and abuse
convenience methods.
2) API must allow as many use-cases as possible, not just the simple
ones.
FEATURES
Index of features:
* Project file load/save support (GESFormatter)
* Grouping/Linking of Multiple TrackObjects
* Selection support (Extension from Grouping/Linking)
* Effects support
* Source Material object
* Proxy support
* Editing modes (Ripple/Roll/Slip/Slide)
* Coherent handling of Content in different formats
* Video compositing and audio mixing
* Handling of alpha video (i.e. transparency)
* Faster/Tighter interaction with GNonLin elements
* Media Asset Management integration
* Templates
* Plugin system
* Project file load/save support (GESFormatter)
Status:
Implemented, requires API addition for all use-cases.
Problems:
Timelines can be stored in many different formats; we need to
ensure it is as easy/trivial as possible for users to load/save
those timelines.
Some timeline formats might have format-specific
sources/objects/effects which need to be handled in certain ways
and therefore provide their own classes.
The object that can save/load GESTimeline are Formatters.
Formatters can offer support for load-only/save-only formats.
There must be a list of well-known GES classes that all formatters
must be able to cope with. If a subclass of one of those classes is
present in a timeline, the formatter will do its best to store
compatible information.
A Formatter can ask for a pre-render of classes that it doesn't
understand (See Proxy section).
Formatters can provide subclasses of well-known GES classes when
filling in the timeline to offer format-specific features.
* Grouping/Linking of Multiple TrackObjects
Status:
Implemented, but doesn't have public API for controlling the
tracked objects or creating groups from TimelineObject(s)
Problems:
In order to make the usage of timelines at the Layer level as easy
as possible, we must be able to group any TrackObject together as
one TimelineObject.
The base GESTimelineObject keeps a reference to all the
GESTrackObjects it is controlling. It contains a mapping of the
position of those track objects relative to the timeline object.
TrackObjects will move and be modified synchronously with the
TimelineObject, and vice-versa.
A TrackObject can be 'unlocked' from the changes of its controlling
TimelineObject. In this case, it will not move or be modified
synchronously with the TimelineObject.
* Selection support (Extension from Grouping/Linking)
Problems:
In order to make user-interface faster to write, we must have a way
to create selections of user-selected TimelineObject(s) or
TrackObject(s) to move them together.
This should be doable by creating a non-visible (maybe not even
inserted in the layer?) TimelineObject.
* Effects support
Status:
Partially Implemented, requires API addition for all use-cases.
Problems:
In order for users to apply multimedia effects in their timelines,
we need an API to search, add and control those effects.
We must be able to provide a list of effects available on the system
at runtime.
We must be able to configure effects through an API in GES without
having to access the GstElements properties directly.
We should also expose the GstElements contained in an effect so it
is possible for people to control their properties as they wish.
We must be able to implement and handle complex effects directly in
GES.
We must be able to configure effects over time (keyframes)
without duplicating code from GStreamer.
* Source Material object
Problems:
Several TimelineSources for the same URI actually have a lot
in common. That information will mostly come from GstDiscoverer,
but could also contain extra information provided by 3rd party
modules.
The information regarding the various streams (obtained through
optionally running GstDiscoverer) is not stored and has to be
re-analyzed elsewhere.
Definition:
Material: n, The substance or substances out of which a thing is or
can be made.
In order to avoid duplicating that information in every single
TimelineSource, a 'Material' object needs to be made available.
A Material object contains all the information which is independent
of the usage of that material in a timeline.
A Material contains the list of 'streams' that can be provided with
as much information as possible (ex: contains audio and video
streams with full caps information, or better yet the output of
GstDiscoverer).
A Material contains the various Metadata (author, title, origin,
copyright, ...).
A Material object can specify the TimelineSource class to use in a
Layer.
* Proxy support
Problems:
Certain content might be impossible to edit on a given setup
for many reasons (too complex to decode in realtime, not in
digital format, not available locally, ...).
In order to be able to store/export timelines to some formats, one
might need to create a pre-render of some items of the
timeline, while retaining as much information as possible.
Content here is not limited to single materials, it could very well
be a complex combination of materials/effects like a timeline or a
collection of images.
To solve this problem, we need a notion of ProxyMaterial.
It is a subclass of Material and as such provides all the same
features as Material.
It should be made easy to create one from an existing TimelineSource
(and it's associated Material(s)), with specifiable rendering
settings and output location.
The user should have the possibility to switch from Proxy materials
to the originals (in order to use the lower
resolution/quality/... version for the editing phase and the
original material for the final rendering phase).
Requires:
GESMaterial
* Editing modes (Ripple/Roll/Slip/Slide)
Status:
Not implemented.
Problems:
Most editing relies on heavy usage of 4 editing tools which editors
will require. Ripple/Roll happen on edit points (between two clips)
and Slip/Slide happen on a clip.
The Ripple tool allows you to modify the beginning/end of a clip
and move the neighbour accordingly. This will change the overall
timeline duration.
The Roll tool allows you to modify the position of an editing point
between two clips without modifying the inpoint of the first clip
nor the out-point of the second clip. This will not change the
overall timeline duration.
The Slip tool allows you to modify the in-point of a clip without
modifying its duration or position in the timeline.
The Slide tool allows you to modify the position of a clip in a
timeline without modifying its duration or its in-point, but will
modify the out-point of the previous clip and in-point of the
following clip so as not to modify the overall timeline duration.
These tools can be used both on TimelineObjects and on
TrackObjects, we need to make sure that changes are propagated
properly.
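As a rough illustration of these four semantics (a hypothetical minimal clip model, not the GES API), each tool is a small transformation over (start, in-point, duration) values:

```python
from dataclasses import dataclass

@dataclass
class Clip:
    start: int     # position of the clip in the timeline
    inpoint: int   # position used inside the source material
    duration: int

def ripple(clip, following, delta):
    # Extend/shrink the clip's end and move every following clip;
    # the overall timeline duration changes.
    clip.duration += delta
    for c in following:
        c.start += delta

def roll(left, right, delta):
    # Move the edit point between two adjacent clips; neither the
    # in-point of `left` nor the out-point of `right` moves, and the
    # overall timeline duration is unchanged.
    left.duration += delta
    right.start += delta
    right.inpoint += delta
    right.duration -= delta

def slip(clip, delta):
    # Change which part of the source is used, without changing the
    # clip's duration or position in the timeline.
    clip.inpoint += delta

def slide(clip, prev, nxt, delta):
    # Move the clip, trimming its neighbours so the overall timeline
    # duration stays the same.
    clip.start += delta
    prev.duration += delta
    nxt.start += delta
    nxt.inpoint += delta
    nxt.duration -= delta
```

For instance, rolling the edit point between two 10-unit clips by 2 units lengthens the first clip to 12 and shortens the second to 8 while the total duration stays at 20.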
* Coherent handling of Content in different formats
Problems:
When mixing content in different format (Aspect-Ratio, Size, color
depth, number of audio channels, ...), decisions need to be made on
whether to conform the material to a common format or not, and on
how to conform that material.
Conforming the material here means bringing it to a common format.
All the information regarding the contents we are handling are
stored in the various GESMaterial. The target format is also known
through the caps of the various GESTracks involved. The Material and
track output caps will allow us to make decisions on what course of
action to take.
By default, content should be conformed with a good balance of
speed while avoiding loss of information.
Ex: If mixing a 4:3 video and a 16:9 video with a target track
aspect ratio of 4:3, we will make the width of the two videos
be equal without distorting their respective aspect-ratios.
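A sketch of that default behaviour (hypothetical helper, not the GES API): conform both sources to a common width while keeping each one's own aspect ratio, so neither image is distorted:

```python
from fractions import Fraction

def conform_widths(sources, target_width):
    """Scale each (width, height) source to the common target width
    without distorting its aspect ratio; heights may differ."""
    conformed = []
    for width, height in sources:
        aspect = Fraction(width, height)
        conformed.append((target_width, round(target_width / aspect)))
    return conformed

# Mixing a 4:3 source (640x480) and a 16:9 source (1280x720)
# in a 4:3 target track, both scaled to a common 640 width:
conform_widths([(640, 480), (1280, 720)], 640)  # → [(640, 480), (640, 360)]
```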
Requires:
GESMaterial
See also:
Video compositing and audio mixing
* Video compositing and audio mixing
Status:
Not implemented. The bare minimum to implement is the static
absolute property handling. Relative/variable properties and group
handling can be done once we know how to handle object grouping.
Problems:
Editing requires not only a linear combination of cuts and
sequences, but also mixing various content/effects at the same
time.
Audio and Video compositing/mixing requires having a set of base
properties for all sources that indicate their positioning in the
final composition.
Audio properties
* Volume
* Panning (or more generally positioning and up-/down-mixing for
multi-channel).
Video properties
* Z-layer (implicit through priority property)
* X,Y position
* Vertical and Horizontal scaling
* Global Alpha (see note below about alpha).
A big problem with compositing/mixing is handling positioning that
could change due to different input/output formats AND avoiding any
quality loss.
Example 1 (video position and scale/aspect-ratio changes):
A user puts a 32x24 logo video at position 10,10 on a 1280x720
video. Later on the user decides to render the timeline to a
different resolution (like 1920x1080) or aspect ratio (4:3 instead
of 16:9).
The overlayed logo should stay at the same relative position
regardless of the output format.
Example 2 (video scaling):
A user decides to overlay a video logo which is originally a
320x240 video by scaling it down to 32x24 on top of a 1280x720
video. Later on the user decides to render a 1920x1080 version of
the timeline.
The resulting rendered 1920x1080 video shall have the overlay
video located at the exact relative position and using a 48x36
downscale of the original overlay video (i.e. avoiding a
320x240=>32x24=>48x36 double-scaling).
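One way to sketch the idea behind Examples 1 and 2 (hypothetical helpers, not the GES API) is to store positioning as fractions of the output frame and resolve them only at render time, so the original material is scaled in a single step:

```python
def to_relative(x, y, w, h, out_w, out_h):
    # Store the overlay's position and size as fractions of the frame.
    return (x / out_w, y / out_h, w / out_w, h / out_h)

def to_absolute(rel, out_w, out_h):
    # Resolve the stored fractions against the actual render resolution.
    rx, ry, rw, rh = rel
    return (round(rx * out_w), round(ry * out_h),
            round(rw * out_w), round(rh * out_h))

# A 320x240 logo shown as 32x24 at (10, 10) on a 1280x720 timeline:
rel = to_relative(10, 10, 32, 24, 1280, 720)

# Rendering at 1920x1080 resolves to (15, 15, 48, 36): the original
# 320x240 material is downscaled once, directly to the final size,
# instead of going through the lossy 32x24 intermediate.
to_absolute(rel, 1920, 1080)  # → (15, 15, 48, 36)
```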
Example 3 (audio volume):
A user adjusts the commentary audio track and the soundtrack audio
track based on the volume of the various videos playing. Later on
the user wants to adjust the overall audio volume in order for the
final output to conform to a target RMS/peak volume.
The resulting relative volumes of each track should be the same
WITHOUT any extra loss of audio quality (i.e. avoiding a
downscale/upscale lossy volume conversion cycle).
Example 4 (audio positioning):
A user adjusts the relative panning/positioning of the commentary,
soundtrack and sequence for a 5.1 mix. Later on the user decides to
make a 7.1 and a stereo rendering.
The resulting relative positioning should be kept as much as
possible (left/right downmix and re-positioning for extra 2
channels in the case of 7.1 upmixing) WITHOUT any extra loss in
quality.
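The volume case (Examples 3 and 4) can be sketched the same way (hypothetical helper, not the GES API): keep per-track relative gains, and fold the master adjustment into a single multiplication per track at render time:

```python
def render_gains(track_gains, target_peak):
    # Combine each track's relative gain with the master gain in one
    # multiplication per track, preserving the relative balance
    # between tracks and avoiding a lossy down/up conversion cycle
    # on the audio samples themselves.
    master = target_peak / max(track_gains.values())
    return {name: gain * master for name, gain in track_gains.items()}

gains = {"commentary": 0.8, "soundtrack": 0.4}
# Normalizing to a 1.0 peak keeps the 2:1 commentary/soundtrack ratio.
render_gains(gains, 1.0)
```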
Create a new 'CompositingProperties' object for audio and video
which is an extensible set of properties for media-specific
positioning. This contains the properties mentioned above.
Add the CompositingProperties object to the base GESTrackObject
which points to the audio or video CompositingProperties
object (depending on what format that object is handling).
Provide convenience functions to retrieve and set the audio or video
compositing properties of a GESTrackObject. Do the same for the
GESTimelineObject, which proxies it to the relevant GESTrackObject.
Create a new GESTrack{Audio|Video}Compositing GstElement which will
be put in each track as a priority 0 expandable NleOperation.
That object will be able to figure out which
mixing/scaling/conversion elements to use at any given time by
inspecting:
* The various GESTrackObject Compositing Properties
* The various GESTrackObject GESMaterial stream properties
* The GESTrack target output GstCaps
The properties values could be both set/stored as 'relative' values
and as 'absolute' values in order to handle any input/output formats
or setting.
Objects that are linked/grouped with others have their properties
move in sync with each other. (Ex: If an overlay logo is locked to a
video, it will scale/move/be-transparent in sync with the video on
which it is overlayed).
Objects that are not linked/grouped to other objects have their
properties move in sync with the target format. If the target format
changes, all object positioning will change relatively to that
format.
Requires:
GESMaterial
See also:
Coherent handling of Content in different formats
* Handling of alpha video (i.e. transparency)
Problem:
Some streams will contain partial transparency (overlay
logos/videos, bluescreen, ...).
Those streams need to be handleable by the user just like
non-alpha videos, without losing the transparency regions (i.e. they
should be properly blended with the underlying regions).
* Faster/Tighter interaction with GNonLin elements
Problems:
A lot of properties/concepts need to be duplicated at the GES level
since the only way to communicate with the GNonLin elements is
through publicly available APIs (GObject and GStreamer APIs).
The GESTrackObject, for example, has to duplicate exactly the same
properties as NleObject for no reason.
Other properties are expensive to re-compute and also become
non-MT-safe (like computing the exact 'tree' of objects at a
certain position in an NleComposition).
Merge the GES and GNonLin modules together into one single module,
and keep the same previous API for both for backward compatibility.
Add additional APIs to GNonLin which GES can use.
* Media Asset Management integration
(Track, Search, Browse, Push content) TBD
* Templates
Problem:
In order to create professional-looking timelines as quickly as
possible, we need to provide a way to create 'templates' which
users can select to get an automatic timeline 'look'.
This will allow users to quickly add their clips, set
titles and get a timeline with a professional look. This is
similar to the feature that iMovie offers both on desktop and
iOS.
* Plugin system
Problem:
All GES classes are designed in such a way that new sources,
effects, templates, formatters, etc. can easily be added,
either to the GES codebase itself or to applications.
But in order to provide more features without depending on GES
releases, to limit some features to a single application, or to
provide 'closed'/3rd-party features, we need to implement
a plugin system so one can add new features.
Use a registry system similar to GStreamer.

docs/random/lifecycle Normal file
@ -0,0 +1,18 @@
Lifecycle of a Timeline/Track Object
* Adding a TimelineObject to a Layer
(tlobj:timelineobject, trobj:trackobject)
ges_timeline_layer_add_object(layer, tlobj)
signal_emit "object-added", layer, tlobj
GESTimeline receives signal
for each TRACK {
ges_timeline_object_create_track_objects(tlobj, TRACK)
trobj = GESTimelineObject::create_track_objects
ges_track_add_object(TRACK, trobj)
ges_track_object_set_track(trobj, TRACK)
nleobj = GESTrackObject::create_gnl_object
ges_timeline_object_fill_track_object(tlobj, trobj, nleobj)
GESTimelineObject::fill_track_object
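The call cascade above can be modelled as a self-contained toy in plain C
(no GES/GStreamer types; every name is illustrative) that only records the
order in which the steps run:

```c
#include <assert.h>
#include <string.h>

/* Records the sequence of lifecycle steps, mirroring the pseudocode above. */
static char trace[256];

static void
step (const char *name)
{
  strcat (trace, name);
  strcat (trace, ";");
}

static void
track_object_set_track (void)
{
  step ("set_track");
  step ("create_gnl_object");       /* no nleobject yet: create it */
  step ("fill_track_object");       /* ask the TimelineObject to fill it */
}

static void
track_add_object (void)
{
  step ("track_add_object");
  track_object_set_track ();
}

/* Timeline's handler for the layer's 'object-added' signal: for each
 * track, request a TrackObject then add it to the track. */
static void
on_object_added (void)
{
  step ("create_track_object");
  track_add_object ();
}

static void
layer_add_object (void)
{
  step ("layer_add_object");
  on_object_added ();               /* 'object-added' emission */
}
```

Running `layer_add_object ()` once produces the exact ordering the
lifecycle describes, from layer insertion down to filling the NleObject.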

docs/random/mapping.txt Normal file
@ -0,0 +1,99 @@
Mapping Timeline position to Track position
-------------------------------------------
TrackObject/TimelineObject basic properties (hereafter position):
start
duration
in-point
priority
Use Cases:
A TimelineObject tracks one or many TrackObject(s).
When the TimelineObject position is modified we might need
to cascade those changes to the controlled TrackObject(s) if those
TrackObject(s) are 'locked' to the TimelineObject.
If we modify the position of a TrackObject that is 'locked' to the
TimelineObject, we need to ensure all the other co-related
TrackObject(s) belonging to the same TimelineObject are moved in
the same way.
A TrackObject can be temporarily 'unlocked' from its TimelineObject,
so as to move it independently, and then 'locked' back to it. This
can allow moves, like shifting audio trackobject in relation to the
video trackobject (to fix sync issues) and then 'lock' them back so
as to be able to move them as one entity thereafter.
When adding TimelineOverlay(s) or TimelineEffect(s) on a
TimelineObject, we need to ensure the TrackObject(s) that those extra
effects will create can be added with specific priority offsets, in
such a way that they always end up "on top" of the TimelineObject's
existing tracked TrackObject(s).
When a controlled TrackObject is moved while 'unlocked', we need
to make sure the duration/height of the TimelineObject is updated
accordingly. Ex: moving a TrackObject down by one priority should
increase the TimelineObject "height" property by 1.
A TimelineObject might want to have tighter control over which
Track(s) each of the TrackObjects it is controlling goes to. This
is more obvious in the case of a timeline with multiple Tracks of the
same kind, or if a TimelineObject can produce multiple TrackObjects
of the same media type (ex: file with multiple audio tracks).
Main Problem:
There needs to be a mapping between the TimelineObject basic
properties and its controlled TrackObject(s) position.
Design:
The TimelineObject listens to the TrackObject 'notify' signals.
When it sets a property on its trackobjects, it 'ignores' all
notifications that happen while setting them.
Setting a property on a TrackObject changes that property locally,
after which the TrackObject emits a 'notify' for the modified property.
TrackObject::locked
ges_track_object_set_locked()
ges_track_object_is_locked()
Mapping {
GESTrackObject *object;
gint64 start_offset;
gint64 duration_offset;
gint64 inpoint_offset;
gint32 priority_offset;
/* Track ??? */
}
P : property
V : value
TimelineObject set_property(P,V)
ignore_notifies = TRUE
parent.P = V
foreach child in trackobjects:
if child.is_locked():
child.set_property(P, parent.P - mapping(child).P_offset)
ignore_notifies = FALSE
TimelineObject child 'notify::P' handler:
if ignore_notifies:
return
if not child.is_locked():
mapping(child).P_offset = timeline.P - child.P
else:
TimelineObject.set_property(P, child value + mapping(child).P_offset)
TrackObject set_property(P, V)
update the property locally (P = V)
emit 'notify::P' signal
TODO : When do we resync the parent values to have minimal offsets ?
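The offset bookkeeping above, for the single 'start' property, can be
sketched as self-contained plain C (all names illustrative, not GES code).
It keeps the convention offset = parent.P - child.P, i.e.
child.start == parent.start - mapping(child).start_offset:

```c
#include <assert.h>
#include <stdint.h>

typedef struct {
  int64_t start;         /* the TrackObject's own position */
  int64_t start_offset;  /* mapping offset: parent.start - child.start */
  int locked;            /* is it 'locked' to its TimelineObject? */
} Child;

typedef struct {
  Child *children;
  unsigned n_children;
  int64_t start;         /* the TimelineObject's position */
} Parent;

/* TimelineObject set_property: cascade to every locked child
 * through its stored offset. */
static void
parent_set_start (Parent * p, int64_t start)
{
  unsigned i;

  p->start = start;
  for (i = 0; i < p->n_children; i++)
    if (p->children[i].locked)
      p->children[i].start = start - p->children[i].start_offset;
}

/* TrackObject moved directly (the 'notify' handler above): an unlocked
 * child only records its new offset; a locked one drags the parent,
 * and through it every other locked sibling. */
static void
child_set_start (Parent * p, Child * c, int64_t start)
{
  c->start = start;
  if (!c->locked)
    c->start_offset = p->start - c->start;
  else
    parent_set_start (p, start + c->start_offset);
}
```

Moving an unlocked child only changes its mapping; re-locking it and moving
it again shifts the parent (and any locked siblings) by the same amount.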

@ -0,0 +1,103 @@
<!DOCTYPE html>
<html>
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" >
<title> Rework the GStreamer Editing Services class hierarchy </title>
<xmp theme="cerulean" style="display:none;">
Reasoning:
----------
All the time (position) related concepts are shared between GESTimelineObject and GESTrackObject
and currently are repeated at the 2 levels.
Moreover, if we want to add the concept of Group we end up with something quite similar to the current
GESTimelineObject, except that it contains GESTimelineObject-s instead of GESTrackObject-s, so we could share
that information by creating a new class aiming at containing the objects that have that
notion of timing.
At the same time, we want to clarify namings. First, we should remove the word Object from class names;
we have been told various times that it sounds just "wrong", as objects are instances whereas here
we are talking about classes.
Class Hierarchy:
-------------
<pre><code>
<table>
<tr>
<td>
Before:
-------
GESTimelineObject
    GESTimelineSource
        GESCustomTimelineSource
        GESTimelineTestSource
        GESTimelineFileSource
        GESTimelineTitleSource
    GESTimelineOperation
        GESTimelineOverlay
            GESTimelineTextOverlay
        GESTimelineTransition
            GESTimelineStandardTransition
        GESTimelineEffect
            GESTimelineParseLaunchEffect
GESTimelineLayer
    GESSimpleTimelineLayer
GESTrackObject
    GESTrackSource
        GESTrackAudioTestSource
        GESTrackFileSource
        GESTrackImageSource
        GESTrackTitleSource
        GESTrackVideoTestSource
    GESTrackOperation
        GESTrackTransition
            GESTrackAudioTransition
            GESTrackVideoTransition
        GESTrackEffect
            GESTrackParseLaunchEffect
        GESTrackTextOverlay
</td>
<td>
After:
-------
GESTimelineElement
    GESContainer
        GESClip
            GESSourceClip
                GESCustomSourceClip
                GESTestClip
                GESUriClip
                GESTitleClip
            GESOperationClip
                GESOverlayClip
                    GESTextOverlayClip
                GESBaseTransitionClip
                    GESTransitionClip
                GESBaseEffectClip
                    GESEffectClip
        GESClipGroup
    GESTrackElement
        GESSource
            GESAudioTestSource
            GESUriSource
            GESImageSource
            GESTitleSource
            GESVideoTestSource
        GESOperation
            GESTransition
                GESAudioTransition
                GESVideoTransition
            GESBaseEffect
                GESEffect
            GESTextOverlay
</td>
</tr>
</table>
</code></pre>
</xmp>
<script src="http://strapdownjs.com/v/0.1/strapdown.js"></script>
</html>

docs/random/scenarios Normal file
@ -0,0 +1,143 @@
SCENARIOS
* Adding a TimelineObject to a TimelineLayer
--------------------------------------------
* Create a Timeline
* Create a Track
* Add the track to the Timeline (==> ges_timeline_add_track (track);)
The Timeline adds the Track to itself (i.e. gst_bin_add())
'track-added' is emitted
* Create a TimelineLayer
* Add the TimelineLayer to the Timeline (ges_timeline_add_layer (layer);)
The Timeline takes a reference on the layer and stores it
The Timeline tells the TimelineLayer that it now belongs to the given Timeline (weak reference)
==> ges_timeline_layer_set_timeline ();
'layer-added' is emitted
* Create a TimelineObject
* Add the TimelineObject to the TimelineLayer (ges_timeline_layer_add_object (object);)
The TimelineLayer takes a reference on the TimelineObject and stores it
The TimelineLayer tells the TimelineObject that it now belongs to the given layer (weak reference)
==> ges_timeline_object_set_layer ();
'object-added' is emitted by TimelineLayer
The Timeline requests a new TrackObject from the new TimelineObject for each Track
==> ges_timeline_object_create_track_object (track)
The TimelineObject calls the 'create_track_object' virtual method with the given track
Example implementation
Create a GESTrackSource
(GESTimelineObject is a constructor property of track objects)
A GESTrackObject CAN NOT EXIST WITHOUT A GESTimelineObject !
The Timeline adds the newly created TrackObject to the Track
==> ges_track_add_object (track, trackobject);
Set the track on the TrackObject
==> ges_track_object_set_track (track)
The GESTrackObject can create the NleObject
Methods
-------
[ GESTimeline ]
* gboolean
ges_timeline_add_track (GESTimeline * timeline, GESTrack * track);
* The Timeline adds the track to itself (gst_bin_add ()) # reference implicitly taken
* The Timeline adds the track to its list of tracked tracks
* The Timeline sets the Timeline on the track
=> ges_track_set_timeline (GESTrack * track, GESTimeline * timeline);
Just sets the timeline field of the track.
* emits 'track-added'
* gboolean
ges_timeline_add_layer (GESTimeline * timeline, GESTimelineLayer * layer);
* The Timeline takes a reference on the layer and stores it
* The Timeline tells the Layer that it now belongs to the given Timeline
=> ges_timeline_layer_set_timeline (GESTimelineLayer * layer, GESTimeline * timeline);
Just sets the timeline field of the layer.
* Connect to the layer's 'object-added' signal
* emits 'layer-added'
* GESTimeline's
callback for GESTimelineLayer::object-added (GESTimelineLayer * layer, GESTimelineObject * object);
* For each GESTrack in the Timeline:
* The timeline requests a new TrackObject from the new TimelineObject
trackobj = ges_timeline_object_create_track_object (timelineobj, track);
* The timeline adds the newly created TrackObject to the track
ges_track_add_object (track, trackobj);
[ GESTimelineLayer ]
* gboolean
ges_timeline_layer_add_object (GESTimelineLayer * layer, GESTimelineObject * object);
* The TimelineLayer takes a reference on the TimelineObject and stores it
* The TimelineLayer tells the TimelineObject it now belongs to the given Layer
=> ges_timeline_object_set_layer (GESTimelineObject * object, GESTimelineLayer * layer);
Just sets the layer field of the timeline object.
* emits 'object-added'
[ GESTimelineObject ]
* GESTrackObject *
ges_timeline_object_create_track_object (GESTimelineObject * object, GESTrack * track);
* The TimelineObject calls the 'create_track_object' virtual method
* The TimelineObject sets the TimelineObject on the new TrackObject
=> ges_track_object_set_timeline_object (track_object, timeline_object);
Just sets the timeline-object field of the TrackObject
* Return the newly created GESTrackObject
* Virtual-method for GESTimelineObject::create_track_object (GESTimelineObject * object, GESTrack * track);
* Create a track object of the proper type
Ex (for a source) :
return ges_track_source_new();
* gboolean
ges_timeline_object_fill_track_object (GESTimelineObject *tlo, GESTrackObject *tro, GstElement *nleobj);
* up to the implementation :)
[ GESTrack ]
* gboolean
ges_track_add_object (GESTrack * track, GESTrackObject * object);
* Set the track on the track_object
ges_track_object_set_track (object, track);
* Add the NleObject of the TrackObject to the composition
gst_bin_add (track->composition, object->nleobject);
[ GESTrackObject ]
* gboolean
ges_track_object_set_track (GESTrackObject * object, GESTrack * track);
* Set the track field of the TrackObject
* if no NleObject is available yet:
* Call the 'create_gnl_object' virtual method
* Virtual-method for GESTrackObject::create_gnl_object
* Create a NleObject of the proper type
Ex : nleobject = gst_element_factory_make("nlesource", NULL);
* Ask the TimelineObject to fill in the NleObject
=> ges_timeline_object_fill_track_object (GESTimelineObject * tlo, GESTrackObject * tro, GstElement * nleobj);

docs/sitemap.txt Normal file
@ -0,0 +1,65 @@
gi-index
ges.h
ges-timeline.h
ges-layer.h
ges-clip.h
ges-uri-clip.h
ges-title-clip.h
ges-test-clip.h
ges-time-overlay-clip.h
ges-effect-clip.h
ges-transition-clip.h
ges-pipeline.h
ges-project.h
base-classes.md
ges-timeline-element.h
ges-container.h
ges-track.h
ges-audio-track.h
ges-video-track.h
ges-asset.h
ges-uri-asset.h
ges-clip-asset.h
ges-effect-asset.h
ges-track-element-asset.h
ges-source-clip-asset.h
ges-effect.h
ges-extractable.h
ges-group.h
ges-meta-container.h
ges-marker-list.h
ges-formatter.h
ges-xml-formatter.h
ges-track-element.h
ges-video-source.h
ges-audio-source.h
ges-audio-test-source.h
ges-audio-uri-source.h
ges-video-uri-source.h
ges-video-test-source.h
ges-title-source.h
ges-text-overlay.h
ges-gerror.h
ges-types.h
ges-enums.h
ges-utils.h
low_level.md
ges-base-xml-formatter.h
ges-command-line-formatter.h
ges-audio-transition.h
ges-base-effect-clip.h
ges-base-effect.h
ges-base-transition-clip.h
ges-operation-clip.h
ges-operation.h
ges-overlay-clip.h
ges-source-clip.h
ges-source.h
ges-text-overlay-clip.h
ges-transition.h
ges-video-transition.h
ges-prelude.h
deprecated.md
ges-pitivi-formatter.h
ges-image-source.h
ges-multi-file-source.h

docs/version.entities.in Normal file
@ -0,0 +1,2 @@
<!ENTITY GST_API_VERSION "@GST_API_VERSION@">
<!ENTITY GES_VERSION "@VERSION@">

docs/working-diagrams.svg Normal file
@ -0,0 +1,563 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
width="1052.3622"
height="744.09448"
id="svg2"
sodipodi:version="0.32"
inkscape:version="0.46"
sodipodi:docname="working-diagrams.svg"
inkscape:output_extension="org.inkscape.output.svg.inkscape"
version="1.0">
<defs
id="defs4">
<marker
inkscape:stockid="Arrow2Lend"
orient="auto"
refY="0"
refX="0"
id="Arrow2Lend"
style="overflow:visible">
<path
id="path3227"
style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
d="M 8.7185878,4.0337352 L -2.2072895,0.016013256 L 8.7185884,-4.0017078 C 6.97309,-1.6296469 6.9831476,1.6157441 8.7185878,4.0337352 z"
transform="matrix(-1.1,0,0,-1.1,-1.1,0)" />
</marker>
<marker
inkscape:stockid="Arrow1Lend"
orient="auto"
refY="0"
refX="0"
id="Arrow1Lend"
style="overflow:visible">
<path
id="path3209"
d="M 0,0 L 5,-5 L -12.5,0 L 5,5 L 0,0 z"
style="fill-rule:evenodd;stroke:#000000;stroke-width:1pt;marker-start:none"
transform="matrix(-0.8,0,0,-0.8,-10,0)" />
</marker>
<marker
inkscape:stockid="Arrow1Lstart"
orient="auto"
refY="0"
refX="0"
id="Arrow1Lstart"
style="overflow:visible">
<path
id="path3206"
d="M 0,0 L 5,-5 L -12.5,0 L 5,5 L 0,0 z"
style="fill-rule:evenodd;stroke:#000000;stroke-width:1pt;marker-start:none"
transform="matrix(0.8,0,0,0.8,10,0)" />
</marker>
<inkscape:perspective
sodipodi:type="inkscape:persp3d"
inkscape:vp_x="0 : 526.18109 : 1"
inkscape:vp_y="0 : 1000 : 0"
inkscape:vp_z="744.09448 : 526.18109 : 1"
inkscape:persp3d-origin="372.04724 : 350.78739 : 1"
id="perspective10" />
</defs>
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
gridtolerance="10000"
guidetolerance="10"
objecttolerance="10"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="1.4142136"
inkscape:cx="358.45699"
inkscape:cy="275.3079"
inkscape:document-units="px"
inkscape:current-layer="layer1"
showgrid="false"
inkscape:window-width="1680"
inkscape:window-height="1031"
inkscape:window-x="0"
inkscape:window-y="0"
inkscape:snap-global="true">
<inkscape:grid
type="xygrid"
id="grid2383"
visible="true"
enabled="true" />
</sodipodi:namedview>
<metadata
id="metadata7">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Layer 1"
inkscape:groupmode="layer"
id="layer1">
<rect
style="opacity:0.5;fill:#aaeeff;fill-opacity:1;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
id="rect2385"
width="339.37469"
height="310.25867"
x="700.35718"
y="493.7818"
rx="9.689147"
ry="12.085826"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575" />
<text
xml:space="preserve"
style="font-size:18px;font-style:normal;font-weight:normal;fill:#000000;fill-opacity:1;stroke:none;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;font-family:Bitstream Vera Sans"
x="710"
y="514.09448"
id="text2391"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575"><tspan
sodipodi:role="line"
id="tspan2393"
x="710"
y="514.09448">GESTimeline</tspan></text>
<flowRoot
xml:space="preserve"
id="flowRoot3165"
style="font-size:18px;font-style:normal;font-weight:normal;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Bitstream Vera Sans"><flowRegion
id="flowRegion3167"><rect
id="rect3169"
width="288"
height="208"
x="50"
y="129.36218" /></flowRegion><flowPara
id="flowPara3171" /></flowRoot> <g
id="g5556"
transform="translate(690,9.5885)"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575"
style="stroke-width:2;stroke-miterlimit:4;stroke-dasharray:none">
<rect
y="532.77368"
x="29.296322"
height="49.58847"
width="300.70367"
id="rect3179"
style="opacity:1;fill:#aa87de;fill-opacity:1;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1" />
<text
id="text3181"
y="564.86188"
x="139.24094"
style="font-size:28px;font-style:normal;font-weight:normal;fill:#000000;fill-opacity:1;stroke:none;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;font-family:Bitstream Vera Sans"
xml:space="preserve"><tspan
y="564.86188"
x="139.24094"
id="tspan3183"
sodipodi:role="line">Layer</tspan></text>
</g>
<rect
style="fill:#8dd35f;fill-opacity:1;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
id="rect3185"
width="299.99881"
height="49.589645"
x="719.99939"
y="652.36157"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575" />
<text
xml:space="preserve"
style="font-size:28px;font-style:normal;font-weight:normal;fill:#000000;fill-opacity:1;stroke:none;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;font-family:Bitstream Vera Sans"
x="833.7522"
y="687.59491"
id="text3187"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575"><tspan
sodipodi:role="line"
id="tspan3189"
x="833.7522"
y="687.59491">Track</tspan></text>
<path
style="fill:none;fill-rule:evenodd;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;marker-end:url(#Arrow2Lend);stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
d="M 1019.8707,676.85631 L 1079.4001,676.85631"
id="path3201"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575" />
<rect
style="fill:#8dd35f;fill-opacity:1;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
id="rect3195"
width="299.99881"
height="49.589645"
x="719.99939"
y="732.36157"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575" />
<text
xml:space="preserve"
style="font-size:28px;font-style:normal;font-weight:normal;fill:#000000;fill-opacity:1;stroke:none;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;font-family:Bitstream Vera Sans"
x="833.7522"
y="767.59491"
id="text3197"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575"><tspan
sodipodi:role="line"
id="tspan3199"
x="833.7522"
y="767.59491">Track</tspan></text>
<text
xml:space="preserve"
style="font-size:18px;font-style:normal;font-weight:normal;text-align:center;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Bitstream Vera Sans"
x="-566.94543"
y="659.29633"
id="text5573"
transform="matrix(0,-1,1,0,0,0)"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575"><tspan
sodipodi:role="line"
id="tspan5575"
x="-566.94543"
y="659.29633">User</tspan><tspan
sodipodi:role="line"
x="-566.94543"
y="681.79633"
id="tspan5577">visible</tspan></text>
<text
xml:space="preserve"
style="font-size:18px;font-style:normal;font-weight:normal;text-align:center;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Bitstream Vera Sans"
x="-710.63367"
y="670"
id="text5579"
transform="matrix(0,-1,1,0,0,0)"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575"><tspan
sodipodi:role="line"
x="-710.63367"
y="670"
id="tspan5583">Medias</tspan></text>
<path
style="fill:none;fill-rule:evenodd;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;marker-end:url(#Arrow2Lend);stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
d="M 1019.6067,757.52077 L 1079.1361,757.52077"
id="path5615"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575" />
<path
style="opacity:0.5;fill:none;fill-rule:evenodd;stroke:#000000;stroke-width:2.91023302;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:8.73069881, 2.91023294;stroke-dashoffset:0;stroke-opacity:1"
d="M 700,619.09448 L 1039.0223,619.09448"
id="path5619"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575" />
<rect
style="fill:#aa87de;fill-opacity:1;stroke:#000000;stroke-width:1.48319197;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
id="rect2507"
width="489.8678"
height="119.87508"
x="40.067669"
y="83.888557" />
<rect
style="fill:#8dd35f;fill-opacity:1;stroke:#000000;stroke-width:1.35438466;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
id="rect3304"
width="490.13684"
height="99.903282"
x="39.870644"
y="434.19138" />
<rect
style="fill:#8dd35f;fill-opacity:1;stroke:#000000;stroke-width:1.65954936;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
id="rect3306"
width="489.94894"
height="150.05229"
x="40.026497"
y="254.01666" />
<rect
style="fill:#00c4ff;fill-opacity:1;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
id="rect3308"
width="160"
height="80"
x="60.08099"
y="113.9472"
rx="10" />
<path
style="fill:none;fill-rule:evenodd;stroke:#000000;stroke-width:2.11796093;stroke-linecap:butt;stroke-linejoin:miter;marker-start:none;marker-end:url(#Arrow2Lend);stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
d="M 40.384902,60.094197 L 527.99322,60.094197"
id="path3312" />
<text
xml:space="preserve"
style="font-size:40px;font-style:normal;font-weight:normal;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Bitstream Vera Sans"
x="40"
y="55.094482"
id="text4368"><tspan
sodipodi:role="line"
id="tspan4370"
x="40"
y="55.094482">Time</tspan></text>
<path
style="fill:none;fill-rule:evenodd;stroke:#000000;stroke-width:1.97574663;stroke-linecap:butt;stroke-linejoin:miter;marker-end:url(#Arrow2Lend);stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
d="M 529.9953,484.17782 L 588.00008,484.17782"
id="path4372"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575" />
<rect
style="fill:#00c4ff;fill-opacity:1;stroke:#000000;stroke-width:1.8037442;stroke-linecap:butt;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
id="rect4374"
width="129.82126"
height="80.196259"
x="220.0939"
y="113.84907"
rx="10" />
<rect
style="fill:#00c4ff;fill-opacity:1;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
id="rect4376"
width="160"
height="80"
x="349.95801"
y="113.99493"
rx="10" />
<rect
style="fill:#aaa800;fill-opacity:1;stroke:#000000;stroke-width:1.73008239;stroke-linecap:butt;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
id="rect4378"
width="160.25104"
height="59.76992"
x="59.955471"
y="332.76434"
rx="10" />
<rect
style="fill:#aaa800;fill-opacity:1;stroke:#000000;stroke-width:1.55953789;stroke-linecap:butt;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
id="rect4380"
width="129.84395"
height="59.940464"
x="220.08257"
y="332.67908"
rx="10" />
<rect
style="fill:#aaa800;fill-opacity:1;stroke:#000000;stroke-width:1.55953789;stroke-linecap:butt;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
id="rect4382"
width="129.84395"
height="59.940464"
x="220.08257"
y="272.64929"
rx="10" />
<rect
style="fill:#aaa800;fill-opacity:1;stroke:#000000;stroke-width:1.72976816;stroke-linecap:butt;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
id="rect4384"
width="160.19199"
height="59.770233"
x="349.862"
y="332.81192"
rx="10" />
<rect
style="fill:#aaa800;fill-opacity:1;stroke:#000000;stroke-width:1.73008239;stroke-linecap:butt;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
id="rect4386"
width="160.25104"
height="59.76992"
x="59.955471"
y="465.89206"
rx="10" />
<rect
style="fill:#aaa800;fill-opacity:1;stroke:#000000;stroke-width:1.55953789;stroke-linecap:butt;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
id="rect4388"
width="129.84395"
height="59.940464"
x="220.08257"
y="465.80679"
rx="10" />
<text
xml:space="preserve"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:125%;writing-mode:lr-tb;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Georgia;-inkscape-font-specification:Georgia Italic"
x="140.53998"
y="159.21556"
id="text4392"
sodipodi:linespacing="125%"><tspan
sodipodi:role="line"
id="tspan4394"
x="140.53998"
y="159.21556"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:125%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic">FileSource A</tspan></text>
<text
xml:space="preserve"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:125%;writing-mode:lr-tb;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Georgia;-inkscape-font-specification:Georgia Italic"
x="429.58691"
y="145.30821"
id="text4396"
sodipodi:linespacing="125%"><tspan
sodipodi:role="line"
id="tspan4398"
x="429.58691"
y="145.30821"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:125%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic">FileSource C</tspan><tspan
sodipodi:role="line"
x="429.58691"
y="170.30821"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:125%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic"
id="tspan4410">(muted)</tspan></text>
<text
xml:space="preserve"
style="font-size:40px;font-style:normal;font-weight:normal;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Bitstream Vera Sans"
x="283"
y="158.09448"
id="text4400"><tspan
sodipodi:role="line"
id="tspan4402"></tspan></text>
<text
xml:space="preserve"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:125%;writing-mode:lr-tb;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Georgia;-inkscape-font-specification:Georgia Italic"
x="285.38052"
y="144.69408"
id="text4404"
sodipodi:linespacing="125%"><tspan
sodipodi:role="line"
id="tspan4406"
x="285.38052"
y="144.69408"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:125%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic">FileSource B</tspan><tspan
sodipodi:role="line"
x="285.38052"
y="169.69408"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:125%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic"
id="tspan4408">(+fx)</tspan></text>
<text
xml:space="preserve"
style="font-size:40px;font-style:normal;font-weight:normal;line-height:125%;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Bitstream Vera Sans"
x="42.373047"
y="274.09448"
id="text4412"
sodipodi:linespacing="125%"><tspan
sodipodi:role="line"
id="tspan4414"
x="42.373047"
y="274.09448"
style="font-size:20px;font-style:oblique;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:start;line-height:125%;writing-mode:lr-tb;text-anchor:start;font-family:Bitstream Vera Sans;-inkscape-font-specification:Bitstream Vera Sans Oblique">Video Track</tspan></text>
<text
xml:space="preserve"
style="font-size:40px;font-style:normal;font-weight:normal;line-height:125%;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Bitstream Vera Sans"
x="45"
y="454.09448"
id="text4416"
sodipodi:linespacing="125%"><tspan
sodipodi:role="line"
id="tspan4418"
x="45"
y="454.09448"
style="font-size:20px;font-style:oblique;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:start;line-height:125%;writing-mode:lr-tb;text-anchor:start;font-family:Bitstream Vera Sans;-inkscape-font-specification:Bitstream Vera Sans Oblique">Audio Track</tspan></text>
<text
xml:space="preserve"
style="font-size:40px;font-style:normal;font-weight:normal;line-height:125%;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Bitstream Vera Sans"
x="43.398438"
y="102.42651"
id="text4420"
sodipodi:linespacing="125%"><tspan
sodipodi:role="line"
id="tspan4422"
x="43.398438"
y="102.42651"
style="font-size:20px;font-style:oblique;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:start;line-height:125%;writing-mode:lr-tb;text-anchor:start;font-family:Bitstream Vera Sans;-inkscape-font-specification:Bitstream Vera Sans Oblique">Layer</tspan></text>
<text
xml:space="preserve"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:bold;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Georgia;-inkscape-font-specification:Georgia Bold Italic"
x="139.73431"
y="370.0614"
id="text4424"
sodipodi:linespacing="120%"><tspan
sodipodi:role="line"
id="tspan4426"
x="139.73431"
y="370.0614"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic">Source A</tspan></text>
<text
xml:space="preserve"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:bold;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Georgia;-inkscape-font-specification:Georgia Bold Italic"
x="139.73431"
y="503.18912"
id="text4428"
sodipodi:linespacing="120%"><tspan
sodipodi:role="line"
id="tspan4430"
x="139.73431"
y="503.18912"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic">Source A</tspan></text>
<text
xml:space="preserve"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:bold;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Georgia;-inkscape-font-specification:Georgia Bold Italic"
x="284.57486"
y="358.20789"
id="text4432"
sodipodi:linespacing="120%"><tspan
sodipodi:role="line"
id="tspan4434"
x="284.57486"
y="358.20789"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic">Source</tspan><tspan
sodipodi:role="line"
x="284.57486"
y="382.20789"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic"
id="tspan4450">B</tspan></text>
<text
xml:space="preserve"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:bold;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Georgia;-inkscape-font-specification:Georgia Bold Italic"
x="286.56216"
y="302.49377"
id="text4436"
sodipodi:linespacing="120%"><tspan
sodipodi:role="line"
id="tspan4438"
x="286.56216"
y="302.49377"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic">BaseEffect</tspan></text>
<text
xml:space="preserve"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:bold;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Georgia;-inkscape-font-specification:Georgia Bold Italic"
x="428.78125"
y="370.09448"
id="text4440"
sodipodi:linespacing="120%"><tspan
sodipodi:role="line"
id="tspan4442"
x="428.78125"
y="370.09448"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic">Source C</tspan></text>
<text
xml:space="preserve"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:bold;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;font-family:Georgia;-inkscape-font-specification:Georgia Bold Italic"
x="284.57486"
y="491.3356"
id="text4444"
sodipodi:linespacing="120%"><tspan
sodipodi:role="line"
id="tspan4446"
x="284.57486"
y="491.3356"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic">Source</tspan><tspan
sodipodi:role="line"
x="284.57486"
y="515.33563"
style="font-size:20px;font-style:italic;font-variant:normal;font-weight:normal;font-stretch:normal;text-align:center;line-height:120.00000477%;writing-mode:lr-tb;text-anchor:middle;font-family:Georgia;-inkscape-font-specification:Georgia Italic"
id="tspan4448">B</tspan></text>
<path
style="fill:none;fill-rule:evenodd;stroke:#000000;stroke-width:1.97574663;stroke-linecap:butt;stroke-linejoin:miter;marker-end:url(#Arrow2Lend);stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
d="M 529.9953,329.07761 L 588.00008,329.07761"
id="path4456"
inkscape:export-filename="/home/bilboed/work/devel/gst-editing-services/docs/libs/layer_track_overview.png"
inkscape:export-xdpi="149.87575"
inkscape:export-ydpi="149.87575" />
</g>
</svg>


69
examples/c/assets.c Normal file
View file

@ -0,0 +1,69 @@
/* GStreamer Editing Services
* Copyright (C) 2012 Volodymyr Rudyi <vladimir.rudoy@gmail.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 59 Temple Place - Suite 330,
* Boston, MA 02111-1307, USA.
*/
#include <ges/ges.h>
#include <ges/ges-uri-asset.h>
#include <gst/pbutils/encoding-profile.h>
#include <gst/pbutils/gstdiscoverer.h>
static void
asset_loaded_cb (GObject * source, GAsyncResult * res, GMainLoop * mainloop)
{
GESUriClipAsset *mfs =
GES_URI_CLIP_ASSET (ges_asset_request_finish (res, NULL));
GstDiscovererInfo *discoverer_info = NULL;
discoverer_info = ges_uri_clip_asset_get_info (mfs);
GST_DEBUG ("Result is %d", gst_discoverer_info_get_result (discoverer_info));
GST_DEBUG ("Info type is %s", G_OBJECT_TYPE_NAME (mfs));
GST_DEBUG ("Duration is %" GST_TIME_FORMAT,
GST_TIME_ARGS (ges_uri_clip_asset_get_duration (mfs)));
gst_object_unref (mfs);
g_main_loop_quit (mainloop);
}
int
main (int argc, gchar ** argv)
{
GMainLoop *mainloop;
if (argc != 2) {
gst_printerr ("Usage: %s <uri>\n", argv[0]);
return 1;
}
/* Initialize GStreamer (this will parse environment variables and commandline
* arguments). */
gst_init (NULL, NULL);
/* Initialize the GStreamer Editing Services */
ges_init ();
/* ... and we start a GMainLoop. GES **REQUIRES** a GMainLoop to be running in
* order to function properly ! */
mainloop = g_main_loop_new (NULL, FALSE);
ges_asset_request_async (GES_TYPE_URI_CLIP, argv[1], NULL,
(GAsyncReadyCallback) asset_loaded_cb, mainloop);
g_main_loop_run (mainloop);
g_main_loop_unref (mainloop);
return 0;
}

193
examples/c/concatenate.c Normal file
View file

@ -0,0 +1,193 @@
/* GStreamer Editing Services
* Copyright (C) 2010 Edward Hervey <bilboed@bilboed.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#include <gio/gio.h>
#include <ges/ges.h>
#include <gst/pbutils/gstdiscoverer.h>
#include <gst/pbutils/encoding-profile.h>
static void
bus_message_cb (GstBus * bus, GstMessage * message, GMainLoop * mainloop);
static GstEncodingProfile *make_profile_from_info (GstDiscovererInfo * info);
GESLayer *layer = NULL;
GESPipeline *pipeline = NULL;
GESTimeline *timeline = NULL;
gchar *output_uri = NULL;
guint assetsCount = 0;
guint assetsLoaded = 0;
static void
asset_loaded_cb (GObject * source_object, GAsyncResult * res,
GMainLoop * mainloop)
{
GError *error = NULL;
guint64 duration = 0;
GESUriClipAsset *mfs =
GES_URI_CLIP_ASSET (ges_asset_request_finish (res, &error));
if (error) {
GST_WARNING ("error creating asset %s", error->message);
return;
}
duration = ges_uri_clip_asset_get_duration (mfs);
ges_layer_add_asset (layer,
GES_ASSET (source_object),
ges_timeline_get_duration (timeline),
0, duration, ges_clip_asset_get_supported_formats (GES_CLIP_ASSET (mfs)));
assetsLoaded++;
/*
* Check if we have loaded the last asset and trigger concatenating
*/
if (assetsLoaded == assetsCount) {
GstDiscovererInfo *info = ges_uri_clip_asset_get_info (mfs);
GstEncodingProfile *profile = make_profile_from_info (info);
ges_pipeline_set_render_settings (pipeline, output_uri, profile);
/* We want the pipeline to render (without any preview) */
if (!ges_pipeline_set_mode (pipeline, GES_PIPELINE_MODE_SMART_RENDER)) {
g_main_loop_quit (mainloop);
return;
}
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
}
gst_object_unref (mfs);
}
int
main (int argc, char **argv)
{
GMainLoop *mainloop = NULL;
GstBus *bus = NULL;
guint i;
if (argc < 3) {
gst_print ("Usage: %s <output uri> <list of files>\n", argv[0]);
return -1;
}
gst_init (&argc, &argv);
ges_init ();
timeline = ges_timeline_new_audio_video ();
layer = (GESLayer *) ges_layer_new ();
if (!ges_timeline_add_layer (timeline, layer))
return -1;
output_uri = argv[1];
assetsCount = argc - 2;
/* Create the mainloop before requesting the assets, so that the
* asset_loaded_cb callbacks receive a valid GMainLoop pointer */
mainloop = g_main_loop_new (NULL, FALSE);
for (i = 2; i < argc; i++) {
ges_asset_request_async (GES_TYPE_URI_CLIP, argv[i],
NULL, (GAsyncReadyCallback) asset_loaded_cb, mainloop);
}
/* In order to view our timeline, let's grab a convenience pipeline to put
* our timeline in. */
pipeline = ges_pipeline_new ();
/* Add the timeline to that pipeline */
if (!ges_pipeline_set_timeline (pipeline, timeline))
return -1;
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
gst_bus_add_signal_watch (bus);
g_signal_connect (bus, "message", G_CALLBACK (bus_message_cb), mainloop);
g_main_loop_run (mainloop);
return 0;
}
static void
bus_message_cb (GstBus * bus, GstMessage * message, GMainLoop * mainloop)
{
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_ERROR:
gst_print ("ERROR\n");
g_main_loop_quit (mainloop);
break;
case GST_MESSAGE_EOS:
gst_print ("Done\n");
g_main_loop_quit (mainloop);
break;
default:
break;
}
}
static GstEncodingProfile *
make_profile_from_info (GstDiscovererInfo * info)
{
GstEncodingContainerProfile *profile = NULL;
GstDiscovererStreamInfo *sinfo = gst_discoverer_info_get_stream_info (info);
/* Get the container format */
if (GST_IS_DISCOVERER_CONTAINER_INFO (sinfo)) {
GList *tmp, *substreams;
profile = gst_encoding_container_profile_new ((gchar *) "concatenate", NULL,
gst_discoverer_stream_info_get_caps (sinfo), NULL);
substreams =
gst_discoverer_container_info_get_streams ((GstDiscovererContainerInfo
*) sinfo);
/* For each of the streams, add a stream profile */
for (tmp = substreams; tmp; tmp = tmp->next) {
GstDiscovererStreamInfo *stream = GST_DISCOVERER_STREAM_INFO (tmp->data);
GstEncodingProfile *sprof = NULL;
if (GST_IS_DISCOVERER_VIDEO_INFO (stream)) {
sprof = (GstEncodingProfile *)
gst_encoding_video_profile_new (gst_discoverer_stream_info_get_caps
(stream), NULL, NULL, 1);
} else if (GST_IS_DISCOVERER_AUDIO_INFO (stream)) {
sprof = (GstEncodingProfile *)
gst_encoding_audio_profile_new (gst_discoverer_stream_info_get_caps
(stream), NULL, NULL, 1);
} else {
GST_WARNING ("Unsupported streams");
}
if (sprof)
gst_encoding_container_profile_add_profile (profile, sprof);
}
if (substreams)
gst_discoverer_stream_info_list_free (substreams);
} else {
GST_ERROR ("No container format !!!");
}
if (sinfo)
gst_discoverer_stream_info_unref (sinfo);
return GST_ENCODING_PROFILE (profile);
}

1703
examples/c/ges-ui.c Normal file

File diff suppressed because it is too large

1143
examples/c/ges-ui.glade Normal file

File diff suppressed because it is too large

124
examples/c/gessrc.c Normal file
View file

@ -0,0 +1,124 @@
/* GStreamer GES plugin
*
* Copyright (C) 2019 Igalia S.L
* Author: 2019 Thibault Saunier <tsaunier@igalia.com>
*
* gessrc.c
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#include <ges/ges.h>
static void
bus_message_cb (GstBus * bus, GstMessage * message, GMainLoop * mainloop)
{
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_ERROR:
gst_printerr ("Got error message on the bus\n");
g_main_loop_quit (mainloop);
break;
case GST_MESSAGE_EOS:
gst_print ("Done\n");
g_main_loop_quit (mainloop);
break;
default:
break;
}
}
static void
source_setup_cb (GstElement * playbin, GstElement * source,
GESTimeline * timeline)
{
g_object_set (source, "timeline", timeline, NULL);
}
int
main (int argc, char **argv)
{
GMainLoop *mainloop = NULL;
GstElement *pipeline = NULL;
GESTimeline *timeline;
GESLayer *layer = NULL;
GstBus *bus = NULL;
guint i, ret = 0;
gchar *uri = NULL;
GstClockTime start = 0;
if (argc < 2) {
gst_print ("Usage: %s <list of files>\n", argv[0]);
return -1;
}
gst_init (&argc, &argv);
ges_init ();
timeline = ges_timeline_new_audio_video ();
layer = (GESLayer *) ges_layer_new ();
if (!ges_timeline_add_layer (timeline, layer))
return -1;
/* Build the timeline */
for (i = 1; i < argc; i++) {
GESClip *clip;
uri = g_strdup (argv[i]);
if (!gst_uri_is_valid (uri)) {
g_free (uri);
uri = gst_filename_to_uri (argv[i], NULL);
}
clip = GES_CLIP (ges_uri_clip_new (uri));
if (!clip) {
gst_printerr ("Could not create clip for file: %s\n", argv[i]);
g_free (uri);
goto err;
}
g_object_set (clip, "start", start, NULL);
ges_layer_add_clip (layer, clip);
start += ges_timeline_element_get_duration (GES_TIMELINE_ELEMENT (clip));
g_free (uri);
}
/* Use a usual playbin pipeline */
pipeline = gst_element_factory_make ("playbin", NULL);
g_object_set (pipeline, "uri", "ges://", NULL);
g_signal_connect (pipeline, "source-setup", G_CALLBACK (source_setup_cb),
timeline);
mainloop = g_main_loop_new (NULL, FALSE);
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
gst_bus_add_signal_watch (bus);
g_signal_connect (bus, "message", G_CALLBACK (bus_message_cb), mainloop);
gst_element_set_state (pipeline, GST_STATE_PLAYING);
g_main_loop_run (mainloop);
gst_element_set_state (pipeline, GST_STATE_NULL);
done:
gst_clear_object (&pipeline);
if (mainloop)
g_main_loop_unref (mainloop);
return ret;
err:
ret = 1;
goto done;
}

30
examples/c/meson.build Normal file
View file

@ -0,0 +1,30 @@
examples = [
'concatenate',
'gessrc',
'simple1',
'test1',
'test2',
'test3',
'test4',
'transition',
'thumbnails',
'overlays',
'text_properties',
'assets',
'multifilesrc',
'play_timeline_with_one_clip'
]
# TODO Properly port to Gtk 3
#
# if gtk_dep.found()
# examples = examples + ['ges-ui']
# endif
foreach example_name : examples
exe = executable(example_name, '@0@.c'.format(example_name),
c_args : ges_c_args,
dependencies : libges_deps + [ges_dep],
)
endforeach

94
examples/c/multifilesrc.c Normal file
View file

@ -0,0 +1,94 @@
/* GStreamer Editing Services
* Copyright (C) 2013 Lubosz Sarnecki <lubosz@gmail.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#include <stdlib.h>
#include <ges/ges.h>
/* An image sequence test */
int
main (int argc, gchar ** argv)
{
GError *err = NULL;
GOptionContext *ctx;
GESPipeline *pipeline;
GESTimeline *timeline;
GESAsset *asset;
GESLayer *layer;
GMainLoop *mainloop;
GESTrack *track;
gint duration = 10;
gchar *filepattern = NULL;
GOptionEntry options[] = {
{"duration", 'd', 0, G_OPTION_ARG_INT, &duration,
"duration to use from the file (in seconds, default:10s)", "seconds"},
{"pattern-url", 'u', 0, G_OPTION_ARG_FILENAME, &filepattern,
"Pattern of the files, e.g. multifile:///foo/%04d.jpg",
"pattern-url"},
{NULL}
};
ctx = g_option_context_new ("- Plays an image sequence");
g_option_context_add_main_entries (ctx, options, NULL);
g_option_context_add_group (ctx, gst_init_get_option_group ());
if (!g_option_context_parse (ctx, &argc, &argv, &err)) {
gst_print ("Error initializing %s\n", err->message);
g_option_context_free (ctx);
g_clear_error (&err);
exit (1);
}
if (filepattern == NULL) {
gst_print ("%s", g_option_context_get_help (ctx, TRUE, NULL));
exit (0);
}
g_option_context_free (ctx);
ges_init ();
timeline = ges_timeline_new ();
track = GES_TRACK (ges_video_track_new ());
ges_timeline_add_track (timeline, track);
layer = ges_layer_new ();
if (!ges_timeline_add_layer (timeline, layer))
return -1;
asset = GES_ASSET (ges_uri_clip_asset_request_sync (filepattern, &err));
ges_layer_add_asset (layer, asset, 0, 0, duration * GST_SECOND,
GES_TRACK_TYPE_VIDEO);
pipeline = ges_pipeline_new ();
if (!ges_pipeline_set_timeline (pipeline, timeline))
return -1;
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
mainloop = g_main_loop_new (NULL, FALSE);
g_timeout_add_seconds (duration + 1, (GSourceFunc) g_main_loop_quit, mainloop);
g_main_loop_run (mainloop);
return 0;
}

178
examples/c/overlays.c Normal file
View file

@ -0,0 +1,178 @@
/* GStreamer Editing Services
* Copyright (C) 2010 Brandon Lewis <brandon@alum.berkeley.edu>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#include <stdlib.h>
#include <ges/ges.h>
typedef struct
{
int type;
char *name;
} transition_type;
GESClip *make_source (char *path, guint64 start, guint64 duration,
gint priority);
GESClip *make_overlay (char *text, guint64 start, guint64 duration,
gint priority, guint32 color, gdouble xpos, gdouble ypos);
GESPipeline *make_timeline (char *path, float duration, char *text,
guint32 color, gdouble xpos, gdouble ypos);
#define DEFAULT_DURATION 5
#define DEFAULT_POS 0.5
GESClip *
make_source (char *path, guint64 start, guint64 duration, gint priority)
{
gchar *uri = gst_filename_to_uri (path, NULL);
GESClip *ret = GES_CLIP (ges_uri_clip_new (uri));
g_object_set (ret,
"start", (guint64) start,
"duration", (guint64) duration,
"priority", (guint32) priority, "in-point", (guint64) 0, NULL);
g_free (uri);
return ret;
}
GESClip *
make_overlay (char *text, guint64 start, guint64 duration, gint priority,
guint32 color, gdouble xpos, gdouble ypos)
{
GESClip *ret = GES_CLIP (ges_text_overlay_clip_new ());
g_object_set (ret,
"text", (gchar *) text,
"start", (guint64) start,
"duration", (guint64) duration,
"priority", (guint32) priority,
"in-point", (guint64) 0,
"color", (guint32) color,
"valignment", (gint) GES_TEXT_VALIGN_POSITION,
"halignment", (gint) GES_TEXT_HALIGN_POSITION,
"xpos", (gdouble) xpos, "ypos", (gdouble) ypos, NULL);
return ret;
}
GESPipeline *
make_timeline (char *path, float duration, char *text, guint32 color,
gdouble xpos, gdouble ypos)
{
GESTimeline *timeline;
GESTrack *trackv, *tracka;
GESLayer *layer1;
GESClip *srca;
GESClip *overlay;
GESPipeline *pipeline;
guint64 aduration;
pipeline = ges_pipeline_new ();
ges_pipeline_set_mode (pipeline, GES_PIPELINE_MODE_PREVIEW_VIDEO);
timeline = ges_timeline_new ();
ges_pipeline_set_timeline (pipeline, timeline);
trackv = GES_TRACK (ges_video_track_new ());
ges_timeline_add_track (timeline, trackv);
tracka = GES_TRACK (ges_audio_track_new ());
ges_timeline_add_track (timeline, tracka);
layer1 = GES_LAYER (ges_layer_new ());
g_object_set (layer1, "priority", (gint32) 0, NULL);
if (!ges_timeline_add_layer (timeline, layer1))
exit (-1);
aduration = (guint64) (duration * GST_SECOND);
srca = make_source (path, 0, aduration, 1);
overlay = make_overlay (text, 0, aduration, 0, color, xpos, ypos);
ges_layer_add_clip (layer1, srca);
ges_layer_add_clip (layer1, overlay);
return pipeline;
}
int
main (int argc, char **argv)
{
GError *err = NULL;
GOptionContext *ctx;
GESPipeline *pipeline;
GMainLoop *mainloop;
gdouble duration = DEFAULT_DURATION;
char *path = NULL, *text = NULL;
guint64 color = 0;
gdouble xpos = DEFAULT_POS, ypos = DEFAULT_POS;
GOptionEntry options[] = {
{"duration", 'd', 0, G_OPTION_ARG_DOUBLE, &duration,
"duration of segment", "seconds"},
{"path", 'p', 0, G_OPTION_ARG_STRING, &path,
"path to file", "path"},
{"text", 't', 0, G_OPTION_ARG_STRING, &text,
"text to render", "text"},
{"color", 'c', 0, G_OPTION_ARG_INT64, &color,
"color of the text", "color"},
{"xpos", 'x', 0, G_OPTION_ARG_DOUBLE, &xpos,
"horizontal position of the text", "position"},
{"ypos", 'y', 0, G_OPTION_ARG_DOUBLE, &ypos,
"vertical position of the text", "position"},
{NULL}
};
ctx = g_option_context_new ("- file segment playback with text overlay");
g_option_context_add_main_entries (ctx, options, NULL);
g_option_context_add_group (ctx, gst_init_get_option_group ());
if (!g_option_context_parse (ctx, &argc, &argv, &err)) {
gst_print ("Error initializing %s\n", err->message);
g_option_context_free (ctx);
g_clear_error (&err);
exit (1);
}
if (argc > 1) {
gst_print ("%s", g_option_context_get_help (ctx, TRUE, NULL));
exit (0);
}
g_option_context_free (ctx);
ges_init ();
if (path == NULL)
g_error ("Must specify --path=/path/to/media/file option\n");
pipeline = make_timeline (path, duration, text, color, xpos, ypos);
mainloop = g_main_loop_new (NULL, FALSE);
g_timeout_add_seconds ((guint) duration + 1, (GSourceFunc) g_main_loop_quit,
mainloop);
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
g_main_loop_run (mainloop);
return 0;
}

63
examples/c/play_timeline_with_one_clip.c Normal file
View file

@ -0,0 +1,63 @@
/* This example can be found in the GStreamer Editing Services git repository in:
* examples/c/play_timeline_with_one_clip.c
*/
#include <ges/ges.h>
int
main (int argc, char **argv)
{
GESLayer *layer;
GESTimeline *timeline;
if (argc == 1) {
gst_printerr ("Usage: play_timeline_with_one_clip file:///clip/uri\n");
return 1;
}
gst_init (NULL, NULL);
ges_init ();
timeline = ges_timeline_new_audio_video ();
layer = ges_timeline_append_layer (timeline);
{
/* Add a clip with a duration of 5 seconds */
GESClip *clip = GES_CLIP (ges_uri_clip_new (argv[1]));
if (clip == NULL) {
gst_printerr
("%s cannot be used, make sure it is a supported media file",
argv[1]);
return 1;
}
g_object_set (clip, "duration", 5 * GST_SECOND, "start", (guint64) 0, NULL);
ges_layer_add_clip (layer, clip);
}
/* Committing the timeline is always necessary for changes
* inside it to be taken into account by the Non Linear Engine */
ges_timeline_commit (timeline);
{
/* Play the timeline */
GESPipeline *pipeline = ges_pipeline_new ();
GstBus *bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
ges_pipeline_set_timeline (pipeline, timeline);
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
/* Simple way to just play the pipeline until EOS or an error pops on the bus */
gst_bus_timed_pop_filtered (bus, 10 * GST_SECOND,
GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_NULL);
gst_object_unref (bus);
gst_object_unref (pipeline);
}
return 0;
}

99
examples/c/simple1.c Normal file
View file

@ -0,0 +1,99 @@
/* GStreamer Editing Services
* Copyright (C) 2010 Edward Hervey <bilboed@bilboed.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#include <stdlib.h>
#include <ges/ges.h>
int
main (int argc, gchar ** argv)
{
GError *err = NULL;
GOptionContext *ctx;
GESPipeline *pipeline;
GESTimeline *timeline;
GESLayer *layer1;
GESUriClip *src;
gchar *uri;
GMainLoop *mainloop;
gint inpoint = 0, duration = 10;
gboolean mute = FALSE;
gchar *audiofile = NULL;
GOptionEntry options[] = {
{"inpoint", 'i', 0, G_OPTION_ARG_INT, &inpoint,
"in-point in the file (in seconds, default:0s)", "seconds"},
{"duration", 'd', 0, G_OPTION_ARG_INT, &duration,
"duration to use from the file (in seconds, default:10s)", "seconds"},
{"mute", 'm', 0, G_OPTION_ARG_NONE, &mute,
"Whether to mute the audio from the file",},
{"audiofile", 'a', 0, G_OPTION_ARG_FILENAME, &audiofile,
"Use this audiofile instead of the original audio from the file",
"audiofile"},
{NULL}
};
ctx =
g_option_context_new
("- Plays a video file with sound (original/muted/replaced)");
g_option_context_add_main_entries (ctx, options, NULL);
g_option_context_add_group (ctx, gst_init_get_option_group ());
if (!g_option_context_parse (ctx, &argc, &argv, &err)) {
gst_print ("Error initializing %s\n", err->message);
g_option_context_free (ctx);
g_clear_error (&err);
exit (1);
}
if (argc == 1) {
gst_print ("%s", g_option_context_get_help (ctx, TRUE, NULL));
exit (0);
}
g_option_context_free (ctx);
ges_init ();
/* Create an Audio/Video pipeline with two layers */
pipeline = ges_pipeline_new ();
timeline = ges_timeline_new_audio_video ();
layer1 = ges_timeline_append_layer (timeline);
if (!ges_pipeline_set_timeline (pipeline, timeline)) {
g_error ("Could not set timeline to pipeline");
return -1;
}
uri = gst_filename_to_uri (argv[1], NULL);
/* Add the main audio/video file */
src = ges_uri_clip_new (uri);
ges_layer_add_clip (layer1, GES_CLIP (src));
g_free (uri);
g_object_set (src, "start", (guint64) 0, "in-point", inpoint * GST_SECOND,
"duration", duration * GST_SECOND, "mute", mute, NULL);
/* Play the pipeline */
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
mainloop = g_main_loop_new (NULL, FALSE);
g_timeout_add_seconds (duration + 1, (GSourceFunc) g_main_loop_quit,
mainloop);
g_main_loop_run (mainloop);
return 0;
}

88
examples/c/test1.c Normal file
View file

@ -0,0 +1,88 @@
/* GStreamer Editing Services
* Copyright (C) 2009 Edward Hervey <bilboed@bilboed.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#include <ges/ges.h>
/* A simple timeline with 3 audio/video sources */
int
main (int argc, gchar ** argv)
{
GESAsset *src_asset;
GESPipeline *pipeline;
GESTimeline *timeline;
GESClip *source;
GESLayer *layer;
GMainLoop *mainloop;
/* Initialize GStreamer (this will parse environment variables and command-line
* arguments). */
gst_init (&argc, &argv);
/* Initialize the GStreamer Editing Services */
ges_init ();
/* Setup of a A/V timeline */
/* This is our main GESTimeline */
timeline = ges_timeline_new_audio_video ();
/* We are only going to be doing one layer of clips */
layer = ges_layer_new ();
/* Add the tracks and the layer to the timeline */
if (!ges_timeline_add_layer (timeline, layer))
return -1;
/* We create a simple asset able to extract GESTestClip */
src_asset = ges_asset_request (GES_TYPE_TEST_CLIP, NULL, NULL);
/* Add sources to our layer */
ges_layer_add_asset (layer, src_asset, 0, 0, GST_SECOND,
GES_TRACK_TYPE_UNKNOWN);
source = ges_layer_add_asset (layer, src_asset, GST_SECOND, 0,
GST_SECOND, GES_TRACK_TYPE_UNKNOWN);
g_object_set (source, "freq", 480.0, "vpattern", 2, NULL);
ges_layer_add_asset (layer, src_asset, 2 * GST_SECOND, 0,
GST_SECOND, GES_TRACK_TYPE_UNKNOWN);
/* In order to view our timeline, let's grab a convenience pipeline to put
* our timeline in. */
pipeline = ges_pipeline_new ();
/* Add the timeline to that pipeline */
if (!ges_pipeline_set_timeline (pipeline, timeline))
return -1;
/* The following is standard usage of a GStreamer pipeline (note how you haven't
* had to care about GStreamer so far?).
*
* We set the pipeline to playing ... */
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
/* ... and we start a GMainLoop. GES **REQUIRES** a GMainLoop to be running in
* order to function properly! */
mainloop = g_main_loop_new (NULL, FALSE);
/* Simple code to have the mainloop shutdown after 4s */
g_timeout_add_seconds (4, (GSourceFunc) g_main_loop_quit, mainloop);
g_main_loop_run (mainloop);
return 0;
}
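An editorial note on the layout above: the three one-second test clips are placed back-to-back at 0, 1 and 2 seconds, so the resulting timeline lasts three seconds. A minimal sketch of that arithmetic (Python for illustration; `GST_SECOND` is GStreamer's nanoseconds-per-second constant):

```python
# GStreamer expresses time in nanoseconds; GST_SECOND is 10^9.
GST_SECOND = 1_000_000_000

def timeline_duration(clips):
    """Duration of a timeline: the furthest end point of any clip.
    Each clip is a (start, duration) pair in nanoseconds."""
    return max((start + duration for start, duration in clips), default=0)

# The three clips from the example above:
clips = [(0, GST_SECOND), (GST_SECOND, GST_SECOND), (2 * GST_SECOND, GST_SECOND)]
print(timeline_duration(clips) // GST_SECOND)  # → 3
```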

99
examples/c/test2.c Normal file

@ -0,0 +1,99 @@
/* GStreamer Editing Services
* Copyright (C) 2009 Edward Hervey <bilboed@bilboed.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#include <ges/ges.h>
int
main (int argc, gchar ** argv)
{
GESPipeline *pipeline;
GESTimeline *timeline;
GESTrack *tracka;
GESLayer *layer;
GMainLoop *mainloop;
GstClockTime offset = 0;
guint i;
if (argc < 2) {
gst_print ("Usage: %s <list of audio files>\n", argv[0]);
return -1;
}
/* Initialize GStreamer (this will parse environment variables and command-line
* arguments). */
gst_init (&argc, &argv);
/* Initialize the GStreamer Editing Services */
ges_init ();
/* Setup of an audio timeline */
/* This is our main GESTimeline */
timeline = ges_timeline_new ();
tracka = GES_TRACK (ges_audio_track_new ());
/* We are only going to be doing one layer of clips */
layer = ges_layer_new ();
/* Add the tracks and the layer to the timeline */
if (!ges_timeline_add_layer (timeline, layer))
return -1;
if (!ges_timeline_add_track (timeline, tracka))
return -1;
/* Here we've finished initializing our timeline, we're
* ready to start using it... by solely working with the layer ! */
for (i = 1; i < argc; i++, offset += GST_SECOND) {
gchar *uri = gst_filename_to_uri (argv[i], NULL);
GESUriClip *src = ges_uri_clip_new (uri);
g_assert (src);
g_free (uri);
g_object_set (src, "start", offset, "duration", GST_SECOND, NULL);
ges_layer_add_clip (layer, (GESClip *) src);
}
/* In order to listen to our timeline, let's grab a convenience pipeline to put
* our timeline in. */
pipeline = ges_pipeline_new ();
/* Add the timeline to that pipeline */
if (!ges_pipeline_set_timeline (pipeline, timeline))
return -1;
/* The following is standard usage of a GStreamer pipeline (note how you
* haven't had to care about GStreamer so far?).
*
* We set the pipeline to playing ... */
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
/* ... and we start a GMainLoop. GES **REQUIRES** a GMainLoop to be running in
* order to function properly! */
mainloop = g_main_loop_new (NULL, FALSE);
/* Simple code to have the mainloop shut down after one second per input file */
g_timeout_add_seconds (argc - 1, (GSourceFunc) g_main_loop_quit, mainloop);
g_main_loop_run (mainloop);
return 0;
}

98
examples/c/test3.c Normal file

@ -0,0 +1,98 @@
/* GStreamer Editing Services
* Copyright (C) 2009 Edward Hervey <bilboed@bilboed.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#include <ges/ges.h>
int
main (int argc, gchar ** argv)
{
GESPipeline *pipeline;
GESTimeline *timeline;
GESTrack *tracka;
GESLayer *layer;
GMainLoop *mainloop;
guint i;
if (argc < 2) {
gst_print ("Usage: %s <list of audio files>\n", argv[0]);
return -1;
}
/* Initialize GStreamer (this will parse environment variables and command-line
* arguments). */
gst_init (&argc, &argv);
/* Initialize the GStreamer Editing Services */
ges_init ();
/* Setup of an audio timeline */
/* This is our main GESTimeline */
timeline = ges_timeline_new ();
tracka = GES_TRACK (ges_audio_track_new ());
/* We are only going to be doing one layer of clips */
layer = ges_layer_new ();
/* Add the tracks and the layer to the timeline */
if (!ges_timeline_add_layer (timeline, layer))
return -1;
if (!ges_timeline_add_track (timeline, tracka))
return -1;
/* Here we've finished initializing our timeline, we're
* ready to start using it... by solely working with the layer ! */
for (i = 1; i < argc; i++) {
gchar *uri = gst_filename_to_uri (argv[i], NULL);
GESUriClip *src = ges_uri_clip_new (uri);
g_assert (src);
g_free (uri);
g_object_set (src, "start", ges_layer_get_duration (layer),
"duration", GST_SECOND, NULL);
ges_layer_add_clip (layer, (GESClip *) src);
}
/* In order to view our timeline, let's grab a convenience pipeline to put
* our timeline in. */
pipeline = ges_pipeline_new ();
/* Add the timeline to that pipeline */
if (!ges_pipeline_set_timeline (pipeline, timeline))
return -1;
/* The following is standard usage of a GStreamer pipeline (note how you haven't
* had to care about GStreamer so far?).
*
* We set the pipeline to playing ... */
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
/* ... and we start a GMainLoop. GES **REQUIRES** a GMainLoop to be running in
* order to function properly! */
mainloop = g_main_loop_new (NULL, FALSE);
/* Simple code to have the mainloop shut down after one second per input file */
g_timeout_add_seconds (argc - 1, (GSourceFunc) g_main_loop_quit, mainloop);
g_main_loop_run (mainloop);
return 0;
}
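Unlike test2.c, test3.c does not track an offset by hand: each new clip's start is set to the layer's current duration, so clips are appended at the end. A toy sketch of that bookkeeping (Python for illustration; the assumption that a layer's duration is the furthest clip end mirrors how the code above uses `ges_layer_get_duration ()`):

```python
GST_SECOND = 1_000_000_000

class Layer:
    """Toy model of a layer's duration bookkeeping (assumption: duration
    is the furthest clip end, as used by the append pattern above)."""
    def __init__(self):
        self.clips = []  # (start, duration) pairs in nanoseconds

    @property
    def duration(self):
        return max((s + d for s, d in self.clips), default=0)

    def append_clip(self, duration):
        # Mirrors test3.c: start = ges_layer_get_duration (layer)
        self.clips.append((self.duration, duration))

layer = Layer()
for _ in range(3):
    layer.append_clip(GST_SECOND)
print([s // GST_SECOND for s, _ in layer.clips])  # → [0, 1, 2]
```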

179
examples/c/test4.c Normal file

@ -0,0 +1,179 @@
/* GStreamer Editing Services
* Copyright (C) 2009 Edward Hervey <bilboed@bilboed.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#include <ges/ges.h>
#include <gst/pbutils/encoding-profile.h>
GstEncodingProfile *make_encoding_profile (gchar * audio, gchar * container);
/* This example will take a series of files and create an audio-only timeline
* containing the first second of each file and render it to the output uri
* using ogg/vorbis */
/* make_encoding_profile
* simple method creating an encoding profile. This is here in
* order not to clutter the main function. */
GstEncodingProfile *
make_encoding_profile (gchar * audio, gchar * container)
{
GstEncodingContainerProfile *profile;
GstEncodingProfile *stream;
GstCaps *caps;
caps = gst_caps_from_string (container);
profile =
gst_encoding_container_profile_new ((gchar *) "ges-test4", NULL, caps,
NULL);
gst_caps_unref (caps);
caps = gst_caps_from_string (audio);
stream = (GstEncodingProfile *)
gst_encoding_audio_profile_new (caps, NULL, NULL, 0);
gst_encoding_container_profile_add_profile (profile, stream);
gst_caps_unref (caps);
return (GstEncodingProfile *) profile;
}
int
main (int argc, gchar ** argv)
{
GESPipeline *pipeline;
GESTimeline *timeline;
GESTrack *tracka;
GESLayer *layer;
GMainLoop *mainloop;
GstEncodingProfile *profile;
gchar *container = (gchar *) "application/ogg";
gchar *audio = (gchar *) "audio/x-vorbis";
gchar *output_uri;
guint i;
GError *err = NULL;
GOptionEntry options[] = {
{"format", 'f', 0, G_OPTION_ARG_STRING, &container,
"Container format", "<GstCaps>"},
{"aformat", 'a', 0, G_OPTION_ARG_STRING, &audio,
"Audio format", "<GstCaps>"},
{NULL}
};
GOptionContext *ctx;
ctx = g_option_context_new ("- renders a sequence of audio files.");
g_option_context_add_main_entries (ctx, options, NULL);
g_option_context_add_group (ctx, gst_init_get_option_group ());
if (!g_option_context_parse (ctx, &argc, &argv, &err)) {
gst_printerr ("Error initializing: %s\n", err->message);
g_option_context_free (ctx);
g_clear_error (&err);
return -1;
}
g_option_context_free (ctx);
if (argc < 3) {
gst_print ("Usage: %s <output uri> <list of audio files>\n", argv[0]);
return -1;
}
/* Initialize GStreamer (this will parse environment variables and command-line
* arguments). */
gst_init (&argc, &argv);
/* Initialize the GStreamer Editing Services */
ges_init ();
/* Setup of an audio timeline */
/* This is our main GESTimeline */
timeline = ges_timeline_new ();
tracka = GES_TRACK (ges_audio_track_new ());
/* We are only going to be doing one layer of clips */
layer = ges_layer_new ();
/* Add the tracks and the layer to the timeline */
if (!ges_timeline_add_layer (timeline, layer))
return -1;
if (!ges_timeline_add_track (timeline, tracka))
return -1;
/* Here we've finished initializing our timeline, we're
* ready to start using it... by solely working with the layer ! */
for (i = 2; i < argc; i++) {
gchar *uri = gst_filename_to_uri (argv[i], NULL);
GESUriClip *src = ges_uri_clip_new (uri);
g_assert (src);
g_free (uri);
g_object_set (src, "start", ges_layer_get_duration (layer),
"duration", GST_SECOND, NULL);
/* Each new clip is appended at the current end of the layer by setting
* its start to ges_layer_get_duration () above */
ges_layer_add_clip (layer, (GESClip *) src);
}
/* In order to view our timeline, let's grab a convenience pipeline to put
* our timeline in. */
pipeline = ges_pipeline_new ();
/* Add the timeline to that pipeline */
if (!ges_pipeline_set_timeline (pipeline, timeline))
return -1;
/* RENDER SETTINGS ! */
/* We set our output URI and rendering setting on the pipeline */
if (gst_uri_is_valid (argv[1])) {
output_uri = g_strdup (argv[1]);
} else if (g_file_test (argv[1], G_FILE_TEST_EXISTS)) {
output_uri = gst_filename_to_uri (argv[1], NULL);
} else {
gst_printerr ("Unrecognised command line argument '%s'.\n"
"Please pass an URI or file as argument!\n", argv[1]);
return -1;
}
profile = make_encoding_profile (audio, container);
if (!ges_pipeline_set_render_settings (pipeline, output_uri, profile))
return -1;
/* We want the pipeline to render (without any preview) */
if (!ges_pipeline_set_mode (pipeline, GES_PIPELINE_MODE_SMART_RENDER))
return -1;
/* The following is standard usage of a GStreamer pipeline (note how you haven't
* had to care about GStreamer so far?).
*
* We set the pipeline to playing ... */
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
/* ... and we start a GMainLoop. GES **REQUIRES** a GMainLoop to be running in
* order to function properly! */
mainloop = g_main_loop_new (NULL, FALSE);
/* Simple code to have the mainloop shut down after one second per argument */
/* FIXME : We should wait for EOS ! */
g_timeout_add_seconds (argc - 1, (GSourceFunc) g_main_loop_quit, mainloop);
g_main_loop_run (mainloop);
return 0;
}
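The render-settings block above accepts either a URI or an existing file path as the output argument. A hedged sketch of that decision in Python (the scheme check only approximates `gst_uri_is_valid ()`, and the `exists` parameter is an injectable stand-in for `g_file_test ()`):

```python
import os
from urllib.parse import urlparse
from urllib.request import pathname2url

def resolve_output_uri(arg, exists=os.path.exists):
    """Mirror test4.c's output handling: pass URIs through, convert
    existing file paths to file: URIs, reject anything else."""
    if urlparse(arg).scheme:  # already looks like a URI
        return arg
    if exists(arg):  # plain path: convert, as gst_filename_to_uri () does
        return "file:" + pathname2url(os.path.abspath(arg))
    raise ValueError("not a URI or existing file: %s" % arg)

print(resolve_output_uri("file:///tmp/out.ogg"))  # → file:///tmp/out.ogg
```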


@ -0,0 +1,138 @@
/* GStreamer Editing Services
* Copyright (C) 2010 Brandon Lewis <brandon@alum.berkeley.edu>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#include <stdlib.h>
#include <ges/ges.h>
typedef struct
{
int type;
char *name;
} transition_type;
GESClip *make_source (char *path, guint64 start, guint64 duration,
gint priority, gchar * text);
GESPipeline *make_timeline (char *path, float duration, char *text);
GESClip *
make_source (char *path, guint64 start, guint64 duration, gint priority,
gchar * text)
{
gchar *uri = gst_filename_to_uri (path, NULL);
GESClip *ret = GES_CLIP (ges_uri_clip_new (uri));
g_object_set (ret,
"start", (guint64) start,
"duration", (guint64) duration,
"priority", (guint32) priority, "in-point", (guint64) 0,
"text", text, NULL);
g_free (uri);
return ret;
}
GESPipeline *
make_timeline (char *path, float duration, char *text)
{
GESTimeline *timeline;
GESTrack *trackv, *tracka;
GESLayer *layer1;
GESClip *srca;
GESPipeline *pipeline;
guint64 aduration;
pipeline = ges_pipeline_new ();
ges_pipeline_set_mode (pipeline, GES_PIPELINE_MODE_PREVIEW_VIDEO);
timeline = ges_timeline_new ();
ges_pipeline_set_timeline (pipeline, timeline);
trackv = GES_TRACK (ges_video_track_new ());
ges_timeline_add_track (timeline, trackv);
tracka = GES_TRACK (ges_audio_track_new ());
ges_timeline_add_track (timeline, tracka);
layer1 = GES_LAYER (ges_layer_new ());
g_object_set (layer1, "priority", (gint32) 0, NULL);
if (!ges_timeline_add_layer (timeline, layer1))
exit (-1);
aduration = (guint64) (duration * GST_SECOND);
srca = make_source (path, 0, aduration, 1, text);
ges_layer_add_clip (layer1, srca);
return pipeline;
}
int
main (int argc, char **argv)
{
GError *err = NULL;
GOptionContext *ctx;
GESPipeline *pipeline;
GMainLoop *mainloop;
gdouble duration = 1.0;
char *path = NULL, *text = NULL;
GOptionEntry options[] = {
{"duration", 'd', 0, G_OPTION_ARG_DOUBLE, &duration,
"duration of transition", "seconds"},
{"path", 'p', 0, G_OPTION_ARG_STRING, &path,
"path to file", "path"},
{"text", 't', 0, G_OPTION_ARG_STRING, &text,
"text to render", "text"},
{NULL}
};
GOptionContext *ctx;
ctx = g_option_context_new ("- plays a media file with overlaid text");
g_option_context_add_main_entries (ctx, options, NULL);
g_option_context_add_group (ctx, gst_init_get_option_group ());
if (!g_option_context_parse (ctx, &argc, &argv, &err)) {
gst_printerr ("Error initializing: %s\n", err->message);
g_option_context_free (ctx);
g_clear_error (&err);
exit (1);
}
if (argc > 1 || path == NULL) {
gst_print ("%s", g_option_context_get_help (ctx, TRUE, NULL));
exit (0);
}
g_option_context_free (ctx);
ges_init ();
pipeline = make_timeline (path, duration, text);
mainloop = g_main_loop_new (NULL, FALSE);
g_timeout_add_seconds ((duration) + 1, (GSourceFunc) g_main_loop_quit,
mainloop);
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
g_main_loop_run (mainloop);
return 0;
}

191
examples/c/thumbnails.c Normal file

@ -0,0 +1,191 @@
/* GStreamer Editing Services
* Copyright (C) 2010 Edward Hervey <bilboed@bilboed.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <glib.h>
#include <glib/gstdio.h>
#include <ges/ges.h>
#include <gst/pbutils/encoding-profile.h>
/* GLOBAL VARIABLE */
static guint repeat = 0;
GESPipeline *pipeline = NULL;
static gboolean thumbnail_cb (gpointer pipeline);
#define TEST_PATH "test_thumbnail.jpg"
static gboolean
thumbnail_cb (gpointer user)
{
GstSample *b = NULL;
GstCaps *caps;
GESPipeline *p;
p = GES_PIPELINE (user);
caps = gst_caps_from_string ("image/jpeg");
GST_INFO ("getting thumbnails");
/* check raw rgb use-case with scaling */
b = ges_pipeline_get_thumbnail_rgb24 (p, 320, 240);
g_assert (b);
gst_sample_unref (b);
/* check encoding use-case from caps */
b = NULL;
b = ges_pipeline_get_thumbnail (p, caps);
g_assert (b);
gst_sample_unref (b);
g_assert (ges_pipeline_save_thumbnail (p, -1, -1, (gchar *)
"image/jpeg", (gchar *) TEST_PATH, NULL));
g_assert (g_file_test (TEST_PATH, G_FILE_TEST_EXISTS));
g_unlink (TEST_PATH);
gst_caps_unref (caps);
return FALSE;
}
static GESPipeline *
create_timeline (void)
{
GESPipeline *pipeline;
GESLayer *layer;
GESTrack *tracka, *trackv;
GESTimeline *timeline;
GESClip *src;
timeline = ges_timeline_new ();
tracka = GES_TRACK (ges_audio_track_new ());
trackv = GES_TRACK (ges_video_track_new ());
layer = ges_layer_new ();
/* Add the tracks and the layer to the timeline */
if (!ges_timeline_add_layer (timeline, layer) ||
!ges_timeline_add_track (timeline, tracka) ||
!ges_timeline_add_track (timeline, trackv))
return NULL;
/* Add the main audio/video file */
src = GES_CLIP (ges_test_clip_new ());
g_object_set (src,
"vpattern", GES_VIDEO_TEST_PATTERN_SNOW,
"vpattern", GES_VIDEO_TEST_PATTERN_SNOW,
"start", (guint64) 0, "duration", 10 * GST_SECOND, NULL);
ges_layer_add_clip (layer, GES_CLIP (src));
pipeline = ges_pipeline_new ();
if (!ges_pipeline_set_timeline (pipeline, timeline))
return NULL;
return pipeline;
}
static void
bus_message_cb (GstBus * bus, GstMessage * message, GMainLoop * mainloop)
{
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_ERROR:
gst_print ("ERROR\n");
g_main_loop_quit (mainloop);
break;
case GST_MESSAGE_EOS:
if (repeat > 0) {
gst_print ("Looping again\n");
/* No need to change state before */
gst_element_seek_simple (GST_ELEMENT (pipeline), GST_FORMAT_TIME,
GST_SEEK_FLAG_FLUSH, 0);
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
repeat -= 1;
} else {
gst_print ("Done\n");
g_main_loop_quit (mainloop);
}
break;
default:
break;
}
}
int
main (int argc, gchar ** argv)
{
GError *err = NULL;
GOptionEntry options[] = {
{NULL}
};
GOptionContext *ctx;
GMainLoop *mainloop;
GstBus *bus;
ctx = g_option_context_new ("tests thumbnail support (produces no output)");
g_option_context_set_summary (ctx, "");
g_option_context_add_main_entries (ctx, options, NULL);
g_option_context_add_group (ctx, gst_init_get_option_group ());
if (!g_option_context_parse (ctx, &argc, &argv, &err)) {
gst_print ("Error initializing: %s\n", err->message);
g_option_context_free (ctx);
g_clear_error (&err);
exit (1);
}
g_option_context_free (ctx);
/* Initialize the GStreamer Editing Services */
ges_init ();
/* Create the pipeline */
pipeline = create_timeline ();
if (!pipeline)
exit (-1);
ges_pipeline_set_mode (pipeline, GES_PIPELINE_MODE_PREVIEW);
/* Play the pipeline */
mainloop = g_main_loop_new (NULL, FALSE);
gst_print ("thumbnailing every second\n");
g_timeout_add (1000, thumbnail_cb, pipeline);
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
gst_bus_add_signal_watch (bus);
g_signal_connect (bus, "message", G_CALLBACK (bus_message_cb), mainloop);
if (gst_element_set_state (GST_ELEMENT (pipeline),
GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
gst_print ("Failed to start the encoding\n");
return 1;
}
g_main_loop_run (mainloop);
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_NULL);
gst_object_unref (pipeline);
return 0;
}

207
examples/c/transition.c Normal file

@ -0,0 +1,207 @@
/* GStreamer Editing Services
* Copyright (C) 2010 Brandon Lewis <brandon@alum.berkeley.edu>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#include <stdlib.h>
#include <ges/ges.h>
#include <stdlib.h>
typedef struct
{
int type;
char *name;
} transition_type;
GESClip *make_source (gchar * path, guint64 start, guint64 duration,
guint64 inpoint, gint priority);
gboolean print_transition_data (GESClip * tr);
GESPipeline *make_timeline (gchar * nick, double tdur, gchar * patha,
gfloat adur, gdouble ainpoint, gchar * pathb, gfloat bdur,
gdouble binpoint);
GESClip *
make_source (gchar * path, guint64 start, guint64 duration, guint64 inpoint,
gint priority)
{
gchar *uri = gst_filename_to_uri (path, NULL);
GESClip *ret = GES_CLIP (ges_uri_clip_new (uri));
g_object_set (ret,
"start", (guint64) start,
"duration", (guint64) duration,
"priority", (guint32) priority, "in-point", (guint64) inpoint, NULL);
g_free (uri);
return ret;
}
gboolean
print_transition_data (GESClip * tr)
{
GESTrackElement *trackelement;
GstElement *nleobj;
guint64 start, duration;
gint priority;
char *name;
GList *trackelements;
if (!tr)
return FALSE;
if (!(trackelements = GES_CONTAINER_CHILDREN (tr)))
return FALSE;
if (!(trackelement = GES_TRACK_ELEMENT (trackelements->data)))
return FALSE;
if (!(nleobj = ges_track_element_get_nleobject (trackelement)))
return FALSE;
g_object_get (nleobj, "start", &start, "duration", &duration,
"priority", &priority, "name", &name, NULL);
gst_print ("nleobject for %s: %f %f %d\n", name,
((gfloat) start) / GST_SECOND,
((gfloat) duration) / GST_SECOND, priority);
return FALSE;
}
GESPipeline *
make_timeline (gchar * nick, gdouble tdur, gchar * patha, gfloat adur,
gdouble ainp, gchar * pathb, gfloat bdur, gdouble binp)
{
GESTimeline *timeline;
GESTrack *trackv, *tracka;
GESLayer *layer1;
GESClip *srca, *srcb;
GESPipeline *pipeline;
guint64 aduration, bduration, tduration, tstart, ainpoint, binpoint;
GESTransitionClip *tr = NULL;
pipeline = ges_pipeline_new ();
ges_pipeline_set_mode (pipeline, GES_PIPELINE_MODE_PREVIEW_VIDEO);
timeline = ges_timeline_new ();
ges_pipeline_set_timeline (pipeline, timeline);
trackv = GES_TRACK (ges_video_track_new ());
ges_timeline_add_track (timeline, trackv);
tracka = GES_TRACK (ges_audio_track_new ());
ges_timeline_add_track (timeline, tracka);
layer1 = GES_LAYER (ges_layer_new ());
g_object_set (layer1, "priority", (gint32) 0, NULL);
if (!ges_timeline_add_layer (timeline, layer1))
exit (-1);
aduration = (guint64) (adur * GST_SECOND);
bduration = (guint64) (bdur * GST_SECOND);
tduration = (guint64) (tdur * GST_SECOND);
ainpoint = (guint64) (ainp * GST_SECOND);
binpoint = (guint64) (binp * GST_SECOND);
tstart = aduration - tduration;
srca = make_source (patha, 0, aduration, ainpoint, 1);
srcb = make_source (pathb, tstart, bduration, binpoint, 2);
ges_layer_add_clip (layer1, srca);
ges_layer_add_clip (layer1, srcb);
g_timeout_add_seconds (1, (GSourceFunc) print_transition_data, srca);
g_timeout_add_seconds (1, (GSourceFunc) print_transition_data, srcb);
if (tduration != 0) {
gst_print ("creating transition at %" GST_TIME_FORMAT " of %f duration (%"
GST_TIME_FORMAT ")\n", GST_TIME_ARGS (tstart), tdur,
GST_TIME_ARGS (tduration));
if (!(tr = ges_transition_clip_new_for_nick (nick)))
g_error ("invalid transition type %s\n", nick);
g_object_set (tr,
"start", (guint64) tstart,
"duration", (guint64) tduration, "in-point", (guint64) 0, NULL);
ges_layer_add_clip (layer1, GES_CLIP (tr));
g_timeout_add_seconds (1, (GSourceFunc) print_transition_data, tr);
}
return pipeline;
}
int
main (int argc, char **argv)
{
GError *err = NULL;
GOptionContext *ctx;
GESPipeline *pipeline;
GMainLoop *mainloop;
gchar *type = (gchar *) "crossfade";
gchar *patha, *pathb;
gdouble adur, bdur, tdur = 1.0, ainpoint, binpoint;
GOptionEntry options[] = {
{"type", 't', 0, G_OPTION_ARG_STRING, &type,
"type of transition to create", "<smpte-transition>"},
{"duration", 'd', 0, G_OPTION_ARG_DOUBLE, &tdur,
"duration of transition", "seconds"},
{NULL}
};
ctx = g_option_context_new ("- transition between two media files");
g_option_context_set_summary (ctx,
"Select two files, and optionally a transition duration and type.\n"
"A file is a triplet of filename, inpoint (in seconds) and duration (in seconds).\n"
"Example:\n" "transition file1.avi 0 5 file2.avi 25 5 -d 2 -t crossfade");
g_option_context_add_main_entries (ctx, options, NULL);
g_option_context_add_group (ctx, gst_init_get_option_group ());
if (!g_option_context_parse (ctx, &argc, &argv, &err)) {
gst_printerr ("Error initializing: %s\n", err->message);
g_option_context_free (ctx);
g_clear_error (&err);
exit (1);
}
if (argc < 7) {
gst_print ("%s", g_option_context_get_help (ctx, TRUE, NULL));
exit (0);
}
g_option_context_free (ctx);
ges_init ();
patha = argv[1];
ainpoint = (gdouble) atof (argv[2]);
adur = (gdouble) atof (argv[3]);
pathb = argv[4];
binpoint = (gdouble) atof (argv[5]);
bdur = (gdouble) atof (argv[6]);
pipeline =
make_timeline (type, tdur, patha, adur, ainpoint, pathb, bdur, binpoint);
mainloop = g_main_loop_new (NULL, FALSE);
g_timeout_add_seconds ((adur + bdur) + 1, (GSourceFunc) g_main_loop_quit,
mainloop);
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
g_main_loop_run (mainloop);
return 0;
}
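The placement arithmetic in make_timeline () above is worth spelling out: clip B and the transition both start at `aduration - tduration`, so the two clips overlap for exactly the transition's duration. A small sketch of that computation (Python for illustration):

```python
GST_SECOND = 1_000_000_000

def place_transition(adur_s, bdur_s, tdur_s):
    """Reproduce make_timeline()'s placement arithmetic: seconds in,
    nanoseconds out. Clip A runs [0, aduration); clip B and the
    transition both start at aduration - tduration, so the overlap
    equals the transition duration."""
    aduration = int(adur_s * GST_SECOND)
    bduration = int(bdur_s * GST_SECOND)
    tduration = int(tdur_s * GST_SECOND)
    tstart = aduration - tduration
    total = tstart + bduration   # overall timeline length
    overlap = aduration - tstart  # == tduration
    return tstart, total, overlap

tstart, total, overlap = place_transition(5, 5, 2)
print(tstart // GST_SECOND, total // GST_SECOND, overlap // GST_SECOND)  # → 3 8 2
```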

1
examples/meson.build Normal file

@ -0,0 +1 @@
subdir('c')

50
examples/python/gst-player.py Executable file

@ -0,0 +1,50 @@
#!/usr/bin/python3
import sys
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GES', '1.0')
gi.require_version('GstPlayer', '1.0')
gi.require_version('GLib', '2.0')
from gi.repository import Gst, GES, GLib, GstPlayer
if __name__ == "__main__":
if len(sys.argv) < 2:
print("You must specify a file URI")
sys.exit(-1)
Gst.init(None)
GES.init()
timeline = GES.Timeline.new_audio_video()
layer = timeline.append_layer()
start = 0
for uri in sys.argv[1:]:
if not Gst.uri_is_valid(uri):
uri = Gst.filename_to_uri(uri)
clip = GES.UriClip.new(uri)
clip.props.start = start
layer.add_clip(clip)
start += clip.props.duration
player = GstPlayer.Player.new(None, GstPlayer.PlayerGMainContextSignalDispatcher.new(None))
player.set_uri("ges://")
player.get_pipeline().connect("source-setup",
lambda playbin, source: source.set_property("timeline", timeline))
loop = GLib.MainLoop()
player.connect("end-of-stream", lambda x: loop.quit())
def error(player, err):
loop.quit()
print("Got error: %s" % err)
sys.exit(1)
player.connect("error", error)
player.play()
loop.run()

87
examples/python/keyframes.py Executable file

@ -0,0 +1,87 @@
#!/usr/bin/env python3
#
# GStreamer
#
# Copyright (C) 2019 Thibault Saunier <tsaunier@igalia.com>
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Library General Public License for more details.
#
# You should have received a copy of the GNU Library General Public
# License along with this library; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Suite 500,
# Boston, MA 02110-1335, USA.
import gi
import sys
gi.require_version('Gst', '1.0')
gi.require_version('GES', '1.0')
gi.require_version('GstController', '1.0')
from gi.repository import Gst, GES, GLib, GstController # noqa
Gst.init(None)
GES.init()
def play_timeline(timeline):
pipeline = GES.Pipeline()
pipeline.set_timeline(timeline)
bus = pipeline.get_bus()
bus.add_signal_watch()
loop = GLib.MainLoop()
bus.connect("message", bus_message_cb, loop, pipeline)
pipeline.set_state(Gst.State.PLAYING)
loop.run()
def bus_message_cb(unused_bus, message, loop, pipeline):
if message.type == Gst.MessageType.EOS:
print("eos")
pipeline.set_state(Gst.State.NULL)
loop.quit()
elif message.type == Gst.MessageType.ERROR:
error = message.parse_error()
pipeline.set_state(Gst.State.NULL)
print("error %s" % error[1])
loop.quit()
if __name__ == "__main__":
if len(sys.argv) != 2:
print("You must specify a file path")
exit(-1)
timeline = GES.Timeline.new_audio_video()
layer = timeline.append_layer()
clip = GES.UriClip.new(Gst.filename_to_uri(sys.argv[1]))
# Adding clip to the layer so the TrackElements are created
layer.add_clip(clip)
# Create an InterpolationControlSource and make sure it interpolates linearly
control_source = GstController.InterpolationControlSource.new()
control_source.props.mode = GstController.InterpolationMode.LINEAR
# Set the keyframes
control_source.set(0, 0.0) # Fully transparent at 0 seconds
control_source.set(Gst.SECOND, 1.0) # Fully opaque at 1 second
# Get the video source
video_source = clip.find_track_element(None, GES.VideoSource)
assert(video_source)
# And set the control source on the "alpha" property of the video source
# Using a "direct" binding but "direct-absolute" would work the exact
# same way as the alpha property range is [0.0 - 1.0] anyway.
video_source.set_control_source(control_source, "alpha", "direct")
play_timeline(timeline)
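The two keyframes above define a linear ramp of the "alpha" property from 0.0 at t=0 to 1.0 at t=1s. A sketch of what linear interpolation evaluates between those keyframes (Python for illustration; clamping to the first/last value outside the keyframe range is our assumption about the control source's behavior, not something the example above demonstrates):

```python
GST_SECOND = 1_000_000_000

def linear_alpha(t, keyframes=((0, 0.0), (GST_SECOND, 1.0))):
    """Piecewise-linear interpolation between (time, value) keyframes,
    clamping to the first/last value outside their range (assumed)."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return keyframes[-1][1]

print(linear_alpha(GST_SECOND // 2))  # → 0.5
```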

65
examples/python/simple.py Executable file

@ -0,0 +1,65 @@
#!/usr/bin/env python3
#
# GStreamer
#
# Copyright (C) 2013 Thibault Saunier <tsaunier@gnome.org>
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Library General Public License for more details.
#
# You should have received a copy of the GNU Library General Public
# License along with this library; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Suite 500,
# Boston, MA 02110-1335, USA.
import os
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GES', '1.0')
from gi.repository import Gst, GES, GLib # noqa
class Simple:
def __init__(self, uri):
timeline = GES.Timeline.new_audio_video()
layer = timeline.append_layer()
layer.add_clip(GES.UriClip.new(uri))
self.pipeline = pipeline = GES.Pipeline()
pipeline.set_timeline(timeline)
pipeline.set_state(Gst.State.PLAYING)
bus = pipeline.get_bus()
bus.add_signal_watch()
bus.connect("message", self.bus_message_cb)
self.loop = GLib.MainLoop()
def bus_message_cb(self, unused_bus, message):
if message.type == Gst.MessageType.EOS:
print("eos")
self.loop.quit()
elif message.type == Gst.MessageType.ERROR:
error = message.parse_error()
print("error %s" % error[1])
self.loop.quit()
def start(self):
self.loop.run()
if __name__ == "__main__":
if len(os.sys.argv) != 2:
print("You must specify a file URI")
exit(-1)
Gst.init(None)
GES.init()
simple = Simple(os.sys.argv[1])
simple.start()

1671
ges/ges-asset.c Normal file

File diff suppressed because it is too large

146
ges/ges-asset.h Normal file

@@ -0,0 +1,146 @@
/* GStreamer Editing Services
*
* Copyright (C) 2012 Thibault Saunier <thibault.saunier@collabora.com>
* Copyright (C) 2012 Volodymyr Rudyi <vladimir.rudoy@gmail.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 59 Temple Place - Suite 330,
* Boston, MA 02111-1307, USA.
*/
#pragma once
#include <glib-object.h>
#include <ges/ges-extractable.h>
#include <ges/ges-types.h>
#include <ges/ges-enums.h>
#include <gio/gio.h>
#include <gst/gst.h>
G_BEGIN_DECLS
#define GES_TYPE_ASSET ges_asset_get_type()
GES_DECLARE_TYPE(Asset, asset, ASSET);
/**
* GESAssetLoadingReturn:
* @GES_ASSET_LOADING_ERROR: Indicates that an error occurred
* @GES_ASSET_LOADING_ASYNC: Indicates that the loading is being performed
* asynchronously
* @GES_ASSET_LOADING_OK: Indicates that the loading is complete, without
* error
*/
typedef enum
{
GES_ASSET_LOADING_ERROR,
GES_ASSET_LOADING_ASYNC,
GES_ASSET_LOADING_OK
} GESAssetLoadingReturn;
struct _GESAsset
{
GObject parent;
/* <private> */
GESAssetPrivate *priv;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
/**
* GESAssetClass:
* @start_loading: A method to be called when an asset is being requested
* asynchronously. This will be after the properties of the asset have
* been set, so it is tasked with (re)loading the 'state' of the asset.
* The return value should indicate whether the loading is complete, is
* carrying on asynchronously, or an error occurred. The default
* implementation will simply return that loading is already complete (the
* asset is already in a usable state after the properties have been set).
* @extract: A method that returns a new object of the asset's
* #GESAsset:extractable-type, or %NULL if an error occurs. The default
* implementation will fetch the properties of the #GESExtractable from
* its get_parameters_from_id() class method and set them on a new
* #GESAsset:extractable-type #GObject, which is returned.
* @request_id_update: A method called by a #GESProject when an asset has
* failed to load. @error is the error given by
* ges_asset_request_finish (). Returns: %TRUE if a new id for @self was
* passed to @proposed_new_id.
* @proxied: Deprecated: 1.18: This vmethod is no longer called.
*/
/* FIXME: add documentation for inform_proxy when it is used properly */
struct _GESAssetClass
{
GObjectClass parent;
GESAssetLoadingReturn (*start_loading) (GESAsset *self,
GError **error);
GESExtractable* (*extract) (GESAsset *self,
GError **error);
/* Let subclasses know that we proxied an asset */
void (*inform_proxy) (GESAsset *self,
const gchar *proxy_id);
void (*proxied) (GESAsset *self,
GESAsset *proxy);
/* Ask subclasses for a new ID for @self when the asset failed loading.
* This function returns %TRUE when the ID could be updated or %FALSE
* otherwise */
gboolean (*request_id_update) (GESAsset *self,
gchar **proposed_new_id,
GError *error) ;
gpointer _ges_reserved[GES_PADDING];
};
GES_API
GType ges_asset_get_extractable_type (GESAsset * self);
GES_API
void ges_asset_request_async (GType extractable_type,
const gchar * id,
GCancellable *cancellable,
GAsyncReadyCallback callback,
gpointer user_data);
GES_API
GESAsset * ges_asset_request (GType extractable_type,
const gchar * id,
GError **error);
GES_API
const gchar * ges_asset_get_id (GESAsset* self);
GES_API
GESAsset * ges_asset_request_finish (GAsyncResult *res,
GError **error);
GES_API
GError * ges_asset_get_error (GESAsset * self);
GES_API
GESExtractable * ges_asset_extract (GESAsset * self,
GError **error);
GES_API
GList * ges_list_assets (GType filter);
GES_API
gboolean ges_asset_set_proxy (GESAsset *asset, GESAsset *proxy);
GES_API
gboolean ges_asset_unproxy (GESAsset *asset, GESAsset * proxy);
GES_API
GList * ges_asset_list_proxies (GESAsset *asset);
GES_API
GESAsset * ges_asset_get_proxy_target(GESAsset *proxy);
GES_API
GESAsset * ges_asset_get_proxy (GESAsset *asset);
GES_API
gboolean ges_asset_needs_reload (GType extractable_type,
const gchar * id);
G_END_DECLS
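The asset API above is cache-based: ges_asset_request() hands back the same asset for a given (extractable-type, id) pair, and requests are forwarded to the proxy when one has been set with ges_asset_set_proxy(). A rough pure-Python model of that behaviour (the `Asset`/`request` names are hypothetical stand-ins, not GES API):

```python
class Asset:
    def __init__(self, extractable_type, asset_id):
        self.extractable_type = extractable_type
        self.id = asset_id          # ges_asset_get_id()
        self.proxy = None           # ges_asset_get_proxy()

_cache = {}  # one asset per (extractable-type, id), like GES's internal registry

def request(extractable_type, asset_id):
    """Return the cached asset for (type, id), creating it on first use;
    if the asset has a proxy set, hand out the proxy instead."""
    key = (extractable_type, asset_id)
    if key not in _cache:
        _cache[key] = Asset(extractable_type, asset_id)
    asset = _cache[key]
    return asset.proxy or asset

a = request("GESUriClip", "file:///clip.ogv")
b = request("GESUriClip", "file:///clip.ogv")
assert a is b  # same id, same instance
```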

190
ges/ges-audio-source.c Normal file

@@ -0,0 +1,190 @@
/* GStreamer Editing Services
* Copyright (C) 2009 Edward Hervey <edward.hervey@collabora.co.uk>
* 2009 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
/**
* SECTION:gesaudiosource
* @title: GESAudioSource
* @short_description: Base Class for audio sources
*
* ## Children Properties
*
* You can use the following children properties through the
* #ges_track_element_set_child_property and alike set of methods:
*
* - #gdouble `volume`: volume factor, 1.0=100%.
* - #gboolean `mute`: mute channel.
*
*/
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include "ges-internal.h"
#include "ges/ges-meta-container.h"
#include "ges-track-element.h"
#include "ges-audio-source.h"
#include "ges-layer.h"
struct _GESAudioSourcePrivate
{
GstElement *capsfilter;
GESTrack *current_track;
};
G_DEFINE_ABSTRACT_TYPE_WITH_PRIVATE (GESAudioSource, ges_audio_source,
GES_TYPE_SOURCE);
static void
_sync_element_to_layer_property_float (GESTrackElement * trksrc,
GstElement * element, const gchar * meta, const gchar * propname)
{
GESTimelineElement *parent;
GESLayer *layer;
gfloat value;
parent = ges_timeline_element_get_parent (GES_TIMELINE_ELEMENT (trksrc));
if (!parent) {
GST_DEBUG_OBJECT (trksrc, "Not in a clip... doing nothing");
return;
}
layer = ges_clip_get_layer (GES_CLIP (parent));
gst_object_unref (parent);
if (layer != NULL) {
ges_meta_container_get_float (GES_META_CONTAINER (layer), meta, &value);
g_object_set (element, propname, value, NULL);
GST_DEBUG_OBJECT (trksrc, "Setting %s to %f", propname, value);
gst_object_unref (layer);
} else {
GST_DEBUG_OBJECT (trksrc, "NOT setting the %s", propname);
}
}
static void
restriction_caps_cb (GESTrack * track,
GParamSpec * arg G_GNUC_UNUSED, GESAudioSource * self)
{
GstCaps *caps;
g_object_get (track, "restriction-caps", &caps, NULL);
GST_DEBUG_OBJECT (self, "Setting capsfilter caps to %" GST_PTR_FORMAT, caps);
g_object_set (self->priv->capsfilter, "caps", caps, NULL);
if (caps)
gst_caps_unref (caps);
}
static void
_track_changed_cb (GESAudioSource * self, GParamSpec * arg G_GNUC_UNUSED,
gpointer udata)
{
GESTrack *track = ges_track_element_get_track (GES_TRACK_ELEMENT (self));
if (self->priv->current_track) {
g_signal_handlers_disconnect_by_func (self->priv->current_track,
(GCallback) restriction_caps_cb, self);
}
self->priv->current_track = track;
if (track) {
restriction_caps_cb (track, NULL, self);
g_signal_connect (track, "notify::restriction-caps",
G_CALLBACK (restriction_caps_cb), self);
}
}
static GstElement *
ges_audio_source_create_element (GESTrackElement * trksrc)
{
GstElement *volume, *vbin;
GstElement *topbin;
GstElement *sub_element;
GPtrArray *elements;
GESSourceClass *source_class = GES_SOURCE_GET_CLASS (trksrc);
const gchar *props[] = { "volume", "mute", NULL };
GESAudioSource *self = GES_AUDIO_SOURCE (trksrc);
g_assert (source_class->create_source);
sub_element = source_class->create_source (GES_SOURCE (trksrc));
GST_DEBUG_OBJECT (trksrc, "Creating a bin sub_element ! volume");
vbin =
gst_parse_bin_from_description
("audioconvert ! audioresample ! volume name=v ! capsfilter name=audio-track-caps-filter",
TRUE, NULL);
elements = g_ptr_array_new ();
g_ptr_array_add (elements, vbin);
topbin = ges_source_create_topbin (GES_SOURCE (trksrc), "audiosrcbin",
sub_element, elements);
volume = gst_bin_get_by_name (GST_BIN (vbin), "v");
self->priv->capsfilter = gst_bin_get_by_name (GST_BIN (vbin),
"audio-track-caps-filter");
g_signal_connect (self, "notify::track", (GCallback) _track_changed_cb, NULL);
_track_changed_cb (self, NULL, NULL);
_sync_element_to_layer_property_float (trksrc, volume, GES_META_VOLUME,
"volume");
ges_track_element_add_children_props (trksrc, volume, NULL, NULL, props);
gst_object_unref (volume);
return topbin;
}
static void
ges_audio_source_dispose (GObject * object)
{
GESAudioSource *self = GES_AUDIO_SOURCE (object);
if (self->priv->capsfilter) {
gst_object_unref (self->priv->capsfilter);
self->priv->capsfilter = NULL;
}
G_OBJECT_CLASS (ges_audio_source_parent_class)->dispose (object);
}
static void
ges_audio_source_class_init (GESAudioSourceClass * klass)
{
GObjectClass *gobject_class = G_OBJECT_CLASS (klass);
GESTrackElementClass *track_class = GES_TRACK_ELEMENT_CLASS (klass);
gobject_class->dispose = ges_audio_source_dispose;
track_class->nleobject_factorytype = "nlesource";
track_class->create_element = ges_audio_source_create_element;
track_class->ABI.abi.default_track_type = GES_TRACK_TYPE_AUDIO;
}
static void
ges_audio_source_init (GESAudioSource * self)
{
self->priv = ges_audio_source_get_instance_private (self);
}
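_track_changed_cb above implements a follow-the-track pattern: disconnect the restriction-caps handler from the previous track, remember the new one, sync the capsfilter once immediately, then subscribe to future notify::restriction-caps emissions. A minimal pure-Python sketch of the same pattern (the `Track`/`AudioSource` classes below are stand-ins, not GES types):

```python
class Track:
    """Stand-in for a GESTrack with a notify::restriction-caps signal."""
    def __init__(self, restriction_caps):
        self.restriction_caps = restriction_caps
        self._handlers = []

    def connect(self, cb):
        self._handlers.append(cb)

    def disconnect(self, cb):
        self._handlers.remove(cb)

    def set_restriction_caps(self, caps):
        self.restriction_caps = caps
        for cb in list(self._handlers):
            cb(self)

class AudioSource:
    """Keeps its capsfilter caps in sync with whichever track it is in,
    mirroring _track_changed_cb/restriction_caps_cb."""
    def __init__(self):
        self.current_track = None
        self.filter_caps = None

    def _restriction_caps_cb(self, track):
        self.filter_caps = track.restriction_caps  # capsfilter "caps" property

    def set_track(self, track):
        if self.current_track is not None:
            self.current_track.disconnect(self._restriction_caps_cb)
        self.current_track = track
        if track is not None:
            self._restriction_caps_cb(track)  # sync once immediately
            track.connect(self._restriction_caps_cb)
```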

74
ges/ges-audio-source.h Normal file

@@ -0,0 +1,74 @@
/* GStreamer Editing Services
* Copyright (C) 2009 Edward Hervey <edward.hervey@collabora.co.uk>
* 2009 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#pragma once
#include <glib-object.h>
#include <gst/gst.h>
#include <ges/ges-types.h>
#include <ges/ges-track-element.h>
#include <ges/ges-source.h>
G_BEGIN_DECLS
#define GES_TYPE_AUDIO_SOURCE ges_audio_source_get_type()
GES_DECLARE_TYPE(AudioSource, audio_source, AUDIO_SOURCE);
/**
* GESAudioSource:
*
* Base class for audio sources
*/
struct _GESAudioSource {
/*< private >*/
GESSource parent;
GESAudioSourcePrivate *priv;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
/**
* GESAudioSourceClass:
*/
struct _GESAudioSourceClass {
/*< private >*/
GESSourceClass parent_class;
/*< public >*/
/**
* GESAudioSourceClass::create_source:
* @object: The #GESTrackElement
*
* Returns: (transfer floating): the #GstElement that the underlying nleobject
* controls.
*
* Deprecated: 1.20: Use #GESSourceClass::create_source instead.
*/
GstElement* (*create_source) (GESTrackElement * object);
/*< private >*/
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
G_END_DECLS

217
ges/ges-audio-test-source.c Normal file

@@ -0,0 +1,217 @@
/* GStreamer Editing Services
* Copyright (C) 2010 Brandon Lewis <brandon.lewis@collabora.co.uk>
* 2010 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
/**
* SECTION:gesaudiotestsource
* @title: GESAudioTestSource
* @short_description: produce a simple test waveform or silence
*
* Outputs a test audio stream using audiotestsrc. The default property values
* output silence. Useful for testing pipelines, or to fill gaps in an audio
* track.
*/
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include "ges-internal.h"
#include "ges-track-element.h"
#include "ges-audio-test-source.h"
#define DEFAULT_VOLUME 1.0
struct _GESAudioTestSourcePrivate
{
gdouble freq;
gdouble volume;
};
enum
{
PROP_0,
};
G_DEFINE_TYPE_WITH_PRIVATE (GESAudioTestSource, ges_audio_test_source,
GES_TYPE_AUDIO_SOURCE);
static void ges_audio_test_source_get_property (GObject * object, guint
property_id, GValue * value, GParamSpec * pspec);
static void ges_audio_test_source_set_property (GObject * object, guint
property_id, const GValue * value, GParamSpec * pspec);
static GstElement *ges_audio_test_source_create_source (GESSource * source);
static void
ges_audio_test_source_class_init (GESAudioTestSourceClass * klass)
{
GObjectClass *object_class = G_OBJECT_CLASS (klass);
GESSourceClass *source_class = GES_SOURCE_CLASS (klass);
object_class->get_property = ges_audio_test_source_get_property;
object_class->set_property = ges_audio_test_source_set_property;
source_class->create_source = ges_audio_test_source_create_source;
}
static void
ges_audio_test_source_init (GESAudioTestSource * self)
{
self->priv = ges_audio_test_source_get_instance_private (self);
self->priv->freq = 440;
self->priv->volume = DEFAULT_VOLUME;
}
static void
ges_audio_test_source_get_property (GObject * object,
guint property_id, GValue * value, GParamSpec * pspec)
{
switch (property_id) {
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec);
}
}
static void
ges_audio_test_source_set_property (GObject * object,
guint property_id, const GValue * value, GParamSpec * pspec)
{
switch (property_id) {
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec);
}
}
static GstElement *
ges_audio_test_source_create_source (GESSource * source)
{
GESAudioTestSource *self;
GstElement *ret;
const gchar *props[] = { "volume", "freq", NULL };
self = (GESAudioTestSource *) source;
ret = gst_element_factory_make ("audiotestsrc", NULL);
g_object_set (ret, "volume", (gdouble) self->priv->volume, "freq", (gdouble)
self->priv->freq, NULL);
ges_track_element_add_children_props (GES_TRACK_ELEMENT (self), ret, NULL,
NULL, props);
return ret;
}
/**
* ges_audio_test_source_set_freq:
* @self: a #GESAudioTestSource
* @freq: The frequency you want to apply on @self
*
* Lets you set the frequency applied on the track element
*/
void
ges_audio_test_source_set_freq (GESAudioTestSource * self, gdouble freq)
{
GstElement *element =
ges_track_element_get_element (GES_TRACK_ELEMENT (self));
self->priv->freq = freq;
if (element) {
GValue val = { 0 };
g_value_init (&val, G_TYPE_DOUBLE);
g_value_set_double (&val, freq);
ges_track_element_set_child_property (GES_TRACK_ELEMENT (self), "freq",
&val);
}
}
/**
* ges_audio_test_source_set_volume:
* @self: a #GESAudioTestSource
* @volume: The volume you want to apply on @self
*
* Sets the volume of the test audio signal.
*/
void
ges_audio_test_source_set_volume (GESAudioTestSource * self, gdouble volume)
{
GstElement *element =
ges_track_element_get_element (GES_TRACK_ELEMENT (self));
self->priv->volume = volume;
if (element) {
GValue val = { 0 };
g_value_init (&val, G_TYPE_DOUBLE);
g_value_set_double (&val, volume);
ges_track_element_set_child_property (GES_TRACK_ELEMENT (self), "volume",
&val);
}
}
/**
* ges_audio_test_source_get_freq:
* @self: a #GESAudioTestSource
*
* Get the current frequency of @self.
*
* Returns: The current frequency of @self.
*/
double
ges_audio_test_source_get_freq (GESAudioTestSource * self)
{
GValue val = { 0 };
ges_track_element_get_child_property (GES_TRACK_ELEMENT (self), "freq", &val);
return g_value_get_double (&val);
}
/**
* ges_audio_test_source_get_volume:
* @self: a #GESAudioTestSource
*
* Get the current volume of @self.
*
* Returns: The current volume of @self
*/
double
ges_audio_test_source_get_volume (GESAudioTestSource * self)
{
GValue val = { 0 };
ges_track_element_get_child_property (GES_TRACK_ELEMENT (self), "volume",
&val);
return g_value_get_double (&val);
}
/* Creates a new #GESAudioTestSource.
*
* Returns: (transfer floating) (nullable): The newly created #GESAudioTestSource.
*/
GESAudioTestSource *
ges_audio_test_source_new (void)
{
GESAudioTestSource *res;
GESAsset *asset = ges_asset_request (GES_TYPE_AUDIO_TEST_SOURCE, NULL, NULL);
res = GES_AUDIO_TEST_SOURCE (ges_asset_extract (asset, NULL));
gst_object_unref (asset);
return res;
}
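ges_audio_test_source_set_freq()/set_volume() above follow a common deferred child-property pattern: cache the value on the object, and push it to the backing element only once that element exists. Sketched in plain Python (hypothetical names; a dict stands in for the GstElement):

```python
class AudioTestSource:
    """Deferred child-property pattern from set_freq()/set_volume():
    cache the value, push it to the element once it exists."""
    def __init__(self):
        self.freq = 440.0
        self.volume = 1.0
        self.element = None  # created later, like create_source()

    def set_freq(self, freq):
        self.freq = freq
        if self.element is not None:
            self.element["freq"] = freq  # set_child_property("freq", ...)

    def create_source(self):
        # apply the cached values to the freshly created element
        self.element = {"freq": self.freq, "volume": self.volume}
        return self.element
```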

ges/ges-audio-test-source.h Normal file

@@ -0,0 +1,70 @@
/* GStreamer Editing Services
* Copyright (C) 2010 Brandon Lewis <brandon.lewis@collabora.co.uk>
* 2010 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#pragma once
#include <glib-object.h>
#include <ges/ges-types.h>
#include <ges/ges-audio-source.h>
G_BEGIN_DECLS
#define GES_TYPE_AUDIO_TEST_SOURCE ges_audio_test_source_get_type()
GES_DECLARE_TYPE(AudioTestSource, audio_test_source, AUDIO_TEST_SOURCE);
/**
* GESAudioTestSource:
*
* ### Children Properties
*
* {{ libs/GESAudioTestSource-children-props.md }}
*/
struct _GESAudioTestSource {
GESAudioSource parent;
/*< private >*/
GESAudioTestSourcePrivate *priv;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
struct _GESAudioTestSourceClass {
/*< private >*/
GESAudioSourceClass parent_class;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
GES_API
void ges_audio_test_source_set_freq(GESAudioTestSource *self,
gdouble freq);
GES_API
void ges_audio_test_source_set_volume(GESAudioTestSource *self,
gdouble volume);
GES_API
double ges_audio_test_source_get_freq(GESAudioTestSource *self);
GES_API
double ges_audio_test_source_get_volume(GESAudioTestSource *self);
G_END_DECLS

206
ges/ges-audio-track.c Normal file

@@ -0,0 +1,206 @@
/* GStreamer Editing Services
* Copyright (C) <2013> Thibault Saunier <thibault.saunier@collabora.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
/**
* SECTION: gesaudiotrack
* @title: GESAudioTrack
* @short_description: A standard #GESTrack for raw audio
*
* A #GESAudioTrack is a default audio #GESTrack, with a
* #GES_TRACK_TYPE_AUDIO #GESTrack:track-type and "audio/x-raw(ANY)"
* #GESTrack:caps.
*
* By default, an audio track will have its #GESTrack:restriction-caps
* set to "audio/x-raw" with the following properties:
*
* - format: "S32LE"
* - channels: 2
* - rate: 44100
* - layout: "interleaved"
*
* These fields are needed for negotiation purposes, but you can change
* their values if you wish. It is advised that you do so using
* ges_track_update_restriction_caps() with new values for the fields you
* wish to change, and any additional fields you may want to add. Unlike
* using ges_track_set_restriction_caps(), this will ensure that these
* default fields will at least have some value set.
*/
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include "ges-internal.h"
#include "ges-smart-adder.h"
#include "ges-audio-track.h"
#define DEFAULT_CAPS "audio/x-raw"
#if G_BYTE_ORDER == G_LITTLE_ENDIAN
#define DEFAULT_RESTRICTION_CAPS "audio/x-raw, format=S32LE, channels=2, "\
"rate=44100, layout=interleaved"
#else
#define DEFAULT_RESTRICTION_CAPS "audio/x-raw, format=S32BE, channels=2, "\
"rate=44100, layout=interleaved"
#endif
struct _GESAudioTrackPrivate
{
gpointer nothing;
};
G_DEFINE_TYPE_WITH_PRIVATE (GESAudioTrack, ges_audio_track, GES_TYPE_TRACK);
/****************************************************
* Private methods and utils *
****************************************************/
static void
_sync_capsfilter_with_track (GESTrack * track, GstElement * capsfilter)
{
GstCaps *restriction, *caps;
gint rate;
GstStructure *structure;
g_object_get (track, "restriction-caps", &restriction, NULL);
if (restriction == NULL)
return;
if (gst_caps_get_size (restriction) == 0)
goto done;
structure = gst_caps_get_structure (restriction, 0);
if (!gst_structure_get_int (structure, "rate", &rate))
goto done;
caps = gst_caps_new_simple ("audio/x-raw", "rate", G_TYPE_INT, rate, NULL);
g_object_set (capsfilter, "caps", caps, NULL);
gst_caps_unref (caps);
done:
gst_caps_unref (restriction);
}
static void
_track_restriction_changed_cb (GESTrack * track, GParamSpec * arg G_GNUC_UNUSED,
GstElement * capsfilter)
{
_sync_capsfilter_with_track (track, capsfilter);
}
static void
_weak_notify_cb (GESTrack * track, GstElement * capsfilter)
{
g_signal_handlers_disconnect_by_func (track,
(GCallback) _track_restriction_changed_cb, capsfilter);
}
static GstElement *
create_element_for_raw_audio_gap (GESTrack * track)
{
GstElement *bin;
GstElement *capsfilter;
bin = gst_parse_bin_from_description
("audiotestsrc wave=silence name=src ! audioconvert ! audioresample ! audioconvert ! capsfilter name=gapfilter caps=audio/x-raw",
TRUE, NULL);
capsfilter = gst_bin_get_by_name (GST_BIN (bin), "gapfilter");
g_object_weak_ref (G_OBJECT (capsfilter), (GWeakNotify) _weak_notify_cb,
track);
g_signal_connect (track, "notify::restriction-caps",
(GCallback) _track_restriction_changed_cb, capsfilter);
_sync_capsfilter_with_track (track, capsfilter);
gst_object_unref (capsfilter);
return bin;
}
/****************************************************
* GObject vmethods implementations *
****************************************************/
static void
ges_audio_track_init (GESAudioTrack * self)
{
self->priv = ges_audio_track_get_instance_private (self);
}
static void
ges_audio_track_finalize (GObject * object)
{
/* TODO: Add deinitialization code here */
G_OBJECT_CLASS (ges_audio_track_parent_class)->finalize (object);
}
static void
ges_audio_track_class_init (GESAudioTrackClass * klass)
{
GObjectClass *object_class = G_OBJECT_CLASS (klass);
/* GESTrackClass *parent_class = GES_TRACK_CLASS (klass);
*/
object_class->finalize = ges_audio_track_finalize;
GES_TRACK_CLASS (klass)->get_mixing_element = ges_smart_adder_new;
}
/****************************************************
* API implementation *
****************************************************/
/**
* ges_audio_track_new:
*
* Creates a new audio track, with a #GES_TRACK_TYPE_AUDIO
* #GESTrack:track-type, "audio/x-raw(ANY)" #GESTrack:caps, and
* "audio/x-raw" #GESTrack:restriction-caps with the properties:
*
* - format: "S32LE"
* - channels: 2
* - rate: 44100
* - layout: "interleaved"
*
* You should use ges_track_update_restriction_caps() if you wish to
* modify these fields, or add additional ones.
*
* Returns: (transfer floating): The newly created audio track.
*/
GESAudioTrack *
ges_audio_track_new (void)
{
GESAudioTrack *ret;
GstCaps *caps = gst_caps_from_string (DEFAULT_CAPS);
GstCaps *restriction_caps = gst_caps_from_string (DEFAULT_RESTRICTION_CAPS);
ret = g_object_new (GES_TYPE_AUDIO_TRACK, "caps", caps,
"track-type", GES_TRACK_TYPE_AUDIO, NULL);
ges_track_set_create_element_for_gap_func (GES_TRACK (ret),
create_element_for_raw_audio_gap);
ges_track_set_restriction_caps (GES_TRACK (ret), restriction_caps);
gst_caps_unref (caps);
gst_caps_unref (restriction_caps);
return ret;
}
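The distinction the documentation above draws between ges_track_update_restriction_caps() and ges_track_set_restriction_caps() is essentially merge versus replace. Modelled on plain dicts (an analogy only — real restriction caps are GstCaps structures, not dicts):

```python
DEFAULT_RESTRICTION = {"format": "S32LE", "channels": 2,
                       "rate": 44100, "layout": "interleaved"}

def update_restriction_caps(current, new_fields):
    """Merge semantics: override or add fields, keep the rest
    (what ges_track_update_restriction_caps() is recommended for)."""
    merged = dict(current)
    merged.update(new_fields)
    return merged

def set_restriction_caps(current, new_fields):
    """Replace semantics: defaults that are not restated are lost
    (ges_track_set_restriction_caps())."""
    return dict(new_fields)

caps = update_restriction_caps(DEFAULT_RESTRICTION, {"rate": 48000})
assert caps["rate"] == 48000 and caps["channels"] == 2  # defaults survive
```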

54
ges/ges-audio-track.h Normal file

@@ -0,0 +1,54 @@
/* GStreamer Editing Services
* Copyright (C) <2013> Thibault Saunier <thibault.saunier@collabora.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#pragma once
#include <glib-object.h>
#include "ges-track.h"
#include "ges-types.h"
G_BEGIN_DECLS
#define GES_TYPE_AUDIO_TRACK (ges_audio_track_get_type ())
GES_DECLARE_TYPE(AudioTrack, audio_track, AUDIO_TRACK);
struct _GESAudioTrackClass
{
GESTrackClass parent_class;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
struct _GESAudioTrack
{
GESTrack parent_instance;
/*< private >*/
GESAudioTrackPrivate *priv;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
GES_API
GESAudioTrack* ges_audio_track_new (void);
G_END_DECLS

307
ges/ges-audio-transition.c Normal file

@@ -0,0 +1,307 @@
/* GStreamer Editing Services
* Copyright (C) 2010 Brandon Lewis <brandon.lewis@collabora.co.uk>
* 2010 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
/**
* SECTION:gesaudiotransition
* @title: GESAudioTransition
* @short_description: implements audio crossfade transition
*/
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include "ges-internal.h"
#include "ges-track-element.h"
#include "ges-audio-transition.h"
#include <gst/controller/gstdirectcontrolbinding.h>
struct _GESAudioTransitionPrivate
{
/* these enable volume interpolation. Unlike video, both inputs are adjusted
* simultaneously */
GstControlSource *a_control_source;
GstControlSource *b_control_source;
};
enum
{
PROP_0,
};
G_DEFINE_TYPE_WITH_PRIVATE (GESAudioTransition, ges_audio_transition,
GES_TYPE_TRANSITION);
#define fast_element_link(a,b) gst_element_link_pads_full((a),"src",(b),"sink",GST_PAD_LINK_CHECK_NOTHING)
static void
ges_audio_transition_duration_changed (GESTrackElement * self, guint64 duration);
static GstElement *ges_audio_transition_create_element (GESTrackElement * self);
static void ges_audio_transition_dispose (GObject * object);
static void ges_audio_transition_finalize (GObject * object);
static void ges_audio_transition_get_property (GObject * object, guint
property_id, GValue * value, GParamSpec * pspec);
static void ges_audio_transition_set_property (GObject * object, guint
property_id, const GValue * value, GParamSpec * pspec);
static void
duration_changed_cb (GESTrackElement * self, GParamSpec * arg G_GNUC_UNUSED)
{
ges_audio_transition_duration_changed (self,
ges_timeline_element_get_duration (GES_TIMELINE_ELEMENT (self)));
}
static void
ges_audio_transition_class_init (GESAudioTransitionClass * klass)
{
GObjectClass *object_class;
GESTrackElementClass *toclass;
object_class = G_OBJECT_CLASS (klass);
toclass = GES_TRACK_ELEMENT_CLASS (klass);
object_class->get_property = ges_audio_transition_get_property;
object_class->set_property = ges_audio_transition_set_property;
object_class->dispose = ges_audio_transition_dispose;
object_class->finalize = ges_audio_transition_finalize;
toclass->create_element = ges_audio_transition_create_element;
toclass->ABI.abi.default_track_type = GES_TRACK_TYPE_AUDIO;
}
static void
ges_audio_transition_init (GESAudioTransition * self)
{
self->priv = ges_audio_transition_get_instance_private (self);
}
static void
ges_audio_transition_dispose (GObject * object)
{
GESAudioTransition *self;
self = GES_AUDIO_TRANSITION (object);
if (self->priv->a_control_source) {
gst_object_unref (self->priv->a_control_source);
self->priv->a_control_source = NULL;
}
if (self->priv->b_control_source) {
gst_object_unref (self->priv->b_control_source);
self->priv->b_control_source = NULL;
}
g_signal_handlers_disconnect_by_func (GES_TRACK_ELEMENT (self),
duration_changed_cb, NULL);
G_OBJECT_CLASS (ges_audio_transition_parent_class)->dispose (object);
}
static void
ges_audio_transition_finalize (GObject * object)
{
G_OBJECT_CLASS (ges_audio_transition_parent_class)->finalize (object);
}
static void
ges_audio_transition_get_property (GObject * object,
guint property_id, GValue * value, GParamSpec * pspec)
{
switch (property_id) {
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec);
}
}
static void
ges_audio_transition_set_property (GObject * object,
guint property_id, const GValue * value, GParamSpec * pspec)
{
switch (property_id) {
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec);
}
}
static GObject *
link_element_to_mixer_with_volume (GstBin * bin, GstElement * element,
GstElement * mixer)
{
GstElement *volume = gst_element_factory_make ("volume", NULL);
GstElement *resample = gst_element_factory_make ("audioresample", NULL);
gst_bin_add (bin, volume);
gst_bin_add (bin, resample);
if (!fast_element_link (element, volume) ||
!fast_element_link (volume, resample) ||
!gst_element_link_pads_full (resample, "src", mixer, "sink_%u",
GST_PAD_LINK_CHECK_NOTHING))
GST_ERROR_OBJECT (bin, "Error linking volume to mixer");
return G_OBJECT (volume);
}
static GstElement *
ges_audio_transition_create_element (GESTrackElement * track_element)
{
GESAudioTransition *self;
GstElement *topbin, *iconva, *iconvb, *oconv;
GObject *atarget, *btarget = NULL;
const gchar *propname = "volume";
GstElement *mixer = NULL;
GstPad *sinka_target, *sinkb_target, *src_target, *sinka, *sinkb, *src;
guint64 duration;
GstControlSource *acontrol_source, *bcontrol_source;
self = GES_AUDIO_TRANSITION (track_element);
GST_LOG ("creating an audio bin");
topbin = gst_bin_new ("transition-bin");
iconva = gst_element_factory_make ("audioconvert", "tr-aconv-a");
iconvb = gst_element_factory_make ("audioconvert", "tr-aconv-b");
oconv = gst_element_factory_make ("audioconvert", "tr-aconv-output");
gst_bin_add_many (GST_BIN (topbin), iconva, iconvb, oconv, NULL);
mixer = gst_element_factory_make ("audiomixer", NULL);
gst_bin_add (GST_BIN (topbin), mixer);
atarget = link_element_to_mixer_with_volume (GST_BIN (topbin), iconva, mixer);
btarget = link_element_to_mixer_with_volume (GST_BIN (topbin), iconvb, mixer);
g_assert (atarget && btarget);
fast_element_link (mixer, oconv);
sinka_target = gst_element_get_static_pad (iconva, "sink");
sinkb_target = gst_element_get_static_pad (iconvb, "sink");
src_target = gst_element_get_static_pad (oconv, "src");
sinka = gst_ghost_pad_new ("sinka", sinka_target);
sinkb = gst_ghost_pad_new ("sinkb", sinkb_target);
src = gst_ghost_pad_new ("src", src_target);
gst_element_add_pad (topbin, src);
gst_element_add_pad (topbin, sinka);
gst_element_add_pad (topbin, sinkb);
/* set up interpolation */
gst_object_unref (sinka_target);
gst_object_unref (sinkb_target);
gst_object_unref (src_target);
acontrol_source = gst_interpolation_control_source_new ();
g_object_set (acontrol_source, "mode", GST_INTERPOLATION_MODE_LINEAR, NULL);
bcontrol_source = gst_interpolation_control_source_new ();
g_object_set (bcontrol_source, "mode", GST_INTERPOLATION_MODE_LINEAR, NULL);
self->priv->a_control_source = acontrol_source;
self->priv->b_control_source = bcontrol_source;
duration =
ges_timeline_element_get_duration (GES_TIMELINE_ELEMENT (track_element));
ges_audio_transition_duration_changed (track_element, duration);
g_signal_connect (track_element, "notify::duration",
G_CALLBACK (duration_changed_cb), NULL);
gst_object_add_control_binding (GST_OBJECT (atarget),
gst_direct_control_binding_new (GST_OBJECT (atarget), propname,
acontrol_source));
gst_object_add_control_binding (GST_OBJECT (btarget),
gst_direct_control_binding_new (GST_OBJECT (btarget), propname,
bcontrol_source));
return topbin;
}
static void
ges_audio_transition_duration_changed (GESTrackElement * track_element,
guint64 duration)
{
GESAudioTransition *self;
GstElement *nleobj = ges_track_element_get_nleobject (track_element);
GstTimedValueControlSource *ta, *tb;
self = GES_AUDIO_TRANSITION (track_element);
GST_INFO ("updating controller: nleobj (%p)", nleobj);
if (G_UNLIKELY ((!self->priv->a_control_source ||
!self->priv->b_control_source)))
return;
GST_INFO ("setting values on controller");
ta = GST_TIMED_VALUE_CONTROL_SOURCE (self->priv->a_control_source);
tb = GST_TIMED_VALUE_CONTROL_SOURCE (self->priv->b_control_source);
gst_timed_value_control_source_unset_all (ta);
gst_timed_value_control_source_unset_all (tb);
/* The volume property goes from 0 to 10, so we want to interpolate between
* 0 and 0.1 */
gst_timed_value_control_source_set (ta, 0, 0.1);
gst_timed_value_control_source_set (ta, duration, 0.0);
gst_timed_value_control_source_set (tb, 0, 0.0);
gst_timed_value_control_source_set (tb, duration, 0.1);
GST_INFO ("done updating controller");
}
/**
* ges_audio_transition_new:
*
* Creates a new #GESAudioTransition.
*
* Returns: (transfer floating): The newly created #GESAudioTransition.
*
* Deprecated: 1.18: This should never be called by applications, as these
* transitions are created by clips.
*/
GESAudioTransition *
ges_audio_transition_new (void)
{
GESAudioTransition *res;
GESAsset *asset = ges_asset_request (GES_TYPE_AUDIO_TRANSITION, NULL, NULL);
res = GES_AUDIO_TRANSITION (ges_asset_extract (asset, NULL));
gst_object_unref (asset);
return res;
}

@ -0,0 +1,57 @@
/* GStreamer Editing Services
* Copyright (C) 2010 Brandon Lewis <brandon.lewis@collabora.co.uk>
* 2010 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#pragma once
#include <glib-object.h>
#include <ges/ges-types.h>
#include <ges/ges-transition.h>
G_BEGIN_DECLS
#define GES_TYPE_AUDIO_TRANSITION ges_audio_transition_get_type()
GES_DECLARE_TYPE(AudioTransition, audio_transition, AUDIO_TRANSITION);
/**
* GESAudioTransition:
*
*/
struct _GESAudioTransition {
GESTransition parent;
/*< private >*/
GESAudioTransitionPrivate *priv;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
struct _GESAudioTransitionClass {
GESTransitionClass parent_class;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
GES_DEPRECATED
GESAudioTransition* ges_audio_transition_new (void);
G_END_DECLS

180
ges/ges-audio-uri-source.c Normal file
@ -0,0 +1,180 @@
/* GStreamer Editing Services
* Copyright (C) 2009 Edward Hervey <edward.hervey@collabora.co.uk>
* 2009 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
/**
* SECTION:gesaudiourisource
* @title: GESAudioUriSource
* @short_description: outputs a single audio stream from a given file
*/
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include "ges-utils.h"
#include "ges-internal.h"
#include "ges-track-element.h"
#include "ges-uri-source.h"
#include "ges-audio-uri-source.h"
#include "ges-uri-asset.h"
#include "ges-extractable.h"
struct _GESAudioUriSourcePrivate
{
GESUriSource parent;
};
enum
{
PROP_0,
PROP_URI
};
/* GESSource VMethod */
static GstElement *
ges_audio_uri_source_create_source (GESSource * element)
{
return ges_uri_source_create_source (GES_AUDIO_URI_SOURCE (element)->priv);
}
/* Extractable interface implementation */
static gchar *
ges_extractable_check_id (GType type, const gchar * id, GError ** error)
{
return g_strdup (id);
}
static void
ges_extractable_interface_init (GESExtractableInterface * iface)
{
iface->asset_type = GES_TYPE_URI_SOURCE_ASSET;
iface->check_id = ges_extractable_check_id;
}
G_DEFINE_TYPE_WITH_CODE (GESAudioUriSource, ges_audio_uri_source,
GES_TYPE_AUDIO_SOURCE, G_ADD_PRIVATE (GESAudioUriSource)
G_IMPLEMENT_INTERFACE (GES_TYPE_EXTRACTABLE,
ges_extractable_interface_init));
/* GObject VMethods */
static gboolean
_get_natural_framerate (GESTimelineElement * self, gint * framerate_n,
gint * framerate_d)
{
if (self->parent)
return ges_timeline_element_get_natural_framerate (self->parent,
framerate_n, framerate_d);
return FALSE;
}
static void
ges_audio_uri_source_get_property (GObject * object, guint property_id,
GValue * value, GParamSpec * pspec)
{
GESAudioUriSource *uriclip = GES_AUDIO_URI_SOURCE (object);
switch (property_id) {
case PROP_URI:
g_value_set_string (value, uriclip->uri);
break;
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec);
}
}
static void
ges_audio_uri_source_set_property (GObject * object, guint property_id,
const GValue * value, GParamSpec * pspec)
{
GESAudioUriSource *uriclip = GES_AUDIO_URI_SOURCE (object);
switch (property_id) {
case PROP_URI:
if (uriclip->uri) {
GST_WARNING_OBJECT (object, "Uri already set to %s", uriclip->uri);
return;
}
uriclip->priv->uri = uriclip->uri = g_value_dup_string (value);
break;
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec);
}
}
static void
ges_audio_uri_source_finalize (GObject * object)
{
GESAudioUriSource *uriclip = GES_AUDIO_URI_SOURCE (object);
g_free (uriclip->uri);
G_OBJECT_CLASS (ges_audio_uri_source_parent_class)->finalize (object);
}
static void
ges_audio_uri_source_class_init (GESAudioUriSourceClass * klass)
{
GObjectClass *object_class = G_OBJECT_CLASS (klass);
GESTimelineElementClass *element_class = GES_TIMELINE_ELEMENT_CLASS (klass);
GESSourceClass *src_class = GES_SOURCE_CLASS (klass);
object_class->get_property = ges_audio_uri_source_get_property;
object_class->set_property = ges_audio_uri_source_set_property;
object_class->finalize = ges_audio_uri_source_finalize;
/**
* GESAudioUriSource:uri:
*
* The location of the file/resource to use.
*/
g_object_class_install_property (object_class, PROP_URI,
g_param_spec_string ("uri", "URI", "uri of the resource",
NULL, G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY));
element_class->get_natural_framerate = _get_natural_framerate;
src_class->select_pad = ges_uri_source_select_pad;
src_class->create_source = ges_audio_uri_source_create_source;
}
static void
ges_audio_uri_source_init (GESAudioUriSource * self)
{
self->priv = ges_audio_uri_source_get_instance_private (self);
ges_uri_source_init (GES_TRACK_ELEMENT (self), self->priv);
}
/**
* ges_audio_uri_source_new:
* @uri: the URI the source should control
*
* Creates a new #GESAudioUriSource for the provided @uri.
*
* Returns: (transfer floating) (nullable): The newly created
* #GESAudioUriSource, or %NULL if there was an error.
*/
GESAudioUriSource *
ges_audio_uri_source_new (gchar * uri)
{
return g_object_new (GES_TYPE_AUDIO_URI_SOURCE, "uri", uri, NULL);
}

@ -0,0 +1,60 @@
/* GStreamer Editing Services
* Copyright (C) 2009 Edward Hervey <edward.hervey@collabora.co.uk>
* 2009 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#pragma once
#include <glib-object.h>
#include <ges/ges-types.h>
#include <ges/ges-audio-source.h>
G_BEGIN_DECLS
typedef struct _GESUriSource GESUriSource;
#define GES_TYPE_AUDIO_URI_SOURCE ges_audio_uri_source_get_type()
GES_DECLARE_TYPE(AudioUriSource, audio_uri_source, AUDIO_URI_SOURCE);
/**
* GESAudioUriSource:
*
* ### Children Properties
*
* {{ libs/GESAudioUriSource-children-props.md }}
*/
struct _GESAudioUriSource {
/*< private >*/
GESAudioSource parent;
gchar *uri;
GESUriSource *priv;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
struct _GESAudioUriSourceClass {
/*< private >*/
GESAudioSourceClass parent_class;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
G_END_DECLS

222
ges/ges-auto-transition.c Normal file
@ -0,0 +1,222 @@
/* -*- Mode: C; indent-tabs-mode: nil; c-basic-offset: 2; tab-width: 2 -*- */
/*
* gst-editing-services
* Copyright (C) 2013 Thibault Saunier <thibault.saunier@collabora.com>
*
* gst-editing-services is free software: you can redistribute it and/or modify it
* under the terms of the GNU Lesser General Public License as published
* by the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* gst-editing-services is distributed in the hope that it will be useful, but
* WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
* See the GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*/
/* This class wraps a GESBaseTransitionClip, allowing any implementation
* of GESBaseTransitionClip to be used.
*
* NOTE: This is for internal use exclusively
*/
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include "ges-auto-transition.h"
#include "ges-internal.h"
enum
{
DESTROY_ME,
LAST_SIGNAL
};
static guint auto_transition_signals[LAST_SIGNAL] = { 0 };
G_DEFINE_TYPE (GESAutoTransition, ges_auto_transition, G_TYPE_OBJECT);
static void
neighbour_changed_cb (G_GNUC_UNUSED GObject * object,
G_GNUC_UNUSED GParamSpec * arg, GESAutoTransition * self)
{
gint64 new_duration;
guint32 layer_prio;
GESLayer *layer;
GESTimeline *timeline;
if (self->frozen) {
GST_LOG_OBJECT (self, "Not updating because frozen");
return;
}
if (self->positioning) {
/* this can happen when the transition is moved between layers, as the
* layer may resync its priorities */
GST_LOG_OBJECT (self, "Not updating because positioning");
return;
}
layer_prio = GES_TIMELINE_ELEMENT_LAYER_PRIORITY (self->next_source);
if (layer_prio != GES_TIMELINE_ELEMENT_LAYER_PRIORITY (self->previous_source)) {
GST_DEBUG_OBJECT (self, "Destroy changed layer");
g_signal_emit (self, auto_transition_signals[DESTROY_ME], 0);
return;
}
new_duration =
(_START (self->previous_source) +
_DURATION (self->previous_source)) - _START (self->next_source);
if (new_duration <= 0 || new_duration >= _DURATION (self->previous_source)
|| new_duration >= _DURATION (self->next_source)) {
GST_DEBUG_OBJECT (self, "Destroy %" G_GINT64_FORMAT " not a valid duration",
new_duration);
g_signal_emit (self, auto_transition_signals[DESTROY_ME], 0);
return;
}
timeline = GES_TIMELINE_ELEMENT_TIMELINE (self->transition_clip);
layer = timeline ? ges_timeline_get_layer (timeline, layer_prio) : NULL;
if (!layer) {
GST_DEBUG_OBJECT (self, "Destroy no layer");
g_signal_emit (self, auto_transition_signals[DESTROY_ME], 0);
return;
}
self->positioning = TRUE;
GES_TIMELINE_ELEMENT_SET_BEING_EDITED (self->transition_clip);
_set_start0 (GES_TIMELINE_ELEMENT (self->transition_clip),
_START (self->next_source));
_set_duration0 (GES_TIMELINE_ELEMENT (self->transition_clip), new_duration);
ges_clip_move_to_layer (self->transition_clip, layer);
GES_TIMELINE_ELEMENT_UNSET_BEING_EDITED (self->transition_clip);
self->positioning = FALSE;
gst_object_unref (layer);
}
static void
_track_changed_cb (GESTrackElement * track_element,
GParamSpec * arg G_GNUC_UNUSED, GESAutoTransition * self)
{
if (self->frozen) {
GST_LOG_OBJECT (self, "Not updating because frozen");
return;
}
if (ges_track_element_get_track (track_element) == NULL) {
GST_DEBUG_OBJECT (self, "Neighbour %" GST_PTR_FORMAT
" removed from track ... auto destructing", track_element);
g_signal_emit (self, auto_transition_signals[DESTROY_ME], 0);
}
}
static void
_connect_to_source (GESAutoTransition * self, GESTrackElement * source)
{
g_signal_connect (source, "notify::start",
G_CALLBACK (neighbour_changed_cb), self);
g_signal_connect_after (source, "notify::priority",
G_CALLBACK (neighbour_changed_cb), self);
g_signal_connect (source, "notify::duration",
G_CALLBACK (neighbour_changed_cb), self);
g_signal_connect (source, "notify::track",
G_CALLBACK (_track_changed_cb), self);
}
static void
_disconnect_from_source (GESAutoTransition * self, GESTrackElement * source)
{
g_signal_handlers_disconnect_by_func (source, neighbour_changed_cb, self);
g_signal_handlers_disconnect_by_func (source, _track_changed_cb, self);
}
void
ges_auto_transition_set_source (GESAutoTransition * self,
GESTrackElement * source, GESEdge edge)
{
/* Disconnect the source being replaced, not unconditionally the
* previous one */
if (edge == GES_EDGE_END) {
_disconnect_from_source (self, self->next_source);
self->next_source = source;
} else {
_disconnect_from_source (self, self->previous_source);
self->previous_source = source;
}
_connect_to_source (self, source);
}
static void
ges_auto_transition_init (GESAutoTransition * ges_auto_transition)
{
}
static void
ges_auto_transition_finalize (GObject * object)
{
GESAutoTransition *self = GES_AUTO_TRANSITION (object);
_disconnect_from_source (self, self->previous_source);
_disconnect_from_source (self, self->next_source);
G_OBJECT_CLASS (ges_auto_transition_parent_class)->finalize (object);
}
static void
ges_auto_transition_class_init (GESAutoTransitionClass * klass)
{
GObjectClass *object_class = G_OBJECT_CLASS (klass);
auto_transition_signals[DESTROY_ME] =
g_signal_new ("destroy-me", G_TYPE_FROM_CLASS (klass),
G_SIGNAL_RUN_FIRST | G_SIGNAL_NO_RECURSE, 0, NULL, NULL, NULL,
G_TYPE_NONE, 0);
object_class->finalize = ges_auto_transition_finalize;
}
GESAutoTransition *
ges_auto_transition_new (GESTrackElement * transition,
GESTrackElement * previous_source, GESTrackElement * next_source)
{
GESAutoTransition *self = g_object_new (GES_TYPE_AUTO_TRANSITION, NULL);
self->frozen = FALSE;
self->previous_source = previous_source;
self->next_source = next_source;
self->transition = transition;
self->transition_clip = GES_CLIP (GES_TIMELINE_ELEMENT_PARENT (transition));
_connect_to_source (self, previous_source);
_connect_to_source (self, next_source);
GST_DEBUG_OBJECT (self, "Created transition %" GST_PTR_FORMAT
" between %" GST_PTR_FORMAT "[%" GST_TIME_FORMAT
" - %" GST_TIME_FORMAT "] and: %" GST_PTR_FORMAT
"[%" GST_TIME_FORMAT " - %" GST_TIME_FORMAT "]"
" in layer nb %" G_GUINT32_FORMAT ", start: %" GST_TIME_FORMAT
" duration: %" GST_TIME_FORMAT, transition, previous_source,
GST_TIME_ARGS (_START (previous_source)),
GST_TIME_ARGS (_END (previous_source)),
next_source,
GST_TIME_ARGS (_START (next_source)),
GST_TIME_ARGS (_END (next_source)),
GES_TIMELINE_ELEMENT_LAYER_PRIORITY (next_source),
GST_TIME_ARGS (_START (transition)),
GST_TIME_ARGS (_DURATION (transition)));
return self;
}
void
ges_auto_transition_update (GESAutoTransition * self)
{
GST_INFO ("Updating info %s",
GES_TIMELINE_ELEMENT_NAME (self->transition_clip));
neighbour_changed_cb (NULL, NULL, self);
}

66
ges/ges-auto-transition.h Normal file
@ -0,0 +1,66 @@
/* -*- Mode: C; indent-tabs-mode: nil; c-basic-offset: 2; tab-width: 2 -*- */
/*
* gst-editing-services
* Copyright (C) 2013 Thibault Saunier <thibault.saunier@collabora.com>
*
* gst-editing-services is free software: you can redistribute it and/or modify it
* under the terms of the GNU Lesser General Public License as published
* by the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* gst-editing-services is distributed in the hope that it will be useful, but
* WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
* See the GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*/
#pragma once
#include <glib-object.h>
#include "ges-track-element.h"
#include "ges-clip.h"
#include "ges-layer.h"
G_BEGIN_DECLS
#define GES_TYPE_AUTO_TRANSITION (ges_auto_transition_get_type ())
typedef struct _GESAutoTransitionClass GESAutoTransitionClass;
typedef struct _GESAutoTransition GESAutoTransition;
GES_DECLARE_TYPE(AutoTransition, auto_transition, AUTO_TRANSITION);
struct _GESAutoTransitionClass
{
GObjectClass parent_class;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
struct _GESAutoTransition
{
GObject parent_instance;
/* <read only and construct only> */
GESTrackElement *previous_source;
GESTrackElement *next_source;
GESTrackElement *transition;
GESClip *transition_clip;
gboolean positioning;
gboolean frozen;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
G_GNUC_INTERNAL void ges_auto_transition_update (GESAutoTransition *self);
G_GNUC_INTERNAL GESAutoTransition * ges_auto_transition_new (GESTrackElement * transition,
GESTrackElement * previous_source,
GESTrackElement * next_source);
G_END_DECLS

@ -0,0 +1,85 @@
/* GStreamer Editing Services
* Copyright (C) 2011 Thibault Saunier <thibault.saunier@collabora.co.uk>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
/**
* SECTION: gesbaseeffectclip
* @title: GESBaseEffectClip
* @short_description: An effect in a #GESLayer
*
* #GESBaseEffectClip-s are clips whose core elements are
* #GESBaseEffect-s.
*
* ## Effects
*
* #GESBaseEffectClip-s can have **additional** #GESBaseEffect-s added as
* non-core elements. These additional effects are applied to the output
* of the core effects of the clip that they share a #GESTrack with. See
* #GESClip for how to add and move these effects from the clip.
*
* Note that you cannot add time effects to #GESBaseEffectClip, neither
* as core children, nor as additional effects.
*/
/* FIXME: properly handle the priority of the children. How should we sort
* the priority of effects when two #GESBaseEffectClip's overlap? */
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include <ges/ges.h>
#include "ges-internal.h"
#include "ges-types.h"
struct _GESBaseEffectClipPrivate
{
void *nothing;
};
G_DEFINE_ABSTRACT_TYPE_WITH_PRIVATE (GESBaseEffectClip, ges_base_effect_clip,
GES_TYPE_OPERATION_CLIP);
static gboolean
ges_base_effect_clip_add_child (GESContainer * container,
GESTimelineElement * element)
{
if (GES_IS_TIME_EFFECT (element)) {
GST_WARNING_OBJECT (container, "Cannot add %" GES_FORMAT " as a child "
"because it is a time effect", GES_ARGS (element));
return FALSE;
}
return
GES_CONTAINER_CLASS (ges_base_effect_clip_parent_class)->add_child
(container, element);
}
static void
ges_base_effect_clip_class_init (GESBaseEffectClipClass * klass)
{
GESContainerClass *container_class = GES_CONTAINER_CLASS (klass);
GES_CLIP_CLASS_CAN_ADD_EFFECTS (klass) = TRUE;
container_class->add_child = ges_base_effect_clip_add_child;
}
static void
ges_base_effect_clip_init (GESBaseEffectClip * self)
{
self->priv = ges_base_effect_clip_get_instance_private (self);
}

@ -0,0 +1,56 @@
/* GStreamer Editing Services
* Copyright (C) 2011 Thibault Saunier <thibault.saunier@collabora.co.uk>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#pragma once
#include <glib-object.h>
#include <ges/ges-types.h>
G_BEGIN_DECLS
#define GES_TYPE_BASE_EFFECT_CLIP ges_base_effect_clip_get_type()
GES_DECLARE_TYPE(BaseEffectClip, base_effect_clip, BASE_EFFECT_CLIP);
/**
* GESBaseEffectClip:
*/
struct _GESBaseEffectClip {
/*< private >*/
GESOperationClip parent;
GESBaseEffectClipPrivate *priv;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
/**
* GESBaseEffectClipClass:
*
*/
struct _GESBaseEffectClipClass {
/*< private >*/
GESOperationClipClass parent_class;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
G_END_DECLS

429
ges/ges-base-effect.c Normal file
@ -0,0 +1,429 @@
/* GStreamer Editing Services
* Copyright (C) 2010 Thibault Saunier <tsaunier@gnome.org>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
/**
* SECTION:gesbaseeffect
* @title: GESBaseEffect
* @short_description: adds an effect to a stream in a GESSourceClip or a
* GESLayer
*
* A #GESBaseEffect is some operation that applies an effect to the data
* it receives.
*
* ## Time Effects
*
* Some operations will change the timing of the stream data they receive
* in some way. In particular, the #GstElement that they wrap could alter
* the times of the segment they receive in a #GST_EVENT_SEGMENT event,
* or the times of a seek they receive in a #GST_EVENT_SEEK event. Such
* operations would be considered time effects since they translate the
* times they receive on their source to different times at their sink,
* and vice versa. This introduces two sets of time coordinates for the
* event: (internal) sink coordinates and (internal) source coordinates,
* where segment times are translated from the sink coordinates to the
* source coordinates, and seek times are translated from the source
* coordinates to the sink coordinates.
*
* If you use such an effect in GES, you will need to inform GES of the
* properties that control the timing with
* ges_base_effect_register_time_property(), and the effect's timing
* behaviour using ges_base_effect_set_time_translation_funcs().
*
* Note that a time effect should not have its
* #GESTrackElement:has-internal-source set to %TRUE.
*
* In addition, note that GES only *fully* supports time effects whose
* mapping from the source to sink coordinates (those applied to seeks)
* obeys:
*
* + Maps the time `0` to `0`. So initial time-shifting effects are
* excluded.
* + Is monotonically increasing. So reversing effects, and effects that
* jump backwards in the stream are excluded.
* + Can handle a reasonable #GstClockTime, relative to the project. So
* this would exclude a time effect with an extremely large speed-up
* that would cause the converted #GstClockTime seeks to overflow.
* + Is 'continuously reversible'. This essentially means that for every
* time in the sink coordinates, we can, to 'good enough' accuracy,
* calculate the corresponding time in the source coordinates. Moreover,
* this should correspond to how segment times are translated from
* sink to source.
* + Only depends on the registered time properties, rather than the
* state of the #GstElement or the data it receives. This would exclude,
* say, an effect that would speedup if there is more red in the image
* it receives.
*
* Note that a constant-rate-change effect that is not extremely fast or
* slow would satisfy these conditions. For such effects, you may wish to
* use ges_effect_class_register_rate_property().
*/
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include <glib/gprintf.h>
#include "ges-utils.h"
#include "ges-internal.h"
#include "ges-track-element.h"
#include "ges-base-effect.h"
typedef struct _TimePropertyData
{
gchar *property_name;
GObject *child;
GParamSpec *pspec;
} TimePropertyData;
static void
_time_property_data_free (gpointer data_p)
{
TimePropertyData *data = data_p;
g_free (data->property_name);
gst_object_unref (data->child);
g_param_spec_unref (data->pspec);
g_free (data);
}
struct _GESBaseEffectPrivate
{
GList *time_properties;
GESBaseEffectTimeTranslationFunc source_to_sink;
GESBaseEffectTimeTranslationFunc sink_to_source;
gpointer translation_data;
GDestroyNotify destroy_translation_data;
};
G_DEFINE_ABSTRACT_TYPE_WITH_PRIVATE (GESBaseEffect, ges_base_effect,
GES_TYPE_OPERATION);
static gboolean
ges_base_effect_set_child_property_full (GESTimelineElement * element,
GObject * child, GParamSpec * pspec, const GValue * value, GError ** error)
{
GESClip *parent = GES_IS_CLIP (element->parent) ?
GES_CLIP (element->parent) : NULL;
if (parent && !ges_clip_can_set_time_property_of_child (parent,
GES_TRACK_ELEMENT (element), child, pspec, value, error)) {
GST_INFO_OBJECT (element, "Cannot set time property '%s::%s' "
"because the parent clip %" GES_FORMAT " would not allow it",
G_OBJECT_TYPE_NAME (child), pspec->name, GES_ARGS (parent));
return FALSE;
}
return
GES_TIMELINE_ELEMENT_CLASS
(ges_base_effect_parent_class)->set_child_property_full (element, child,
pspec, value, error);
}
static void
ges_base_effect_dispose (GObject * object)
{
GESBaseEffectPrivate *priv = GES_BASE_EFFECT (object)->priv;
g_list_free_full (priv->time_properties, _time_property_data_free);
priv->time_properties = NULL;
if (priv->destroy_translation_data)
priv->destroy_translation_data (priv->translation_data);
priv->destroy_translation_data = NULL;
priv->source_to_sink = NULL;
priv->sink_to_source = NULL;
G_OBJECT_CLASS (ges_base_effect_parent_class)->dispose (object);
}
static void
ges_base_effect_class_init (GESBaseEffectClass * klass)
{
GObjectClass *object_class = G_OBJECT_CLASS (klass);
GESTimelineElementClass *element_class = GES_TIMELINE_ELEMENT_CLASS (klass);
object_class->dispose = ges_base_effect_dispose;
element_class->set_child_property_full =
ges_base_effect_set_child_property_full;
}
static void
ges_base_effect_init (GESBaseEffect * self)
{
self->priv = ges_base_effect_get_instance_private (self);
}
static void
_child_property_removed (GESTimelineElement * element, GObject * child,
GParamSpec * pspec, gpointer user_data)
{
GList *tmp;
GESBaseEffectPrivate *priv = GES_BASE_EFFECT (element)->priv;
for (tmp = priv->time_properties; tmp; tmp = tmp->next) {
TimePropertyData *data = tmp->data;
if (data->child == child && data->pspec == pspec) {
priv->time_properties = g_list_remove (priv->time_properties, data);
_time_property_data_free (data);
return;
}
}
}
/**
* ges_base_effect_register_time_property:
* @effect: A #GESBaseEffect
* @child_property_name: The name of the child property to register as
* a time property
*
* Register a child property of the effect as a property that, when set,
* can change the timing of its input data. The child property should be
* specified as in ges_timeline_element_lookup_child().
*
* You should also set the corresponding time translation using
* ges_base_effect_set_time_translation_funcs().
*
* Note that @effect must not be part of a clip, nor can it have
* #GESTrackElement:has-internal-source set to %TRUE.
*
* Returns: %TRUE if the child property was found and newly registered.
* Since: 1.18
*/
gboolean
ges_base_effect_register_time_property (GESBaseEffect * effect,
const gchar * child_property_name)
{
GESTimelineElement *element;
GESTrackElement *el;
GParamSpec *pspec;
GObject *child;
GList *tmp;
TimePropertyData *data;
g_return_val_if_fail (GES_IS_BASE_EFFECT (effect), FALSE);
el = GES_TRACK_ELEMENT (effect);
element = GES_TIMELINE_ELEMENT (el);
g_return_val_if_fail (element->parent == NULL, FALSE);
g_return_val_if_fail (ges_track_element_has_internal_source (el) == FALSE,
FALSE);
if (!ges_timeline_element_lookup_child (element, child_property_name,
&child, &pspec))
return FALSE;
for (tmp = effect->priv->time_properties; tmp; tmp = tmp->next) {
data = tmp->data;
if (data->child == child && data->pspec == pspec) {
GST_WARNING_OBJECT (effect, "Time property '%s' is already registered",
child_property_name);
g_object_unref (child);
g_param_spec_unref (pspec);
return FALSE;
}
}
ges_track_element_set_has_internal_source_is_forbidden (el);
data = g_new0 (TimePropertyData, 1);
data->child = child;
data->pspec = pspec;
data->property_name = g_strdup (child_property_name);
effect->priv->time_properties =
g_list_prepend (effect->priv->time_properties, data);
g_signal_handlers_disconnect_by_func (effect, _child_property_removed, NULL);
g_signal_connect (effect, "child-property-removed",
G_CALLBACK (_child_property_removed), NULL);
return TRUE;
}
/**
* ges_base_effect_set_time_translation_funcs:
* @effect: A #GESBaseEffect
* @source_to_sink_func: (nullable) (scope notified): The function to use
* for querying how a time is translated from the source coordinates to
* the sink coordinates of @effect
* @sink_to_source_func: (nullable) (scope notified): The function to use
* for querying how a time is translated from the sink coordinates to the
* source coordinates of @effect
* @user_data: (closure): Data to pass to both @source_to_sink_func and
* @sink_to_source_func
* @destroy: (destroy user_data) (nullable): Method to call to destroy
* @user_data, or %NULL
*
* Set the time translation query functions for the time effect. If an
* effect is a time effect, it will have two sets of coordinates: one
* at its sink and one at its source. The given functions should be able
* to translate between these two sets of coordinates. More specifically,
* @source_to_sink_func should *emulate* how the corresponding #GstElement
* would translate the #GstSegment @time field, and @sink_to_source_func
* should emulate how the corresponding #GstElement would translate the
* seek query @start and @stop values, as used in gst_element_seek(). As
* such, @sink_to_source_func should act as an approximate reverse of
* @source_to_sink_func.
*
* Note, these functions will be passed a table of time properties, as
* registered in ges_base_effect_register_time_property(), and their
* values. The functions should emulate what the translation *would* be
* *if* the time properties were set to the given values. They should not
* use the currently set values.
*
* Note that @effect must not be part of a clip, nor can it have
* #GESTrackElement:has-internal-source set to %TRUE.
*
* Returns: %TRUE if the translation functions were set.
* Since: 1.18
*/
gboolean
ges_base_effect_set_time_translation_funcs (GESBaseEffect * effect,
GESBaseEffectTimeTranslationFunc source_to_sink_func,
GESBaseEffectTimeTranslationFunc sink_to_source_func,
gpointer user_data, GDestroyNotify destroy)
{
GESTimelineElement *element;
GESTrackElement *el;
GESBaseEffectPrivate *priv;
g_return_val_if_fail (GES_IS_BASE_EFFECT (effect), FALSE);
element = GES_TIMELINE_ELEMENT (effect);
el = GES_TRACK_ELEMENT (element);
g_return_val_if_fail (element->parent == NULL, FALSE);
g_return_val_if_fail (ges_track_element_has_internal_source (el) == FALSE,
FALSE);
ges_track_element_set_has_internal_source_is_forbidden (el);
priv = effect->priv;
if (priv->destroy_translation_data)
priv->destroy_translation_data (priv->translation_data);
priv->translation_data = user_data;
priv->destroy_translation_data = destroy;
priv->source_to_sink = source_to_sink_func;
priv->sink_to_source = sink_to_source_func;
return TRUE;
}
/**
* ges_base_effect_is_time_effect:
* @effect: A #GESBaseEffect
*
* Get whether the effect is considered a time effect or not. An effect
* with registered time properties or set translation functions is
* considered a time effect.
*
* Returns: %TRUE if @effect is considered a time effect.
* Since: 1.18
*/
gboolean
ges_base_effect_is_time_effect (GESBaseEffect * effect)
{
GESBaseEffectPrivate *priv;
g_return_val_if_fail (GES_IS_BASE_EFFECT (effect), FALSE);
priv = effect->priv;
if (priv->time_properties || priv->source_to_sink || priv->sink_to_source)
return TRUE;
return FALSE;
}
gchar *
ges_base_effect_get_time_property_name (GESBaseEffect * effect,
GObject * child, GParamSpec * pspec)
{
GList *tmp;
for (tmp = effect->priv->time_properties; tmp; tmp = tmp->next) {
TimePropertyData *data = tmp->data;
if (data->pspec == pspec && data->child == child)
return g_strdup (data->property_name);
}
return NULL;
}
static void
_gvalue_free (gpointer data)
{
GValue *val = data;
g_value_unset (val);
g_free (val);
}
GHashTable *
ges_base_effect_get_time_property_values (GESBaseEffect * effect)
{
GList *tmp;
GHashTable *ret =
g_hash_table_new_full (g_str_hash, g_str_equal, g_free, _gvalue_free);
for (tmp = effect->priv->time_properties; tmp; tmp = tmp->next) {
TimePropertyData *data = tmp->data;
GValue *value = g_new0 (GValue, 1);
/* FIXME: once we move to GLib 2.60, g_object_get_property() will
* automatically initialize the type */
g_value_init (value, data->pspec->value_type);
g_object_get_property (data->child, data->pspec->name, value);
g_hash_table_insert (ret, g_strdup (data->property_name), value);
}
return ret;
}
GstClockTime
ges_base_effect_translate_source_to_sink_time (GESBaseEffect * effect,
GstClockTime time, GHashTable * time_property_values)
{
GESBaseEffectPrivate *priv = effect->priv;
if (!GST_CLOCK_TIME_IS_VALID (time))
return GST_CLOCK_TIME_NONE;
if (priv->source_to_sink)
return priv->source_to_sink (effect, time, time_property_values,
priv->translation_data);
if (time_property_values && g_hash_table_size (time_property_values))
GST_ERROR_OBJECT (effect, "The time effect is missing its source to "
"sink translation function");
return time;
}
GstClockTime
ges_base_effect_translate_sink_to_source_time (GESBaseEffect * effect,
GstClockTime time, GHashTable * time_property_values)
{
GESBaseEffectPrivate *priv = effect->priv;
if (!GST_CLOCK_TIME_IS_VALID (time))
return GST_CLOCK_TIME_NONE;
if (priv->sink_to_source)
return priv->sink_to_source (effect, time, time_property_values,
priv->translation_data);
if (time_property_values && g_hash_table_size (time_property_values))
GST_ERROR_OBJECT (effect, "The time effect is missing its sink to "
"source translation function");
return time;
}

ges/ges-base-effect.h Normal file

@ -0,0 +1,94 @@
/* GStreamer Editing Services
* Copyright (C) 2010 Thibault Saunier <thibault.saunier@collabora.co.uk>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#pragma once
#include <glib-object.h>
#include <ges/ges-types.h>
#include <ges/ges-operation.h>
G_BEGIN_DECLS
#define GES_TYPE_BASE_EFFECT ges_base_effect_get_type()
GES_DECLARE_TYPE(BaseEffect, base_effect, BASE_EFFECT);
/**
* GESBaseEffect:
*/
struct _GESBaseEffect
{
/*< private > */
GESOperation parent;
GESBaseEffectPrivate *priv;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
/**
* GESBaseEffectClass:
* @parent_class: parent class
*/
struct _GESBaseEffectClass
{
/*< private > */
GESOperationClass parent_class;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
/**
* GESBaseEffectTimeTranslationFunc:
* @effect: The #GESBaseEffect that is doing the time translation
* @time: The #GstClockTime to translate
* @time_property_values: (element-type gchar* GValue*): A table of child
* property name/value pairs
* @user_data: Data passed to ges_base_effect_set_time_translation_funcs()
*
* A function for querying how an effect would translate a time if it had
* the given child property values set. The keys of @time_property_values
* will be the same strings that were passed to
* ges_base_effect_register_time_property(), and the values will be the
* #GValue-s of the corresponding child properties. You should always use
* the values given in @time_property_values rather than the currently set
* values.
*
* Returns: The translated time.
* Since: 1.18
*/
typedef GstClockTime (*GESBaseEffectTimeTranslationFunc) (GESBaseEffect * effect,
GstClockTime time,
GHashTable * time_property_values,
gpointer user_data);
GES_API gboolean
ges_base_effect_register_time_property (GESBaseEffect * effect,
const gchar * child_property_name);
GES_API gboolean
ges_base_effect_set_time_translation_funcs (GESBaseEffect * effect,
GESBaseEffectTimeTranslationFunc source_to_sink_func,
GESBaseEffectTimeTranslationFunc sink_to_source_func,
gpointer user_data,
GDestroyNotify destroy);
GES_API gboolean
ges_base_effect_is_time_effect (GESBaseEffect * effect);
G_END_DECLS

ges/ges-base-transition-clip.c Normal file
@ -0,0 +1,51 @@
/* GStreamer Editing Services
* Copyright (C) 2009 Edward Hervey <edward.hervey@collabora.co.uk>
* 2009 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
/**
* SECTION: gesbasetransitionclip
* @title: GESBaseTransitionClip
* @short_description: Base classes for transitions
*/
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include <ges/ges.h>
#include "ges-internal.h"
struct _GESBaseTransitionClipPrivate
{
/* Dummy variable */
void *nothing;
};
G_DEFINE_ABSTRACT_TYPE_WITH_PRIVATE (GESBaseTransitionClip,
ges_base_transition_clip, GES_TYPE_OPERATION_CLIP);
static void
ges_base_transition_clip_class_init (GESBaseTransitionClipClass * klass)
{
}
static void
ges_base_transition_clip_init (GESBaseTransitionClip * self)
{
self->priv = ges_base_transition_clip_get_instance_private (self);
}

ges/ges-base-transition-clip.h Normal file
@ -0,0 +1,60 @@
/* GStreamer Editing Services
* Copyright (C) 2009 Edward Hervey <edward.hervey@collabora.co.uk>
* 2009 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#pragma once
#include "ges-operation-clip.h"
#include <glib-object.h>
#include <ges/ges-types.h>
G_BEGIN_DECLS
#define GES_TYPE_BASE_TRANSITION_CLIP ges_base_transition_clip_get_type()
GES_DECLARE_TYPE(BaseTransitionClip, base_transition_clip, BASE_TRANSITION_CLIP);
/**
* GESBaseTransitionClip:
*/
struct _GESBaseTransitionClip {
/*< private >*/
GESOperationClip parent;
/*< private >*/
GESBaseTransitionClipPrivate *priv;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
/**
* GESBaseTransitionClipClass:
*
*/
struct _GESBaseTransitionClipClass {
/*< private >*/
GESOperationClipClass parent_class;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING];
};
G_END_DECLS

ges/ges-base-xml-formatter.c Normal file
File diff suppressed because it is too large
ges/ges-base-xml-formatter.h Normal file

@ -0,0 +1,62 @@
/* GStreamer Editing Services
*
* Copyright (C) <2012> Thibault Saunier <thibault.saunier@collabora.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 59 Temple Place - Suite 330,
* Boston, MA 02111-1307, USA.
*/
#include "ges-formatter.h"
#pragma once
G_BEGIN_DECLS
typedef struct _GESBaseXmlFormatter GESBaseXmlFormatter;
typedef struct _GESBaseXmlFormatterClass GESBaseXmlFormatterClass;
#define GES_TYPE_BASE_XML_FORMATTER (ges_base_xml_formatter_get_type ())
GES_DECLARE_TYPE(BaseXmlFormatter, base_xml_formatter, BASE_XML_FORMATTER);
/**
* GESBaseXmlFormatter:
*/
struct _GESBaseXmlFormatter
{
GESFormatter parent;
/*< public > */
/* <private> */
GESBaseXmlFormatterPrivate *priv;
gchar *xmlcontent;
gpointer _ges_reserved[GES_PADDING - 1];
};
/**
* GESBaseXmlFormatterClass:
*/
struct _GESBaseXmlFormatterClass
{
GESFormatterClass parent;
/* Should be overridden by subclasses */
GMarkupParser content_parser;
GString * (*save) (GESFormatter *formatter, GESTimeline *timeline, GError **error);
gpointer _ges_reserved[GES_PADDING];
};
G_END_DECLS

ges/ges-clip-asset.c Normal file

@ -0,0 +1,238 @@
/* GStreamer Editing Services
*
* Copyright (C) <2011> Thibault Saunier <thibault.saunier@collabora.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 59 Temple Place - Suite 330,
* Boston, MA 02111-1307, USA.
*/
/**
* SECTION: gesclipasset
* @title: GESClipAsset
* @short_description: A GESAsset subclass specialized in GESClip extraction
*
* The #GESClipAsset is a special #GESAsset specialized in #GESClip
* extraction. It is mostly used to get information about the
* #GESTrackType-s for which the objects extracted from it can
* potentially create #GESTrackElement-s.
*/
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include "ges-clip-asset.h"
#include "ges-source-clip.h"
#include "ges-internal.h"
#define GES_CLIP_ASSET_GET_PRIVATE(o)\
(G_TYPE_INSTANCE_GET_PRIVATE ((o), GES_TYPE_CLIP_ASSET, \
GESClipAssetPrivate))
#define parent_class ges_clip_asset_parent_class
struct _GESClipAssetPrivate
{
GESTrackType supportedformats;
};
enum
{
PROP_0,
PROP_SUPPORTED_FORMATS,
PROP_LAST
};
static GParamSpec *properties[PROP_LAST];
G_DEFINE_TYPE_WITH_PRIVATE (GESClipAsset, ges_clip_asset, GES_TYPE_ASSET);
/***********************************************
* *
* GObject vmethods implementation *
* *
***********************************************/
static void
_get_property (GObject * object, guint property_id,
GValue * value, GParamSpec * pspec)
{
GESClipAssetPrivate *priv = GES_CLIP_ASSET (object)->priv;
switch (property_id) {
case PROP_SUPPORTED_FORMATS:
g_value_set_flags (value, priv->supportedformats);
break;
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec);
}
}
static void
_set_property (GObject * object, guint property_id,
const GValue * value, GParamSpec * pspec)
{
GESClipAssetPrivate *priv = GES_CLIP_ASSET (object)->priv;
switch (property_id) {
case PROP_SUPPORTED_FORMATS:
priv->supportedformats = g_value_get_flags (value);
break;
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, property_id, pspec);
}
}
static void
ges_clip_asset_init (GESClipAsset * self)
{
self->priv = ges_clip_asset_get_instance_private (self);
}
static void
_constructed (GObject * object)
{
GType extractable_type = ges_asset_get_extractable_type (GES_ASSET (object));
GObjectClass *class = g_type_class_ref (extractable_type);
GParamSpecFlags *pspec;
pspec = G_PARAM_SPEC_FLAGS (g_object_class_find_property (class,
"supported-formats"));
GES_CLIP_ASSET (object)->priv->supportedformats = pspec->default_value;
g_type_class_unref (class);
G_OBJECT_CLASS (parent_class)->constructed (object);
}
static void
ges_clip_asset_class_init (GESClipAssetClass * self_class)
{
GObjectClass *object_class = G_OBJECT_CLASS (self_class);
object_class->constructed = _constructed;
object_class->get_property = _get_property;
object_class->set_property = _set_property;
/**
* GESClipAsset:supported-formats:
*
* The formats supported by the asset.
*/
properties[PROP_SUPPORTED_FORMATS] = g_param_spec_flags ("supported-formats",
"Supported formats", "Formats supported by the asset",
GES_TYPE_TRACK_TYPE, GES_TRACK_TYPE_AUDIO | GES_TRACK_TYPE_VIDEO,
G_PARAM_READWRITE | G_PARAM_CONSTRUCT);
g_object_class_install_property (object_class, PROP_SUPPORTED_FORMATS,
properties[PROP_SUPPORTED_FORMATS]);
}
/***********************************************
* *
* Public methods *
* *
***********************************************/
/**
* ges_clip_asset_set_supported_formats:
* @self: a #GESClipAsset
* @supportedformats: The track types supported by the #GESClipAsset
*
* Sets the track types for which objects extracted from @self can create
* #GESTrackElement-s.
*/
void
ges_clip_asset_set_supported_formats (GESClipAsset * self,
GESTrackType supportedformats)
{
g_return_if_fail (GES_IS_CLIP_ASSET (self));
self->priv->supportedformats = supportedformats;
}
/**
* ges_clip_asset_get_supported_formats:
* @self: a #GESClipAsset
*
* Gets the track types for which objects extracted from @self can create
* #GESTrackElement-s.
*
* Returns: The track types for which @self will create #GESTrackElement-s
* when added to a layer
*/
GESTrackType
ges_clip_asset_get_supported_formats (GESClipAsset * self)
{
g_return_val_if_fail (GES_IS_CLIP_ASSET (self), GES_TRACK_TYPE_UNKNOWN);
return self->priv->supportedformats;
}
/**
* ges_clip_asset_get_natural_framerate:
* @self: The object from which to retrieve the natural framerate
* @framerate_n: The framerate numerator
* @framerate_d: The framerate denominator
*
* Returns: %TRUE if @self has a natural framerate, %FALSE otherwise
*
* Since: 1.18
*/
gboolean
ges_clip_asset_get_natural_framerate (GESClipAsset * self,
gint * framerate_n, gint * framerate_d)
{
GESClipAssetClass *klass;
g_return_val_if_fail (GES_IS_CLIP_ASSET (self), FALSE);
g_return_val_if_fail (framerate_n && framerate_d, FALSE);
klass = GES_CLIP_ASSET_GET_CLASS (self);
*framerate_n = 0;
*framerate_d = -1;
if (klass->get_natural_framerate)
return klass->get_natural_framerate (self, framerate_n, framerate_d);
return FALSE;
}
/**
* ges_clip_asset_get_frame_time:
* @self: The object for which to compute the timestamp of the specified frame
* @frame_number: The frame number we want the internal time coordinate timestamp of
*
* Converts the given frame number into a timestamp, using the "natural" frame
* rate of the asset.
*
* You can use this to reference a specific frame in a media file and use this
* as, for example, the `in-point` or `max-duration` of a #GESClip.
*
* Returns: The timestamp corresponding to @frame_number in the element source, given
* in internal time coordinates, or %GST_CLOCK_TIME_NONE if the clip asset does not have a
* natural frame rate.
*
* Since: 1.18
*/
GstClockTime
ges_clip_asset_get_frame_time (GESClipAsset * self, GESFrameNumber frame_number)
{
gint fps_n, fps_d;
g_return_val_if_fail (GES_IS_CLIP_ASSET (self), GST_CLOCK_TIME_NONE);
g_return_val_if_fail (GES_FRAME_NUMBER_IS_VALID (frame_number),
GST_CLOCK_TIME_NONE);
if (!ges_clip_asset_get_natural_framerate (self, &fps_n, &fps_d))
return GST_CLOCK_TIME_NONE;
return gst_util_uint64_scale_ceil (frame_number, fps_d * GST_SECOND, fps_n);
}

ges/ges-clip-asset.h Normal file

@ -0,0 +1,73 @@
/* GStreamer Editing Services
*
* Copyright (C) 2012 Thibault Saunier <thibault.saunier@collabora.com>
* Copyright (C) 2012 Volodymyr Rudyi <vladimir.rudoy@gmail.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 59 Temple Place - Suite 330,
* Boston, MA 02111-1307, USA.
*/
#pragma once
#include <glib-object.h>
#include <ges/ges-types.h>
#include <ges/ges-asset.h>
G_BEGIN_DECLS
#define GES_TYPE_CLIP_ASSET (ges_clip_asset_get_type ())
GES_DECLARE_TYPE(ClipAsset, clip_asset, CLIP_ASSET);
struct _GESClipAsset
{
GESAsset parent;
/* <private> */
GESClipAssetPrivate *priv;
gpointer _ges_reserved[GES_PADDING];
};
struct _GESClipAssetClass
{
GESAssetClass parent;
/**
* GESClipAssetClass::get_natural_framerate:
* @self: A #GESClipAsset
* @framerate_n: The framerate numerator to retrieve
* @framerate_d: The framerate denominator to retrieve
*
* Returns: %TRUE if @self has a natural framerate @FALSE otherwise.
*
* Since: 1.18
*/
gboolean (*get_natural_framerate) (GESClipAsset *self, gint *framerate_n, gint *framerate_d);
gpointer _ges_reserved[GES_PADDING - 1];
};
GES_API
void ges_clip_asset_set_supported_formats (GESClipAsset *self,
GESTrackType supportedformats);
GES_API
GESTrackType ges_clip_asset_get_supported_formats (GESClipAsset *self);
GES_API
gboolean ges_clip_asset_get_natural_framerate (GESClipAsset* self, gint* framerate_n, gint* framerate_d);
GES_API
GstClockTime ges_clip_asset_get_frame_time (GESClipAsset* self, GESFrameNumber frame_number);
G_END_DECLS

ges/ges-clip.c Normal file
File diff suppressed because it is too large
ges/ges-clip.h Normal file

@ -0,0 +1,246 @@
/* GStreamer Editing Services
* Copyright (C) 2009 Edward Hervey <edward.hervey@collabora.co.uk>
* 2009 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#pragma once
#include <glib-object.h>
#include <gst/gst.h>
#include <ges/ges-timeline-element.h>
#include <ges/ges-container.h>
#include <ges/ges-types.h>
#include <ges/ges-track.h>
G_BEGIN_DECLS
#define GES_TYPE_CLIP ges_clip_get_type()
GES_DECLARE_TYPE(Clip, clip, CLIP);
/**
* GES_CLIP_CLASS_CAN_ADD_EFFECTS:
* @klass: A #GESClipClass
*
* Whether the class allows for the user to add additional non-core
* #GESBaseEffect-s to clips from this class.
*/
#define GES_CLIP_CLASS_CAN_ADD_EFFECTS(klass) ((GES_CLIP_CLASS (klass))->ABI.abi.can_add_effects)
/**
* GESFillTrackElementFunc:
* @clip: The #GESClip controlling the track elements
* @track_element: The #GESTrackElement
* @nleobj: The nleobject that needs to be filled
*
* A function that will be called when the nleobject of a corresponding
* track element needs to be filled.
*
* The implementer of this function shall add the proper #GstElement to @nleobj
* using gst_bin_add().
*
* Deprecated: 1.18: This method type is no longer used.
*
* Returns: %TRUE if the implementer successfully filled the @nleobj.
*/
typedef gboolean (*GESFillTrackElementFunc) (GESClip *clip, GESTrackElement *track_element,
GstElement *nleobj);
/**
* GESCreateTrackElementFunc:
* @clip: A #GESClip
* @type: A #GESTrackType to create a #GESTrackElement for
*
* A method for creating the core #GESTrackElement of a clip, to be added
* to a #GESTrack of the given track type.
*
* If a clip may produce several track elements per track type,
* #GESCreateTrackElementsFunc is more appropriate.
*
* Returns: (transfer floating) (nullable): The #GESTrackElement created
* by @clip, or %NULL if @clip can not provide a track element for the
* given @type or an error occurred.
*/
typedef GESTrackElement *(*GESCreateTrackElementFunc) (GESClip * clip, GESTrackType type);
/**
* GESCreateTrackElementsFunc:
* @clip: A #GESClip
* @type: A #GESTrackType to create #GESTrackElement-s for
*
* A method for creating the core #GESTrackElement-s of a clip, to be
* added to #GESTrack-s of the given track type.
*
* Returns: (transfer container) (element-type GESTrackElement): A list of
* the #GESTrackElement-s created by @clip for the given @type, or %NULL
* if no track elements are created or an error occurred.
*/
typedef GList * (*GESCreateTrackElementsFunc) (GESClip * clip, GESTrackType type);
/**
* GESClip:
*/
struct _GESClip
{
GESContainer parent;
/*< private >*/
GESClipPrivate *priv;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING_LARGE];
};
/**
* GESClipClass:
* @create_track_element: Method to create the core #GESTrackElement of a clip
* of this class. If a clip of this class may create several track elements per
* track type, this should be left as %NULL, and
* GESClipClass::create_track_elements should be used instead. Otherwise, you
* should implement this class method and leave
* GESClipClass::create_track_elements as the default implementation
* @create_track_elements: Method to create the (multiple) core
* #GESTrackElement-s of a clip of this class. If
* GESClipClass::create_track_element is implemented, this should be kept as the
* default implementation
* @can_add_effects: Whether the user can add additional non-core
* #GESBaseEffect-s to clips from this class, to be applied to the output data
* of the core elements.
*/
struct _GESClipClass
{
/*< private > */
GESContainerClass parent_class;
/*< public > */
GESCreateTrackElementFunc create_track_element;
GESCreateTrackElementsFunc create_track_elements;
/*< private >*/
/* Padding for API extension */
union {
gpointer _ges_reserved[GES_PADDING_LARGE];
struct {
gboolean can_add_effects;
} abi;
} ABI;
};
/****************************************************
* TrackElement handling *
****************************************************/
GES_API
GESTrackType ges_clip_get_supported_formats (GESClip *clip);
GES_API
void ges_clip_set_supported_formats (GESClip *clip,
GESTrackType supportedformats);
GES_API
GESTrackElement* ges_clip_add_asset (GESClip *clip,
GESAsset *asset);
GES_API
GESTrackElement* ges_clip_find_track_element (GESClip *clip,
GESTrack *track,
GType type);
GES_API
GList * ges_clip_find_track_elements (GESClip * clip,
GESTrack * track,
GESTrackType track_type,
GType type);
GES_API
GESTrackElement * ges_clip_add_child_to_track (GESClip * clip,
GESTrackElement * child,
GESTrack * track,
GError ** error);
/****************************************************
* Layer *
****************************************************/
GES_API
GESLayer* ges_clip_get_layer (GESClip * clip);
GES_API
gboolean ges_clip_move_to_layer (GESClip * clip,
GESLayer * layer);
GES_API
gboolean ges_clip_move_to_layer_full (GESClip * clip,
GESLayer * layer,
GError ** error);
/****************************************************
* Effects *
****************************************************/
GES_API
gboolean ges_clip_add_top_effect (GESClip * clip,
GESBaseEffect * effect,
gint index,
GError ** error);
GES_API
gboolean ges_clip_remove_top_effect (GESClip * clip,
GESBaseEffect * effect,
GError ** error);
GES_API
GList* ges_clip_get_top_effects (GESClip * clip);
GES_API
gint ges_clip_get_top_effect_position (GESClip * clip,
GESBaseEffect * effect);
GES_API
gint ges_clip_get_top_effect_index (GESClip * clip,
GESBaseEffect * effect);
GES_API
gboolean ges_clip_set_top_effect_priority (GESClip * clip,
GESBaseEffect * effect,
guint newpriority);
GES_API
gboolean ges_clip_set_top_effect_index (GESClip * clip,
GESBaseEffect * effect,
guint newindex);
GES_API
gboolean ges_clip_set_top_effect_index_full (GESClip * clip,
GESBaseEffect * effect,
guint newindex,
GError ** error);
/****************************************************
* Editing *
****************************************************/
GES_API
GESClip* ges_clip_split (GESClip *clip,
guint64 position);
GES_API
GESClip* ges_clip_split_full (GESClip *clip,
guint64 position,
GError ** error);
GES_API
GstClockTime ges_clip_get_internal_time_from_timeline_time (GESClip * clip,
GESTrackElement * child,
GstClockTime timeline_time,
GError ** error);
GES_API
GstClockTime ges_clip_get_timeline_time_from_internal_time (GESClip * clip,
GESTrackElement * child,
GstClockTime internal_time,
GError ** error);
GES_API
GstClockTime ges_clip_get_timeline_time_from_source_frame (GESClip * clip,
GESFrameNumber frame_number,
GError ** error);
GES_API
GstClockTime ges_clip_get_duration_limit (GESClip * clip);
G_END_DECLS
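The editing and effect entry points declared above compose naturally; the following is an untested sketch of splitting a clip and stacking an effect on the remainder. The `"agingtv"` element name and `ges_effect_new()` come from the wider GES/GStreamer API, not from this header, and the error handling is abbreviated:

```c
#include <ges/ges.h>

/* Sketch: split a clip at 2s of timeline time, then add an effect at
 * the top of the original clip's effect stack (index 0). */
static void
split_and_decorate (GESClip * clip)
{
  GError *error = NULL;
  GESClip *second_half;
  GESEffect *effect;

  /* ges_clip_split_full() returns a new clip covering the part of the
   * original that lies after `position`, or NULL on error. */
  second_half = ges_clip_split_full (clip, 2 * GST_SECOND, &error);
  if (second_half == NULL) {
    g_printerr ("split failed: %s\n", error ? error->message : "unknown");
    g_clear_error (&error);
    return;
  }

  effect = ges_effect_new ("agingtv");
  if (!ges_clip_add_top_effect (clip, GES_BASE_EFFECT (effect), 0, &error)) {
    g_printerr ("could not add effect: %s\n",
        error ? error->message : "unknown");
    g_clear_error (&error);
  }
}
```

Note how the `_full` variants mirror their plain counterparts but surface a `GError` for callers that need to report why an edit was refused.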

File diff suppressed because it is too large

View file

@ -0,0 +1,52 @@
/* GStreamer Editing Services
*
* Copyright (C) <2015> Thibault Saunier <tsaunier@gnome.org>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 59 Temple Place - Suite 330,
* Boston, MA 02111-1307, USA.
*/
#pragma once
#include <glib-object.h>
#include "ges-formatter.h"
G_BEGIN_DECLS
typedef struct _GESCommandLineFormatterClass GESCommandLineFormatterClass;
typedef struct _GESCommandLineFormatter GESCommandLineFormatter;
#define GES_TYPE_COMMAND_LINE_FORMATTER (ges_command_line_formatter_get_type ())
GES_DECLARE_TYPE(CommandLineFormatter, command_line_formatter, COMMAND_LINE_FORMATTER);
struct _GESCommandLineFormatterClass
{
GESFormatterClass parent_class;
};
struct _GESCommandLineFormatter
{
GESFormatter parent_instance;
GESCommandLineFormatterPrivate *priv;
};
GES_API
gchar * ges_command_line_formatter_get_help (gint nargs, gchar ** commands);
GES_API
gchar * ges_command_line_formatter_get_timeline_uri (GESTimeline *timeline);
G_END_DECLS

1089
ges/ges-container.c Normal file

File diff suppressed because it is too large

158
ges/ges-container.h Normal file
View file

@ -0,0 +1,158 @@
/* GStreamer Editing Services
* Copyright (C) 2009 Edward Hervey <edward.hervey@collabora.co.uk>
* 2009 Nokia Corporation
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#pragma once
#include <glib-object.h>
#include <gst/gst.h>
#include <ges/ges-timeline-element.h>
#include <ges/ges-types.h>
#include <ges/ges-track.h>
G_BEGIN_DECLS
#define GES_TYPE_CONTAINER ges_container_get_type()
GES_DECLARE_TYPE(Container, container, CONTAINER);
/**
* GESChildrenControlMode:
*
 * To be used by subclasses only. This indicates how a change in a child
 * should be handled.
*/
typedef enum
{
GES_CHILDREN_UPDATE,
GES_CHILDREN_IGNORE_NOTIFIES,
GES_CHILDREN_UPDATE_OFFSETS,
GES_CHILDREN_UPDATE_ALL_VALUES,
GES_CHILDREN_LAST
} GESChildrenControlMode;
/**
* GES_CONTAINER_HEIGHT:
* @obj: a #GESContainer
*
* The #GESContainer:height of @obj.
*/
#define GES_CONTAINER_HEIGHT(obj) (((GESContainer*)obj)->height)
/**
* GES_CONTAINER_CHILDREN:
* @obj: a #GESContainer
*
* The #GList containing the children of @obj.
*/
#define GES_CONTAINER_CHILDREN(obj) (((GESContainer*)obj)->children)
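As a usage note, GES_CONTAINER_CHILDREN gives read-only access to the child list; a minimal, untested sketch of walking it (the printing is illustrative only, and the GES_TIMELINE_ELEMENT_* accessors are assumed from ges-timeline-element.h):

```c
static void
print_children (GESContainer * container)
{
  GList *l;

  /* The list is owned by the container: iterate it, but do not modify
   * or free it. */
  for (l = GES_CONTAINER_CHILDREN (container); l != NULL; l = l->next) {
    GESTimelineElement *child = GES_TIMELINE_ELEMENT (l->data);

    g_print ("%s: start=%" GST_TIME_FORMAT "\n",
        GES_TIMELINE_ELEMENT_NAME (child),
        GST_TIME_ARGS (GES_TIMELINE_ELEMENT_START (child)));
  }
}
```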
/**
* GESContainer:
* @children: (element-type GES.TimelineElement): The list of
* #GESTimelineElement-s controlled by this Container
* @height: The #GESContainer:height of @obj
*
 * Note: you may read, but should not modify, these properties.
*/
struct _GESContainer
{
GESTimelineElement parent;
/*< public > */
/*< readonly >*/
GList *children;
/* We don't add those properties to the priv struct for optimization and code
* readability purposes */
guint32 height; /* the span of priorities this object needs */
/* <protected> */
GESChildrenControlMode children_control_mode;
/*< readonly >*/
GESTimelineElement *initiated_move;
/*< private >*/
GESContainerPrivate *priv;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING_LARGE];
};
/**
* GESContainerClass:
* @child_added: Virtual method that is called right after a #GESTimelineElement is added
* @child_removed: Virtual method that is called right after a #GESTimelineElement is removed
* @remove_child: Virtual method to remove a child
* @add_child: Virtual method to add a child
* @ungroup: Virtual method to ungroup a container into a list of
* containers
* @group: Virtual method to group a list of containers together under a
* single container
* @edit: Deprecated
*/
struct _GESContainerClass
{
/*< private > */
GESTimelineElementClass parent_class;
/*< public > */
/* signals */
void (*child_added) (GESContainer *container, GESTimelineElement *element);
void (*child_removed) (GESContainer *container, GESTimelineElement *element);
gboolean (*add_child) (GESContainer *container, GESTimelineElement *element);
gboolean (*remove_child) (GESContainer *container, GESTimelineElement *element);
GList* (*ungroup) (GESContainer *container, gboolean recursive);
GESContainer * (*group) (GList *containers);
/* Deprecated and not used anymore */
gboolean (*edit) (GESContainer * container,
GList * layers, gint new_layer_priority,
GESEditMode mode,
GESEdge edge,
guint64 position);
/*< private >*/
guint grouping_priority;
/* Padding for API extension */
gpointer _ges_reserved[GES_PADDING_LARGE];
};
/* Children handling */
GES_API
GList* ges_container_get_children (GESContainer *container, gboolean recursive);
GES_API
gboolean ges_container_add (GESContainer *container, GESTimelineElement *child);
GES_API
gboolean ges_container_remove (GESContainer *container, GESTimelineElement *child);
GES_API
GList * ges_container_ungroup (GESContainer * container, gboolean recursive);
GES_API
GESContainer *ges_container_group (GList *containers);
GES_DEPRECATED_FOR(ges_timeline_element_edit)
gboolean ges_container_edit (GESContainer * container,
GList * layers, gint new_layer_priority,
GESEditMode mode,
GESEdge edge,
guint64 position);
G_END_DECLS
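The grouping entry points above can be sketched as follows (untested; it assumes the two clips already belong to the same timeline, and whether NULL is returned on failure should be checked against the API reference):

```c
#include <ges/ges.h>

/* Sketch: put two clips under a single container so they can be
 * moved as one unit. */
static GESContainer *
group_two_clips (GESClip * a, GESClip * b)
{
  GList *containers = NULL;
  GESContainer *group;

  containers = g_list_append (containers, a);
  containers = g_list_append (containers, b);

  /* ges_container_group() does not take ownership of the list itself,
   * so we free our copy afterwards. */
  group = ges_container_group (containers);
  g_list_free (containers);

  return group;
}
```

The inverse operation, ges_container_ungroup(), hands back a list of the containers that made up the group.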

Some files were not shown because too many files have changed in this diff