mirror of https://gitlab.freedesktop.org/gstreamer/gstreamer.git
synced 2024-06-08 09:09:26 +00:00
doc: remove section tensor storage
This commit is contained in:
parent 0ec825dbb2
commit 019f7493d7
@ -218,7 +218,7 @@ re-usable without the upstream model but they bypass the need for
tensor-decoding and are very efficient. Another variation is that multiple
models are merged into one model, removing the need for multi-level inference,
but again, this is a design decision involving compromises on re-usability,
performance and effort. We aim to provide support for all these use cases,
and to allow the analytics pipeline designer to make the best design decisions based
on their specific context.
@ -251,16 +251,6 @@ specific to machine-learning techniques and can also be used to store analysis
results from computer-vision, heuristics or other techniques. It can be used as
a bridge between different techniques.
##### Storing Tensors Into Analytics Meta
To be able to describe analytics results more precisely, consider an analytics pipeline
where the output tensor of the first inference stage is pushed directly, without tensor decoding,
into a second inference stage. It would be useful to store those tensors using
analytics-meta, because we could then express the relation between the tensor of the first
inference and the tensor of the second inference. With this relation description, a
tensor-decoder for the second inference would be able to retrieve the associated tensor
of the first inference and extract potentially useful information that is not
available in the tensor of the second inference.
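The lookup described above can be sketched as follows. This is a minimal illustration of the idea, not the GStreamer analytics-meta API: `TensorEntry`, `RelationStore`, and the stage names are all hypothetical stand-ins.

```python
from dataclasses import dataclass, field

@dataclass
class TensorEntry:
    """A stored tensor with an id (hypothetical stand-in for analytics-meta)."""
    tensor_id: int
    stage: str   # e.g. "first-inference" or "second-inference"
    data: list

@dataclass
class RelationStore:
    """Stores tensors plus directed relations between them."""
    tensors: dict = field(default_factory=dict)
    relations: list = field(default_factory=list)  # (from_id, to_id) pairs

    def add(self, entry):
        self.tensors[entry.tensor_id] = entry

    def relate(self, from_id, to_id):
        # Record that the second-stage tensor derives from the first-stage one.
        self.relations.append((from_id, to_id))

    def related_to(self, tensor_id):
        # A second-stage tensor-decoder uses this to retrieve the
        # associated first-stage tensor(s).
        return [self.tensors[a] for (a, b) in self.relations if b == tensor_id]

store = RelationStore()
store.add(TensorEntry(1, "first-inference", [0.1, 0.9]))
store.add(TensorEntry(2, "second-inference", [0.7]))
store.relate(from_id=1, to_id=2)

first = store.related_to(2)[0]
print(first.stage)  # first-inference
```

With the relation recorded alongside the tensors, the decoder of the second stage never needs to know how the first stage produced its output; it only follows the relation.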
### Semantically-Agnostic Tensor Processing
Not all tensor processing is model-dependent. Sometimes the processing can be
applied uniformly to all of a tensor's values, for example normalization, range
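Such a uniform operation can be sketched as a simple min-max normalization applied to every value of a tensor, independent of what the values mean. This is an illustrative sketch, not a GStreamer element:

```python
def normalize(values, lo=0.0, hi=1.0):
    """Min-max normalize all values into [lo, hi], ignoring their semantics."""
    vmin, vmax = min(values), max(values)
    if vmax == vmin:
        return [lo for _ in values]  # constant tensor: map everything to lo
    scale = (hi - lo) / (vmax - vmin)
    return [lo + (v - vmin) * scale for v in values]

print(normalize([2.0, 4.0, 6.0]))  # [0.0, 0.5, 1.0]
```

Because the operation needs no knowledge of the model that produced the tensor, the same element can be reused in front of any inference stage.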