Hi Steve,
On Wed, 2014-09-10 at 18:17 -0700, Steve Longerbeam wrote:
[...]
[...]
>> I have in the meantime started to implement everything that has a
>> source or destination selector in the Frame Synchronization Unit (FSU)
>> as media entity. I wonder which of
[...]
> I don't really see the need for an SMFC entity. The SMFC control can
> be integrated into the CSI subdev.
Granted, this is currently a theoretical question, but could we handle
a single MIPI link that carries two or more virtual channels with
different MIPI IDs this way?
>> IC preprocessor (input to VF and ENC, if I understood correctly)
>> IC viewfinder task (scaling, csc)
>> IC encoding task
>> IC post processing task
> I see either three different IC subdev entities (IC prpenc, IC prpvf,
> IC pp), or a single IC entity with three sink pads for each IC task.

The former could work, the latter won't allow us to have pre and post
processing on separate pipelines.
>> IRT viewfinder task (rotation)
>> IRT encoding task
>> IRT post processing task
> well, the IRT is really just a submodule enable bit, I see no need
> for an IRT subdev, in fact IRT has already been folded into ipu-ic.c
> as a simple submodule enable/disable. Rotation support can be
> implemented as part of the IC entities.
My current understanding is that the IRT is strictly a mem2mem device
using its own DMA channels, which can be channel-linked to the IC (and
other blocks) in various ways.
>> VDIC (deinterlacing, combining)

> I am thinking VDIC support can be part of the IC prpvf entity (well,
> combining is not really on my radar, I haven't given that much thought).
>> (and probably some entry for DP/DC/DMFC for the direct
>> viewfinder path)

> Ugh, I've been ignoring that path as well. Freescale's BSP releases
> and sample code from their SDKs have no example code for the
> direct-to-DP/DC/DMFC camera viewfinder path, so given the quality
> of the imx TRM, this could be a challenge to implement. Have you
> gotten this path to work?
Not yet, no.
>> I suppose the SMFC channels need to be separate because they can belong
>> to different pipelines (and each entity can only belong to one).

> I see the chosen SMFC channel as an internal decision by the
> CSI subdev.

Can we handle multiple outputs from a single CSI this way?
>> The three IC task entities could probably be combined with their
>> corresponding IRT task entity somehow, but that would be at the cost of
>> not being able to tell the kernel whether to rotate before or after
>> scaling, which might be useful when handling chroma subsampled formats.

> I'm fairly sure IC rotation must always occur _after_ scaling. I.e.
> raw frames are first passed through IC prpenc/prpvf/pp for scaling/CSC,
> then EOF completion of that task is hardware linked to IRT.
There could be good reasons to do the rotation on the input side, for
example when upscaling or when the output is 4:2:2 subsampled. At least
the FSU registers suggest that channel linking the rotator before the IC
is possible. This probably won't be useful for the capture path in most
cases, but it might be for rotated playback.
>> git://git.pengutronix.de/git/pza/linux.git test/nitrogen6x-ipu-media
>> So far I've captured video through the SMFC on a Nitrogen6X board with
>> OV5652 parallel camera with this.

> Thanks Philipp, I'll take a look! Sounds like a good place to start.
> I assume this is with the video mux entity and CSI driver? I.e. no
> IC entity support yet for scaling, CSC, or rotation.
Yes, exactly.
regards
Philipp