All those files are under GFDL 1.1 or later, with no invariant sections.
Tag them as such.
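In practice, the tag is presumably an SPDX license comment near the
top of each file, along the lines of (the exact identifier below is
an assumption, not quoted from the patch):

.. SPDX-License-Identifier: GFDL-1.1-no-invariants-or-later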
Signed-off-by: Mauro Carvalho Chehab <mchehab+samsung@kernel.org>
---
This is not needed there. Also, the same UTF-8 encoding should
be used in all documents.
Signed-off-by: Mauro Carvalho Chehab <mchehab+samsung@kernel.org>
---
On several tables, the color sample location table preamble is
written as:
Color Sample Location..
instead of:
Color Sample Location:
I suspect this pattern got repeated by copy-and-paste (or perhaps
by some error during the DocBook conversion).
Anyway, fix it.
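Such a fix is mechanical; a minimal Python sketch of the substitution
(the in-place, files-on-the-command-line workflow is an assumption,
not the actual patch process):

import sys

for fname in sys.argv[1:]:
    with open(fname, encoding='utf-8') as f:
        text = f.read()
    # Replace the mistyped preamble with the intended one.
    text = text.replace('Color Sample Location..', 'Color Sample Location:')
    with open(fname, 'w', encoding='utf-8') as f:
        f.write(text)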
Signed-off-by: Mauro Carvalho Chehab <mchehab@s-opensource.com>
---
Shorten the tables by removing row numbers in comments, allowing for
later insertion of rows with minimal diffs.
All changes have been generated by the following script.
import io
import re
import sys


def process_table(fname, data):
    # The histogram documentation uses tab-indented tables; all
    # other files use space indentation.
    if fname.endswith('hist-v4l2.rst'):
        data = re.sub(u'\n{1,2}\t( ?) -( ?) ?', u'\n\t\\1 -\\2', data, flags = re.MULTILINE)
        data = re.sub(u'\n(\t| )- \.\. row [0-9]+\n\t ?-( ?) ?', u'\\1* -\\2', data, flags = re.MULTILINE)
    else:
        data = re.sub(u'\n{1,2} -( ?) ?', u'\n -\\1', data, flags = re.MULTILINE)
        data = re.sub(u'(\n?)(\n\n - \.\. row 1\n)', u'\n\\2', data, flags = re.MULTILINE)
        data = re.sub(u'\n - \.\. row [0-9]+\n -( ?) ?', u' * -\\1', data, flags = re.MULTILINE)
        data = re.sub(u'\n - \.\. row [0-9]+\n \.\. (_[A-Z0-9_`-]*:)', u'\n - .. \\1', data, flags = re.MULTILINE)
        data = re.sub(u'\n - \.\. (_[A-Z0-9_`-]*:)\n -', u' * .. \\1\n\n -', data, flags = re.MULTILINE)
    data = re.sub(u'^ - ', u' -', data, flags = re.MULTILINE)
    data = re.sub(u'^(\t{1,2}) ', u'\\1', data, flags = re.MULTILINE)
    return data


def process_file(fname, data):
    buf = io.StringIO(data)
    output = ''
    in_table = False
    table_separator = 0
    for line in buf.readlines():
        if line.find('.. flat-table::') != -1:
            # A flat-table directive starts a new table.
            in_table = True
            table = ''
        elif in_table and not re.match('^[\t\n]|( )', line):
            # A line that is neither indented nor blank ends the table.
            in_table = False
            output += process_table(fname, table)
        if in_table:
            table += line
        else:
            output += line
    if in_table:
        in_table = False
        output += process_table(fname, table)
    return output


fname = sys.argv[1]

data = open(fname, 'rb').read().decode('utf-8')
data = process_file(fname, data)
open(fname, 'wb').write(data.encode('utf-8'))
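To illustrate the transformation, a table row written in the old
numbered style (indentation approximate):

-  .. row 1

   -  first cell

   -  second cell

is rewritten into the more compact list form:

* - first cell
  - second cell

Since rows no longer carry their own numbers, inserting a row no
longer renumbers every row after it, which is what keeps later
diffs minimal.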
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
---
Those hints are wrong, and don't really improve the look
of those tables. So, keep them only where they're useful.
Signed-off-by: Mauro Carvalho Chehab <mchehab@s-opensource.com>
---
LaTeX doesn't handle auto-width tables too well, and the ReST
markup requires a special tag to give it the needed hints.
As we're using A4 paper, we have 17cm of useful space. As
most media tables already declare :widths:, let's use those to
generate the needed tabularcolumns tags via the following Perl script:
my ($line_size, $table_header, $has_cols) = (17.5, 0, 0);
my $out;
my $header = "";
my @widths = ();

sub round { $_[0] > 0 ? int($_[0] + .5) : -int(-$_[0] + .5) }

while (<>) {
    if (!$table_header) {
        $has_cols = 1 if (m/..\s+tabularcolumns::/);
        if (m/..\s+flat-table::/) {
            $table_header = 1;
            $header = $_;
            next;
        }
        $out .= $_;
        next;
    }

    # Accumulate the table header and pick up the :widths: values.
    $header .= $_;
    @widths = split(/ /, $1) if (m/:widths:\s+(.*)/);

    # A blank line ends the table header; emit a tabularcolumns tag
    # if the table doesn't already have one.
    if (m/^\n$/) {
        if (!$has_cols && @widths) {
            my ($tot, $t, $i) = (0, 0, 0);
            foreach my $v(@widths) { $tot += $v; };
            $out .= ".. tabularcolumns:: |";
            for ($i = 0; $i < scalar @widths - 1; $i++) {
                my $v = $widths[$i];
                my $w = round(10 * ($v * $line_size) / $tot) / 10;
                $out .= sprintf "p{%.1fcm}|", $w;
                $t += $w;
            }
            my $w = $line_size - $t;
            $out .= sprintf "p{%.1fcm}|\n\n", $w;
        }
        $out .= $header;
        $table_header = 0;
        $has_cols = 0;
        $header = "";
        @widths = ();
    }
}
print $out;
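As an example of what the script emits: with $line_size = 17.5 as set
above, a table declaring ":widths: 1 1 2" and no explicit
tabularcolumns would gain the tag:

.. tabularcolumns:: |p{4.4cm}|p{4.4cm}|p{8.7cm}|

All columns but the last are rounded to a tenth of a centimetre; the
last column gets whatever is left, so the widths always add up to the
full line size.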
Signed-off-by: Mauro Carvalho Chehab <mchehab@s-opensource.com>
---
The name of the subsystem is "media", not "linux_tv". Also,
as we plan to add other things there in the future, let's
also rename the media uAPI book to media_uapi, to make it
clearer.
No functional changes.
Signed-off-by: Mauro Carvalho Chehab <mchehab@s-opensource.com>