author    Dmitry Torokhov <dmitry.torokhov@gmail.com>  2023-03-17 14:01:30 +0300
committer Dmitry Torokhov <dmitry.torokhov@gmail.com>  2023-03-17 14:01:30 +0300
commit    d26a3a6ce7e02f9c056ad992bcd9624735022337 (patch)
tree      9df8aeaceed50bf65d01172c67f67035c9fa59ef /Documentation/sound/kernel-api/writing-an-alsa-driver.rst
parent    007e50eb5dbe7b33a43a1449a0d9c29e8dcf1c67 (diff)
parent    eeac8ede17557680855031c6f305ece2378af326 (diff)
Merge tag 'v6.3-rc2' into next
Merge with mainline to get of_property_present() and other newer APIs.
Diffstat (limited to 'Documentation/sound/kernel-api/writing-an-alsa-driver.rst')
-rw-r--r--  Documentation/sound/kernel-api/writing-an-alsa-driver.rst | 10
1 file changed, 5 insertions, 5 deletions
diff --git a/Documentation/sound/kernel-api/writing-an-alsa-driver.rst b/Documentation/sound/kernel-api/writing-an-alsa-driver.rst
index 07a620c5ca74..5c9523b7d55c 100644
--- a/Documentation/sound/kernel-api/writing-an-alsa-driver.rst
+++ b/Documentation/sound/kernel-api/writing-an-alsa-driver.rst
@@ -1720,16 +1720,16 @@ Typically, you'll have a hardware descriptor as below:
- ``rate_min`` and ``rate_max`` define the minimum and maximum sample
rate. This should correspond somehow to ``rates`` bits.
-- ``channel_min`` and ``channel_max`` define, as you might already
+- ``channels_min`` and ``channels_max`` define, as you might already
expected, the minimum and maximum number of channels.
- ``buffer_bytes_max`` defines the maximum buffer size in
bytes. There is no ``buffer_bytes_min`` field, since it can be
calculated from the minimum period size and the minimum number of
- periods. Meanwhile, ``period_bytes_min`` and define the minimum and
- maximum size of the period in bytes. ``periods_max`` and
- ``periods_min`` define the maximum and minimum number of periods in
- the buffer.
+ periods. Meanwhile, ``period_bytes_min`` and ``period_bytes_max``
+ define the minimum and maximum size of the period in bytes.
+ ``periods_max`` and ``periods_min`` define the maximum and minimum
+ number of periods in the buffer.
The “period” is a term that corresponds to a fragment in the OSS
world. The period defines the size at which a PCM interrupt is
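
For context, the hunk above amends the prose that follows the hardware
descriptor example in the file (the hunk header quotes it: "Typically,
you'll have a hardware descriptor as below"). A minimal sketch of such a
descriptor, using the field names as corrected by this patch and with
purely illustrative limits for a hypothetical chip doing 16-bit stereo
at 8-48 kHz, could look like::

    #include <sound/pcm.h>

    /* Illustrative hardware descriptor for a hypothetical chip;
     * the numeric limits below are example values only. */
    static const struct snd_pcm_hardware snd_mychip_playback_hw = {
            .info = (SNDRV_PCM_INFO_MMAP |
                     SNDRV_PCM_INFO_INTERLEAVED |
                     SNDRV_PCM_INFO_BLOCK_TRANSFER |
                     SNDRV_PCM_INFO_MMAP_VALID),
            .formats =          SNDRV_PCM_FMTBIT_S16_LE,
            .rates =            SNDRV_PCM_RATE_8000_48000,
            .rate_min =         8000,
            .rate_max =         48000,
            /* fields whose description this patch fixes: */
            .channels_min =     2,
            .channels_max =     2,
            .buffer_bytes_max = 32768,
            .period_bytes_min = 4096,
            .period_bytes_max = 32768,
            .periods_min =      1,
            .periods_max =      1024,
    };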