Author: 钟来

Initial commit

GNU GENERAL PUBLIC LICENSE
Version 2, June 1991
Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users. This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it. (Some other Free Software Foundation software is covered by
the GNU Lesser General Public License instead.) You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.
To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have. You must make sure that they, too, receive or can get the
source code. And you must show them these terms so they know their
rights.
We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.
Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.
Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary. To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.
The precise terms and conditions for copying, distribution and
modification follow.
GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License. The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language. (Hereinafter, translation is included without limitation in
the term "modification".) Each licensee is addressed as "you".
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.
1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.
You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.
2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) You must cause the modified files to carry prominent notices
stating that you changed the files and the date of any change.
b) You must cause any work that you distribute or publish, that in
whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License.
c) If the modified program normally reads commands interactively
when run, you must cause it, when started running for such
interactive use in the most ordinary way, to print or display an
announcement including an appropriate copyright notice and a
notice that there is no warranty (or else, saying that you provide
a warranty) and that users may redistribute the program under
these conditions, and telling the user how to view a copy of this
License. (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on
the Program is not required to print an announcement.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.
In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:
a) Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,
b) Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange; or,
c) Accompany it with the information you received as to the offer
to distribute corresponding source code. (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form with such
an offer, in accord with Subsection b above.)
The source code for a work means the preferred form of the work for
making modifications to it. For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable. However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.
If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.
4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License. Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.
5. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.
6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.
7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all. For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.
If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded. In such case, this License incorporates
the limitation as if written in the body of this License.
9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation. If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.
10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission. For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this. Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.
NO WARRANTY
11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.
12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License along
with this program; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
Also add information on how to contact you by electronic and paper mail.
If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:
Gnomovision version 69, Copyright (C) year name of author
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, the commands you use may
be called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the program
`Gnomovision' (which makes passes at compilers) written by James Hacker.
<signature of Ty Coon>, 1 April 1989
Ty Coon, President of Vice
This General Public License does not permit incorporating your program into
proprietary programs. If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License.
... ...
## RTSPtoHTTP-FLV: a streaming service built with JavaCV that converts RTSP streams to HTTP-FLV (RTMP is no longer recommended) and pushes them
**Please star!**
#### For questions and help, please open an issue first so that others who hit the same problem can find the solution easily; please avoid asking directly over WeChat/QQ. For business cooperation, email banmajio@163.com or get in touch via WeChat/QQ.
### Major browsers no longer support Flash, so HTTP-FLV is recommended as a replacement for RTMP.
>[Reference](https://blog.csdn.net/weixin_40777510/article/details/106693408)
>Only the spot in this project's controller where the RTMP address is generated needs to change to produce an HTTP-FLV address instead. HTTP-FLV URL rules may differ between streaming servers, so build the address according to the server you choose; see the sketch below.
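As a rough sketch of that change (illustrative only, not code from this repository; the `httpFlvUrl` helper and the URL layout are assumptions that must match the streaming server you deploy, e.g. the `?port=...&app=...&stream=...` convention of nginx-http-flv-module):

```java
// Hypothetical helper, not part of this repository: builds the HTTP-FLV playback
// URL that would replace the rtmp:// URL generated in CameraController.openStream().
public class FlvUrl {
    static String httpFlvUrl(String host, int httpPort, String app, String token) {
        // nginx-http-flv-module style: http://host:port/live?port=<rtmp port>&app=<app>&stream=<stream>
        return "http://" + host + ":" + httpPort + "/live?port=1935&app=" + app + "&stream=" + token;
    }

    public static void main(String[] args) {
        System.out.println(httpFlvUrl("127.0.0.1", 8080, "live", "some-token"));
    }
}
```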
>**Personal blog: [banmajio's blog](https://www.banmajio.com/)**
>**CSDN blog: [banmajio's csdn](https://blog.csdn.net/weixin_40777510)**
>**Gitee: [RTSPtoRTMP](https://gitee.com/banmajio/RTSPtoRTMP)**
### Converts the RTSP stream of any H.264-encoded surveillance device to RTMP (only the RTSP URL construction in the controller needs to be adapted)
**API usage: [API documentation](https://github.com/banmajio/RTSPtoRTMP/wiki/%E6%8E%A5%E5%8F%A3%E6%96%87%E6%A1%A3)**
#### Note:
Some of the handling in this project was added to meet the needs of an in-house project; to adapt or extend it, build on top of the existing code or strip parts away. The core logic lives in the CameraPush.java class.
#### Download link for the nginx server to use with this project:
[http://cdn.banmajio.com/nginx.rar](http://cdn.banmajio.com/nginx.rar)
After downloading, unpack the archive and start nginx by running nginx.exe (the console window flashing and closing is normal; check Task Manager for an nginx process, which indicates a successful start). The nginx configuration file is conf/nginx.conf; adjust it as needed. The RTMP addresses generated by this project follow this configuration.
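For orientation, a minimal nginx-rtmp sketch of what conf/nginx.conf needs to contain (not the bundled file itself; the application names match the /live and /history paths generated by CameraController, and the port is an assumption):

```nginx
rtmp {
    server {
        listen 1935;            # port referenced by the generated rtmp:// URLs
        application live {      # live streams: rtmp://host:1935/live/<token>
            live on;
        }
        application history {   # playback streams: rtmp://host:1935/history/<token>
            live on;
        }
    }
}
```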
### Known issues:
1. Some devices or NVRs report insufficient bandwidth during history playback; the exact cause is unclear. If an RTSP URL carrying timestamp parameters produces errors or fails to play during history playback, consider secondary development with the vendor SDK: capture the stream data in its callback and push it to RTMP yourself.
>**Background on this problem:** [453 Not Enough Bandwidth when doing history playback over RTSP with starttime and endtime](https://blog.csdn.net/weixin_40777510/article/details/106802234)
2. The history-playback problem above has since been addressed by integrating the Hikvision SDK: the stream data delivered by the SDK callback is processed and pushed to RTMP directly.
>**Implementation notes:** [Remuxing the stream captured by the Hikvision SDK into RTMP with JavaCV (PS stream to RTMP)](https://blog.csdn.net/weixin_40777510/article/details/105840823)
>**For the project setup walkthrough, see my post: [Remuxing RTSP to RTMP with FFmpeg (no transcoding, low resource usage)](https://www.banmajio.com/post/638986b0.html#more)**
>**Problems encountered during development and their solutions are published on the CSDN blog: [banmajio csdn](https://blog.csdn.net/weixin_40777510)**
**Thanks to [nn200433](https://github.com/nn200433) for supporting this project; see the commits on the rp branch for details.**
### Tips are welcome
<img src="https://images.gitee.com/uploads/images/2020/0421/174552_a862b4ed_5186477.jpeg" width="200px" />
<img src="https://images.gitee.com/uploads/images/2020/0421/174726_cb99c1d6_5186477.jpeg" width="200px" />
... ...
# camera-rtmp
# Common
## Get service status
### Basic information
**Path:** /status
**Method:** GET
**Description:**
Returns the current service uptime together with the keep-alive duration, push IP, push port, and the maximum bitrates of the device main and sub streams.
### Request parameters
### Response
| Name | Type | Required | Notes |
| --- | --- | --- | --- |
| uptime | string | yes | service uptime |
| config | object | yes | configuration parameters |
| ├─ keepalive | string | yes | keep-alive duration (minutes) |
| ├─ push_host | string | yes | push IP |
| ├─ host_extra | string | yes | external (extra) host |
| ├─ push_port | string | yes | push port |
| ├─ main_code | string | yes | maximum bitrate of the device main stream |
| ├─ sub_code | string | yes | maximum bitrate of the device sub stream |
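A sample response (all values illustrative; note that the serialized Config bean also carries a version field that the table above does not list):

```json
{
    "uptime": "1h20m30s",
    "config": {
        "keepalive": "5",
        "push_host": "127.0.0.1",
        "host_extra": "127.0.0.1",
        "push_port": "1935",
        "main_code": "5120",
        "sub_code": "1024",
        "version": "1.0"
    }
}
```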
## List active streams
### Basic information
**Path:** /cameras
**Method:** GET
**Description:**
Returns the devices that are currently being pushed.
### Request parameters
### Response
| Name | Type | Required | Notes |
| --- | --- | --- | --- |
| (array item) | object [] | no | item type: object |
| ├─ ip | string | yes | device IP |
| ├─ username | string | yes | device username |
| ├─ password | string | yes | device password |
| ├─ channel | string | yes | channel number |
| ├─ stream | string | yes | stream type (not returned for playback streams) |
| ├─ rtsp | string | yes | pull (RTSP) address |
| ├─ rtmp | string | yes | push (RTMP) address |
| ├─ url | string | yes | playback address |
| ├─ startTime | string | no | start time (absent for live streams) |
| ├─ endTime | string | no | end time (absent for live streams) |
| ├─ openTime | string | yes | time the stream was opened |
| ├─ count | string | yes | viewer count |
| ├─ token | string | yes | token |
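A sample response (values illustrative). The table documents an array, but the current CameraController implementation returns the cache map keyed by token; the element shape is the same:

```json
{
    "b24f5d30-...": {
        "ip": "192.168.1.64",
        "username": "admin",
        "password": "******",
        "channel": "1",
        "stream": "main",
        "rtsp": "rtsp://admin:******@192.168.1.64:554/h264/ch1/main/av_stream",
        "rtmp": "rtmp://127.0.0.1:1935/live/b24f5d30-...",
        "url": "rtmp://127.0.0.1:1935/live/b24f5d30-...",
        "opentime": "2020-04-21 17:45:52",
        "count": 1,
        "token": "b24f5d30-..."
    }
}
```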
## Open a stream
### Basic information
**Path:** /cameras
**Method:** POST
**Description:**
Converts an RTSP stream to RTMP and pushes it, based on the request body. (When opening a playback stream, if the device is already pushing, the service replies that the video is currently in use.)
### Request parameters
**Headers**
| Name | Value | Required | Example | Notes |
| ------------ | ------------ | ------------ | ------------ | ------------ |
| Content-Type | application/json | yes | | |
**Body**
| Name | Type | Required | Notes |
| --- | --- | --- | --- |
| ip | string | yes | device IP |
| username | string | yes | device username |
| password | string | yes | device password |
| channel | string | yes | channel number |
| stream | string | yes | stream type (required for live streams; omit for playback) |
| startTime | string | no | start time (omit for live streams) |
| endTime | string | no | end time (omit for live streams) |
### Response
| Name | Type | Required | Notes |
| --- | --- | --- | --- |
| token | string | yes | token |
| uri | string | yes | push address |
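A sample exchange for a live stream (values illustrative; note that the implementation in CameraController actually returns url, token, msg and code rather than the token/uri pair listed above):

Request:
```json
{
    "ip": "192.168.1.64",
    "username": "admin",
    "password": "******",
    "channel": "1",
    "stream": "main"
}
```

Response:
```json
{
    "url": "rtmp://127.0.0.1:1935/live/b24f5d30-...",
    "token": "b24f5d30-...",
    "msg": "stream opened successfully",
    "code": 0
}
```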
## Close a stream
### Basic information
**Path:** /cameras/:tokens
**Method:** DELETE
**Description:**
Stops a running push job.
### Request parameters
**Path parameters**
| Name | Example | Notes |
| ------------ | ------------ | ------------ |
| tokens | | token(s); multiple tokens may be comma-separated |
## Keep a stream alive
### Basic information
**Path:** /cameras/:tokens
**Method:** PUT
**Description:**
Refreshes the keep-alive timer of a stream that is currently being pushed.
### Request parameters
**Path parameters**
| Name | Example | Notes |
| ------------ | ------------ | ------------ |
| tokens | | token(s); multiple tokens may be comma-separated |
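A hedged example session for these two endpoints (host and port assume the Spring Boot default; the token comes from the open call):

```bash
# keep one or more pushed streams alive (tokens comma-separated)
curl -X PUT http://127.0.0.1:8080/cameras/<token1>,<token2>
# close the streams
curl -X DELETE http://127.0.0.1:8080/cameras/<token1>,<token2>
```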
... ...
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.2.2.RELEASE</version>
<relativePath /> <!-- lookup parent from repository -->
</parent>
<groupId>com.junction</groupId>
<artifactId>CameraServer</artifactId>
<version>${COMMIT_REV}.${BUILD_DATE}</version>
<name>CameraServer</name>
<description>Demo project for Spring Boot</description>
<properties>
<java.version>1.8</java.version>
<version>${COMMIT_REV}.${BUILD_DATE}</version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.junit.vintage</groupId>
<artifactId>junit-vintage-engine</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.bytedeco</groupId>
<artifactId>javacv</artifactId>
<version>1.5.3</version>
</dependency>
<dependency>
<groupId>org.bytedeco</groupId>
<artifactId>ffmpeg-platform</artifactId>
<version>4.2.2-1.5.3</version>
</dependency>
<!-- support for the @ConfigurationProperties annotation -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-configuration-processor</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>fastjson</artifactId>
<version>1.2.68</version>
</dependency>
</dependencies>
<build>
<finalName>rtsp_rtmp</finalName><!-- name the packaged artifact rtsp_rtmp.jar -->
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
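A build sketch (standard Maven usage; the COMMIT_REV and BUILD_DATE placeholders referenced by the version must be supplied, for example as system properties, and the values below are illustrative):

```bash
mvn clean package -DCOMMIT_REV=1.0 -DBUILD_DATE=20200421
# the spring-boot plugin writes target/rtsp_rtmp.jar per the finalName setting
```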
... ...
package com.junction;
import java.util.Date;
import java.util.Set;
import javax.annotation.PreDestroy;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ApplicationContext;
import com.junction.cache.CacheUtil;
import com.junction.controller.CameraController;
import com.junction.push.CameraPush;
import com.junction.thread.CameraThread;
import com.junction.timer.CameraTimer;
@SpringBootApplication
public class CameraServerApplication {
private final static Logger logger = LoggerFactory.getLogger(CameraServerApplication.class);
public static void main(String[] args) {
// Call FFmpegFrameGrabber.tryLoad() and FFmpegFrameRecorder.tryLoad() at startup so the native libraries are loaded before the first push request.
try {
FFmpegFrameGrabber.tryLoad();
FFmpegFrameRecorder.tryLoad();
} catch (org.bytedeco.javacv.FrameRecorder.Exception e) {
e.printStackTrace();
} catch (Exception e) {
e.printStackTrace();
}
// record the service start time in the cache
CacheUtil.STARTTIME = new Date().getTime();
final ApplicationContext applicationContext = SpringApplication.run(CameraServerApplication.class, args);
// hand the application context to CameraPush so it can read the Config bean
CameraPush.setApplicationContext(applicationContext);
}
@PreDestroy
public void destroy() {
logger.info("Service stopping; releasing resources...");
// stop all running push jobs
Set<String> keys = CameraController.JOBMAP.keySet();
for (String key : keys) {
CameraController.JOBMAP.get(key).setInterrupted(key);
}
// shut down the thread pool
CameraThread.MyRunnable.es.shutdown();
// cancel the timer
CameraTimer.timer.cancel();
}
}
... ...
package com.junction;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.web.servlet.support.SpringBootServletInitializer;
public class ServletInitializer extends SpringBootServletInitializer {
@Override
protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
return application.sources(CameraServerApplication.class);
}
}
... ...
package com.junction.cache;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import com.junction.pojo.CameraPojo;
import com.junction.push.CameraPush;
/**
* @Title CacheUtil.java
* @description cache for active stream pushes
* @time 2019-12-17 15:12:45
* @author wuguodong
**/
public final class CacheUtil {
/*
* streams that are currently being pushed, keyed by token
*/
public static Map<String, CameraPojo> STREATMAP = new ConcurrentHashMap<String, CameraPojo>();
/*
* active CameraPush instances, keyed by token
*/
public static Map<String, CameraPush> PUSHMAP = new ConcurrentHashMap<>();
/*
* service start time
*/
public static long STARTTIME;
}
... ...
package com.junction.controller;
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;
import java.util.UUID;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;
import com.alibaba.fastjson.JSONObject;
import com.junction.cache.CacheUtil;
import com.junction.pojo.CameraPojo;
import com.junction.pojo.Config;
import com.junction.thread.CameraThread;
import com.junction.util.Utils;
/**
* @Title CameraController.java
* @description controller
* @time 2019-12-16 09:00:27
* @author wuguodong
**/
@RestController
public class CameraController {
private final static Logger logger = LoggerFactory.getLogger(CameraController.class);
@Autowired
public Config config;// configuration bean
// running push jobs, keyed by token
public static Map<String, CameraThread.MyRunnable> JOBMAP = new HashMap<String, CameraThread.MyRunnable>();
/**
* @Title: openCamera
* @Description: open a video stream
* @param ip
* @param username
* @param password
* @param channel channel number
* @param stream stream type
* @param starttime
* @param endtime
* @return Map<String,String>
**/
@RequestMapping(value = "/cameras", method = RequestMethod.POST)
public Map<String, Object> openCamera(@RequestBody CameraPojo pojo) {
// response map
Map<String, Object> map = new LinkedHashMap<String, Object>();
// result returned by openStream
Map<String, Object> openMap = new HashMap<>();
JSONObject cameraJson = JSONObject.parseObject(JSONObject.toJSON(pojo).toString());
// parameters that must be non-empty
String[] isNullArr = { "ip", "username", "password", "channel", "stream" };
// null/empty check
if (!Utils.isNullParameters(cameraJson, isNullArr)) {
map.put("msg", "incomplete input parameters");
map.put("code", 1);
return map;
}
// validate the ip format
if (!Utils.isTrueIp(pojo.getIp())) {
map.put("msg", "malformed ip");
map.put("code", 2);
return map;
}
if (null != pojo.getStarttime() && !"".equals(pojo.getStarttime())) {
// validate the start time
if (!Utils.isTrueTime(pojo.getStarttime())) {
map.put("msg", "malformed starttime");
map.put("code", 3);
return map;
}
if (null != pojo.getEndtime() && !"".equals(pojo.getEndtime())) {
if (!Utils.isTrueTime(pojo.getEndtime())) {
map.put("msg", "malformed endtime");
map.put("code", 4);
return map;
}
// the end time must be later than the start time
try {
long starttime = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(pojo.getStarttime()).getTime();
long endtime = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(pojo.getEndtime()).getTime();
if (endtime - starttime < 0) {
map.put("msg", "endtime must be later than starttime");
map.put("code", 5);
return map;
}
} catch (ParseException e) {
logger.error(e.getMessage());
}
}
}
CameraPojo cameraPojo = new CameraPojo();
// current time
String opentime = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date().getTime());
Set<String> keys = CacheUtil.STREATMAP.keySet();
// is anything already being pushed?
if (0 == keys.size()) {
// start a new push
openMap = openStream(pojo.getIp(), pojo.getUsername(), pojo.getPassword(), pojo.getChannel(),
pojo.getStream(), pojo.getStarttime(), pojo.getEndtime(), opentime);
if (Integer.parseInt(openMap.get("errorcode").toString()) == 0) {
map.put("url", ((CameraPojo) openMap.get("pojo")).getUrl());
map.put("token", ((CameraPojo) openMap.get("pojo")).getToken());
map.put("msg", "打开视频流成功");
map.put("code", 0);
} else {
map.put("msg", openMap.get("message"));
map.put("code", openMap.get("errorcode"));
}
} else {
// whether a matching stream already exists; false: no; true: yes
boolean sign = false;
if (null == pojo.getStarttime()) {// live stream
for (String key : keys) {
if (pojo.getIp().equals(CacheUtil.STREATMAP.get(key).getIp())
&& pojo.getChannel().equals(CacheUtil.STREATMAP.get(key).getChannel())
&& null == CacheUtil.STREATMAP.get(key).getStarttime()) {// a live stream already exists
cameraPojo = CacheUtil.STREATMAP.get(key);
sign = true;
break;
}
}
if (sign) {// reuse the existing stream
cameraPojo.setCount(cameraPojo.getCount() + 1);
cameraPojo.setOpentime(opentime);
map.put("url", cameraPojo.getUrl());
map.put("token", cameraPojo.getToken());
map.put("msg", "打开视频流成功");
map.put("code", 0);
} else {
openMap = openStream(pojo.getIp(), pojo.getUsername(), pojo.getPassword(), pojo.getChannel(),
pojo.getStream(), pojo.getStarttime(), pojo.getEndtime(), opentime);
if (Integer.parseInt(openMap.get("errorcode").toString()) == 0) {
map.put("url", ((CameraPojo) openMap.get("pojo")).getUrl());
map.put("token", ((CameraPojo) openMap.get("pojo")).getToken());
map.put("msg", "打开视频流成功");
map.put("code", 0);
} else {
map.put("msg", openMap.get("message"));
map.put("code", openMap.get("errorcode"));
}
}
} else {// history (playback) stream
for (String key : keys) {
if (pojo.getIp().equals(CacheUtil.STREATMAP.get(key).getIp())
&& CacheUtil.STREATMAP.get(key).getStarttime() != null) {// a playback stream already exists
sign = true;
cameraPojo = CacheUtil.STREATMAP.get(key);
break;
}
}
if (sign && cameraPojo.getCount() == 0) {
map.put("msg", "the device is finishing a playback session, please try again later");
map.put("code", 9);
} else if (sign && cameraPojo.getCount() != 0) {
map.put("msg", "the device is already in playback, please try again later");
map.put("code", 8);
} else {
openMap = openStream(pojo.getIp(), pojo.getUsername(), pojo.getPassword(), pojo.getChannel(),
pojo.getStream(), pojo.getStarttime(), pojo.getEndtime(), opentime);
if (Integer.parseInt(openMap.get("errorcode").toString()) == 0) {
map.put("url", ((CameraPojo) openMap.get("pojo")).getUrl());
map.put("token", ((CameraPojo) openMap.get("pojo")).getToken());
map.put("msg", "打开视频流成功");
map.put("code", 0);
} else {
map.put("msg", openMap.get("message"));
map.put("code", openMap.get("errorcode"));
}
}
}
}
return map;
}
/**
* @Title: openStream
* @Description: build the stream addresses and start a push job
* @param ip
* @param username
* @param password
* @param channel
* @param stream
* @param starttime
* @param endtime
* @param opentime
* @return Map<String, Object>
* @throws IOException
**/
private Map<String, Object> openStream(String ip, String username, String password, String channel, String stream,
String starttime, String endtime, String opentime) {
Map<String, Object> map = new HashMap<>();
CameraPojo cameraPojo = new CameraPojo();
// generate a token
String token = UUID.randomUUID().toString();
String rtsp = "";
String rtmp = "";
String IP = Utils.IpConvert(ip);
String url = "";
boolean sign = false;// whether this NVR is already in playback; true: yes; false: no
// history (playback) stream
if (null != starttime && !"".equals(starttime)) {
if (null != endtime && !"".equals(endtime)) {
rtsp = "rtsp://" + username + ":" + password + "@" + IP + ":554/Streaming/tracks/"
+ (Integer.valueOf(channel) - 32) + "01?starttime=" + Utils.getTime(starttime).substring(0, 8)
+ "t" + Utils.getTime(starttime).substring(8) + "z'&'endtime="
+ Utils.getTime(endtime).substring(0, 8) + "t" + Utils.getTime(endtime).substring(8) + "z";
cameraPojo.setStarttime(Utils.getTime(starttime));
cameraPojo.setEndTime(Utils.getTime(endtime));
} else {
String startTime = Utils.getStarttime(starttime);
String endTime = Utils.getEndtime(starttime);
rtsp = "rtsp://" + username + ":" + password + "@" + IP + ":554/Streaming/tracks/"
+ (Integer.valueOf(channel) - 32) + "01?starttime=" + startTime.substring(0, 8) + "t"
+ startTime.substring(8) + "z'&'endtime=" + endTime.substring(0, 8) + "t" + endTime.substring(8)
+ "z";
cameraPojo.setStarttime(Utils.getStarttime(starttime));
cameraPojo.setEndTime(Utils.getEndtime(starttime));
}
// rtmp = "rtmp://" + Utils.IpConvert(config.getPush_host()) + ":" + config.getPush_port() + "/history/"
// + token;
rtmp = "rtmp://" + Utils.IpConvert(config.getPush_host()) + ":" + config.getPush_port() + "/history/test";
if (config.getHost_extra().equals("127.0.0.1")) {
url = rtmp;
} else {
url = "rtmp://" + Utils.IpConvert(config.getHost_extra()) + ":" + config.getPush_port() + "/history/"
+ token;
}
} else {// live stream
rtsp = "rtsp://" + username + ":" + password + "@" + IP + ":554/h264/ch" + channel + "/" + stream
+ "/av_stream";
rtmp = "rtmp://" + Utils.IpConvert(config.getPush_host()) + ":" + config.getPush_port() + "/live/" + token;
if (config.getHost_extra().equals("127.0.0.1")) {
url = rtmp;
} else {
url = "rtmp://" + Utils.IpConvert(config.getHost_extra()) + ":" + config.getPush_port() + "/live/"
+ token;
}
}
cameraPojo.setUsername(username);
cameraPojo.setPassword(password);
cameraPojo.setIp(IP);
cameraPojo.setChannel(channel);
cameraPojo.setStream(stream);
cameraPojo.setRtsp(rtsp);
cameraPojo.setRtmp(rtmp);
cameraPojo.setUrl(url);
cameraPojo.setOpentime(opentime);
cameraPojo.setCount(1);
cameraPojo.setToken(token);
// when the ip is wrong, grabber.start() can block and never release the grabber, stalling later pushes, so probe with plain TCP sockets first
Socket rtspSocket = new Socket();
Socket rtmpSocket = new Socket();
// open TCP connections with a 1s timeout; continue on success, otherwise return
try {
rtspSocket.connect(new InetSocketAddress(cameraPojo.getIp(), 554), 1000);
} catch (IOException e) {
logger.error("与拉流IP: " + cameraPojo.getIp() + " 端口: 554 建立TCP连接失败!");
map.put("pojo", cameraPojo);
map.put("errorcode", 6);
map.put("message", "与拉流IP: " + cameraPojo.getIp() + " 端口: 554 建立TCP连接失败!");
return map;
}
try {
rtmpSocket.connect(new InetSocketAddress(Utils.IpConvert(config.getPush_host()),
Integer.parseInt(config.getPush_port())), 1000);
} catch (IOException e) {
logger.error("与推流IP: " + config.getPush_host() + " 端口: " + config.getPush_port() + " 建立TCP连接失败!");
map.put("pojo", cameraPojo);
map.put("errorcode", 7);
map.put("message",
"与推流IP:" + config.getPush_host() + " 端口: " + config.getPush_port() + " 建立连接失败,请检查nginx服务");
return map;
}
// submit the push job
CameraThread.MyRunnable job = new CameraThread.MyRunnable(cameraPojo);
CameraThread.MyRunnable.es.execute(job);
JOBMAP.put(token, job);
map.put("pojo", cameraPojo);
map.put("errorcode", 0);
map.put("message", "打开视频流成功");
return map;
}
/**
* @Title: closeCamera
* @Description: close a video stream
* @param tokens
* @return void
**/
@RequestMapping(value = "/cameras/{tokens}", method = RequestMethod.DELETE)
public void closeCamera(@PathVariable("tokens") String tokens) {
if (null != tokens && !"".equals(tokens)) {
String[] tokenArr = tokens.split(",");
for (String token : tokenArr) {
if (JOBMAP.containsKey(token) && CacheUtil.STREATMAP.containsKey(token)) {
// manual close of a playback stream
if (null != CacheUtil.STREATMAP.get(token).getStarttime()) {
if (0 == CacheUtil.STREATMAP.get(token).getCount() - 1) {
CacheUtil.PUSHMAP.get(token).setExitcode(1);
CacheUtil.STREATMAP.get(token).setCount(CacheUtil.STREATMAP.get(token).getCount() - 1);
} else {
CacheUtil.STREATMAP.get(token).setCount(CacheUtil.STREATMAP.get(token).getCount() - 1);
logger.info("当前设备正在进行回放,使用人数为" + CacheUtil.STREATMAP.get(token).getCount() + " 设备信息:[ip:"
+ CacheUtil.STREATMAP.get(token).getIp() + " channel:"
+ CacheUtil.STREATMAP.get(token).getChannel() + " stream:"
+ CacheUtil.STREATMAP.get(token).getStream() + " statrtime:"
+ CacheUtil.STREATMAP.get(token).getStream() + " endtime:"
+ CacheUtil.STREATMAP.get(token).getEndtime() + " url:"
+ CacheUtil.STREATMAP.get(token).getUrl() + "]");
}
} else {
if (0 < CacheUtil.STREATMAP.get(token).getCount()) {
// decrement the viewer count
CacheUtil.STREATMAP.get(token).setCount(CacheUtil.STREATMAP.get(token).getCount() - 1);
logger.info("closed; current viewer count: " + CacheUtil.STREATMAP.get(token).getCount() + " device: [ip:"
+ CacheUtil.STREATMAP.get(token).getIp() + " channel:"
+ CacheUtil.STREATMAP.get(token).getChannel() + " stream:"
+ CacheUtil.STREATMAP.get(token).getStream() + " starttime:"
+ CacheUtil.STREATMAP.get(token).getStarttime() + " endtime:"
+ CacheUtil.STREATMAP.get(token).getEndtime() + " url:"
+ CacheUtil.STREATMAP.get(token).getUrl() + "]");
}
}
}
}
}
}
/**
* @Title: getCameras
* @Description: list active streams
* @return Map<String, CameraPojo>
**/
@RequestMapping(value = "/cameras", method = RequestMethod.GET)
public Map<String, CameraPojo> getCameras() {
logger.info("获取视频流信息:" + CacheUtil.STREATMAP.toString());
return CacheUtil.STREATMAP;
}
/**
* @Title: keepAlive
* @Description: keep streams alive
* @param tokens
* @return void
**/
@RequestMapping(value = "/cameras/{tokens}", method = RequestMethod.PUT)
public void keepAlive(@PathVariable("tokens") String tokens) {
// validate the parameter
if (null != tokens && !"".equals(tokens)) {
String[] tokenArr = tokens.split(",");
for (String token : tokenArr) {
CameraPojo cameraPojo = new CameraPojo();
// token of an active stream
if (null != CacheUtil.STREATMAP.get(token)) {
cameraPojo = CacheUtil.STREATMAP.get(token);
// refresh with the current system time
cameraPojo.setOpentime(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date().getTime()));
logger.info("keep-alive OK, device: [ip:" + cameraPojo.getIp() + " channel:" + cameraPojo.getChannel()
+ " stream:" + cameraPojo.getStream() + " starttime:" + cameraPojo.getStarttime()
+ " endtime:" + cameraPojo.getEndtime() + " url:" + cameraPojo.getUrl() + "]");
}
}
}
}
/**
* @Title: getConfig
* @Description: get service status
* @return Map<String, Object>
**/
@RequestMapping(value = "/status", method = RequestMethod.GET)
public Map<String, Object> getConfig() {
// compute the uptime as hours/minutes/seconds
long nowTime = new Date().getTime();
long upMillis = nowTime - CacheUtil.STARTTIME;
String upTime = upMillis / (1000 * 60 * 60) + "h" + upMillis % (1000 * 60 * 60) / (1000 * 60) + "m"
+ upMillis % (1000 * 60) / 1000 + "s";
logger.info("service status: " + config.toString() + "; uptime: " + upTime);
Map<String, Object> status = new HashMap<String, Object>();
status.put("config", config);
status.put("uptime", upTime);
return status;
}
}
... ...
package com.junction.pojo;
import java.io.Serializable;
public class CameraPojo implements Serializable {
private static final long serialVersionUID = 8183688502930584159L;
private String username;// camera username
private String password;// camera password
private String ip;// camera ip
private String channel;// camera channel
private String stream;// camera stream type
private String rtsp;// rtsp (pull) address
private String rtmp;// rtmp (push) address
private String url;// playback address
private String starttime;// playback start time
private String endtime;// playback end time
private String opentime;// time the stream was opened
private int count = 0;// viewer count
private String token;
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
public String getIp() {
return ip;
}
public void setIp(String ip) {
this.ip = ip;
}
public String getChannel() {
return channel;
}
public void setChannel(String channel) {
this.channel = channel;
}
public String getStream() {
return stream;
}
public void setStream(String stream) {
this.stream = stream;
}
public String getRtsp() {
return rtsp;
}
public void setRtsp(String rtsp) {
this.rtsp = rtsp;
}
public String getRtmp() {
return rtmp;
}
public void setRtmp(String rtmp) {
this.rtmp = rtmp;
}
public String getStarttime() {
return starttime;
}
public void setStarttime(String starttime) {
this.starttime = starttime;
}
public String getEndtime() {
return endtime;
}
public void setEndTime(String endtime) {
this.endtime = endtime;
}
public String getOpentime() {
return opentime;
}
public void setOpentime(String opentime) {
this.opentime = opentime;
}
public int getCount() {
return count;
}
public void setCount(int count) {
this.count = count;
}
public String getToken() {
return token;
}
public void setToken(String token) {
this.token = token;
}
public String getUrl() {
return url;
}
public void setUrl(String url) {
this.url = url;
}
@Override
public String toString() {
return "CameraPojo [username=" + username + ", password=" + password + ", ip=" + ip + ", channel=" + channel
+ ", stream=" + stream + ", rtsp=" + rtsp + ", rtmp=" + rtmp + ", url=" + url + ", starttime="
+ starttime + ", endtime=" + endtime + ", opentime=" + opentime + ", count=" + count + ", token="
+ token + "]";
}
}
... ...
package com.junction.pojo;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;
/**
* @Title ConfigPojo.java
* @description bean backing the configuration file
* @time 2019-12-25 17:11:21
* @author wuguodong
**/
@Component
@ConfigurationProperties(prefix = "config")
public class Config {
private String keepalive;// keep-alive duration (minutes)
private String push_host;// push host
private String host_extra;// extra (external) host
private String push_port;// push port
private String main_code;// maximum bitrate of the main stream
private String sub_code;// maximum bitrate of the sub stream
private String version;// version info
public String getHost_extra() {
return host_extra;
}
public void setHost_extra(String host_extra) {
this.host_extra = host_extra;
}
public String getKeepalive() {
return keepalive;
}
public void setKeepalive(String keepalive) {
this.keepalive = keepalive;
}
public String getPush_host() {
return push_host;
}
public void setPush_host(String push_host) {
this.push_host = push_host;
}
public String getPush_port() {
return push_port;
}
public void setPush_port(String push_port) {
this.push_port = push_port;
}
public String getMain_code() {
return main_code;
}
public void setMain_code(String main_code) {
this.main_code = main_code;
}
public String getSub_code() {
return sub_code;
}
public void setSub_code(String sub_code) {
this.sub_code = sub_code;
}
public String getVersion() {
return version;
}
public void setVersion(String version) {
this.version = version;
}
@Override
public String toString() {
return "Config [keepalive=" + keepalive + ", push_host=" + push_host + ", host_extra=" + host_extra
+ ", push_port=" + push_port + ", main_code=" + main_code + ", sub_code=" + sub_code + ", version="
+ version + "]";
}
}
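For reference, a hypothetical application.properties matching this bean's prefix and field names (all values illustrative):

```properties
config.keepalive=5
config.push_host=127.0.0.1
config.host_extra=127.0.0.1
config.push_port=1935
config.main_code=5120
config.sub_code=1024
config.version=1.0
```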
... ...
package com.junction.push;
import static org.bytedeco.ffmpeg.global.avcodec.av_packet_unref;
import java.util.HashMap;
import java.util.Map;
import org.bytedeco.ffmpeg.avcodec.AVPacket;
import org.bytedeco.ffmpeg.avformat.AVFormatContext;
import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.ffmpeg.global.avutil;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.FFmpegLogCallback;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationContext;
import com.junction.pojo.CameraPojo;
import com.junction.pojo.Config;
/**
* @Title RtmpPush.java
* @description pushes stream packets with JavaCV
* @time 2020-03-17 14:32:42
* @author wuguodong
**/
public class CameraPush {
private final static Logger logger = LoggerFactory.getLogger(CameraPush.class);
// configuration bean
private static Config config;
// fetch the Config bean from the application context
public static void setApplicationContext(ApplicationContext applicationContext) {
config = applicationContext.getBean(Config.class);
}
private CameraPojo pojo;// device info
private FFmpegFrameRecorder recorder;// recorder (encoder/muxer)
private FFmpegFrameGrabber grabber;// grabber
private int err_index = 0;// number of errors seen while pushing
private int exitcode = 0;// exit code: 0 - normal; 1 - interrupted manually
private double framerate = 0;// frame rate
public void setExitcode(int exitcode) {
this.exitcode = exitcode;
}
public int getExitcode() {
return exitcode;
}
public CameraPush(CameraPojo cameraPojo) {
this.pojo = cameraPojo;
}
/**
* @Title: release
* @Description: release resources
* @return void
**/
public void release() {
try {
grabber.stop();
grabber.close();
if (recorder != null) {
recorder.stop();
recorder.release();
}
} catch (Exception e) {
e.printStackTrace();
}
}
/**
* @Title: push
* @Description: push the stream packets
* @return void
**/
public void push() {
try {
avutil.av_log_set_level(avutil.AV_LOG_INFO);
FFmpegLogCallback.set();
grabber = new FFmpegFrameGrabber(pojo.getRtsp());
grabber.setOption("rtsp_transport", "tcp");
// socket I/O timeout for the grabber, in microseconds (2s here)
grabber.setOption("stimeout", "2000000");
if ("sub".equals(pojo.getStream())) {
grabber.start(config.getSub_code());
} else if ("main".equals(pojo.getStream())) {
grabber.start(config.getMain_code());
} else {
grabber.start(config.getMain_code());
}
// Some devices report a bogus frame rate (e.g. 9000) in the stream info, which breaks the dts/pts computation and makes playback fail, so fall back to 25 fps when the reported value is implausible
if (grabber.getFrameRate() > 0 && grabber.getFrameRate() < 100) {
framerate = grabber.getFrameRate();
} else {
framerate = 25.0;
}
int width = grabber.getImageWidth();
int height = grabber.getImageHeight();
// a width/height of 0 means grabbing failed; stop here
if (width == 0 && height == 0) {
logger.error(pojo.getRtsp() + " failed to grab the stream!");
release();
return;
}
recorder = new FFmpegFrameRecorder(pojo.getRtmp(), grabber.getImageWidth(), grabber.getImageHeight());
recorder.setInterleaved(true);
// GOP (keyframe) interval, usually the frame rate or twice the frame rate
recorder.setGopSize((int) framerate * 2);
// frame rate (keep at least 25 for stable playback; below 25 the picture flickers)
recorder.setFrameRate(framerate);
// bitrate, copied from the source
recorder.setVideoBitrate(grabber.getVideoBitrate());
// mux as flv
recorder.setFormat("flv");
// h264 codec
recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
Map<String, String> videoOption = new HashMap<>();
// reduces encoder latency
videoOption.put("tune", "zerolatency");
/**
* trade-off between quality and encoding speed; values:
* ultrafast, superfast, veryfast, faster, fast,
* medium, slow, slower, veryslow
* ultrafast gives the least compression (lowest encoder CPU) and the largest stream,
* while veryslow gives the best compression (highest encoder CPU) and the smallest stream
*/
videoOption.put("preset", "ultrafast");
// constant rate factor, 0-51; 18-28 is a reasonable range
videoOption.put("crf", "28");
recorder.setOptions(videoOption);
AVFormatContext fc = grabber.getFormatContext();
recorder.start(fc);
logger.debug("开始推流 设备信息:[ip:" + pojo.getIp() + " channel:" + pojo.getChannel() + " stream:"
+ pojo.getStream() + " starttime:" + pojo.getStarttime() + " endtime:" + pojo.getEndtime()
+ " rtsp:" + pojo.getRtsp() + " url:" + pojo.getUrl() + "]");
// flush the frames buffered while probing
grabber.flush();
AVPacket pkt = null;
long dts = 0;
long pts = 0;
int timebase = 0;
for (int no_frame_index = 0; no_frame_index < 5 && err_index < 5;) {
long time1 = System.currentTimeMillis();
if (exitcode == 1) {
break;
}
pkt = grabber.grabPacket();
if (pkt == null || pkt.size() == 0 || pkt.data() == null) {
// count the empty packet and skip it
logger.warn("JavaCV returned an empty packet, device: [ip:" + pojo.getIp() + " channel:" + pojo.getChannel() + " stream:"
+ pojo.getStream() + " starttime:" + pojo.getStarttime() + " endtime:" + pojo.getEndtime()
+ " rtsp:" + pojo.getRtsp() + " url:" + pojo.getUrl() + "]");
no_frame_index++;
continue;
}
// skip audio packets (assumes the audio stream is at index 1)
if (pkt.stream_index() == 1) {
av_packet_unref(pkt);
continue;
}
// rewrite dts/pts so they accumulate from 0; the SDK callback data does not, which breaks resumed playback in players
pkt.pts(pts);
pkt.dts(dts);
err_index += (recorder.recordPacket(pkt) ? 0 : 1);
// accumulate pts and dts
timebase = grabber.getFormatContext().streams(pkt.stream_index()).time_base().den();
pts += timebase / (int) framerate;
dts += timebase / (int) framerate;
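// e.g. with a 90 kHz RTSP time base and 25 fps, each packet advances pts/dts by 90000 / 25 = 3600 ticks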
// decrease the buffer's reference count by one and reset the packet's other fields; when the count reaches 0 the buffer is freed automatically
av_packet_unref(pkt);
long endtime = System.currentTimeMillis();
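// pacing: sleep off the remainder of one frame interval (1000 / framerate ms) so packets are pushed at roughly real-time speed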
if ((long) (1000 / framerate) - (endtime - time1) > 0) {
Thread.sleep((long) (1000 / framerate) - (endtime - time1));
}
}
} catch (Exception e) {
e.printStackTrace();
logger.error(e.getMessage());
} finally {
release();
logger.info("推流结束 设备信息:[ip:" + pojo.getIp() + " channel:" + pojo.getChannel() + " stream:"
+ pojo.getStream() + " starttime:" + pojo.getStarttime() + " endtime:" + pojo.getEndtime()
+ " rtsp:" + pojo.getRtsp() + " url:" + pojo.getUrl() + "]");
}
}
}
... ...
package com.junction.thread;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.junction.cache.CacheUtil;
import com.junction.controller.CameraController;
import com.junction.pojo.CameraPojo;
import com.junction.push.CameraPush;
/**
* @Title CameraThread.java
* @description Camera streaming thread: one runnable per push task
* @time 2019-12-16 09:32:43
* @author wuguodong
**/
public class CameraThread {
private final static Logger logger = LoggerFactory.getLogger(CameraThread.class);
public static class MyRunnable implements Runnable {
// create the thread pool
public static ExecutorService es = Executors.newCachedThreadPool();
private CameraPojo cameraPojo;
private Thread nowThread;
public MyRunnable(CameraPojo cameraPojo) {
this.cameraPojo = cameraPojo;
}
// interrupt the push task
public void setInterrupted(String key) {
CacheUtil.PUSHMAP.get(key).setExitcode(1);
}
@Override
public void run() {
// live-stream push
try {
// record the current thread and store the stream info in the cache
nowThread = Thread.currentThread();
CacheUtil.STREATMAP.put(cameraPojo.getToken(), cameraPojo);
// run the restream/push task
CameraPush push = new CameraPush(cameraPojo);
CacheUtil.PUSHMAP.put(cameraPojo.getToken(), push);
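// push() blocks until the stream ends, too many errors occur, or exitcode is set to 1 via setInterrupted()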
push.push();
// clear the caches
CacheUtil.STREATMAP.remove(cameraPojo.getToken());
CameraController.JOBMAP.remove(cameraPojo.getToken());
CacheUtil.PUSHMAP.remove(cameraPojo.getToken());
} catch (Exception e) {
CacheUtil.STREATMAP.remove(cameraPojo.getToken());
CameraController.JOBMAP.remove(cameraPojo.getToken());
CacheUtil.PUSHMAP.remove(cameraPojo.getToken());
}
}
}
}
... ...
package com.junction.timer;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Set;
import java.util.Timer;
import java.util.TimerTask;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;
import com.junction.cache.CacheUtil;
import com.junction.controller.CameraController;
import com.junction.pojo.Config;
/**
* @Title TimerUtil.java
* @description Scheduled task
* @time 2019-12-16 15:10:08
* @author wuguodong
**/
@Component
public class CameraTimer implements CommandLineRunner {
private final static Logger logger = LoggerFactory.getLogger(CameraTimer.class);
@Autowired
private Config config;// configuration bean
public static Timer timer;
@Override
public void run(String... args) throws Exception {
// stop pushing streams that exceed the keepalive timeout
timer = new Timer("timeTimer");
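// schedule: first run after a 1 ms delay, then repeat every 60 seconds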
timer.schedule(new TimerTask() {
@Override
public void run() {
logger.info("定时任务 当前有" + CameraController.JOBMAP.size() + "个推流任务正在进行推流");
// manage the cache
if (null != CacheUtil.STREATMAP && 0 != CacheUtil.STREATMAP.size()) {
Set<String> keys = CacheUtil.STREATMAP.keySet();
for (String key : keys) {
try {
// time the stream was last opened
long openTime = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
.parse(CacheUtil.STREATMAP.get(key).getOpentime()).getTime();
// current system time
long newTime = new Date().getTime();
// if the channel has no viewers, stop the push
if (CacheUtil.STREATMAP.get(key).getCount() == 0) {
// stop the push thread
CameraController.JOBMAP.get(key).setInterrupted(key);
logger.info("定时任务 当前设备使用人数为0结束推流 设备信息:[ip:" + CacheUtil.STREATMAP.get(key).getIp()
+ " channel:" + CacheUtil.STREATMAP.get(key).getChannel() + " stream:"
+ CacheUtil.STREATMAP.get(key).getStream() + " starttime:"
+ CacheUtil.STREATMAP.get(key).getStarttime() + " endtime:"
+ CacheUtil.STREATMAP.get(key).getEndtime() + " rtsp:"
+ CacheUtil.STREATMAP.get(key).getRtsp() + " url:"
+ CacheUtil.STREATMAP.get(key).getUrl() + "]");
} else if (null == CacheUtil.STREATMAP.get(key).getStarttime()
&& (newTime - openTime) / 1000 / 60 >= Integer.valueOf(config.getKeepalive())) {
CameraController.JOBMAP.get(key).setInterrupted(key);
logger.info("定时任务 当前设备使用时间超时结束推流 设备信息:[ip:" + CacheUtil.STREATMAP.get(key).getIp()
+ " channel:" + CacheUtil.STREATMAP.get(key).getChannel() + " stream:"
+ CacheUtil.STREATMAP.get(key).getStream() + " starttime:"
+ CacheUtil.STREATMAP.get(key).getStarttime() + " endtime:"
+ CacheUtil.STREATMAP.get(key).getEndtime() + " rtsp:"
+ CacheUtil.STREATMAP.get(key).getRtsp() + " url:"
+ CacheUtil.STREATMAP.get(key).getUrl() + "]");
}
} catch (ParseException e) {
e.printStackTrace();
}
}
}
}
}, 1, 1000 * 60);
}
}
... ...
package com.junction.util;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.alibaba.fastjson.JSONObject;
/**
* @Title Utils.java
* @description Utility class
* @time 2020-10-27 09:15:56
* @author wuguodong
**/
public class Utils {
private final static Logger logger = LoggerFactory.getLogger(Utils.class);
/**
* @Title: IpConvert
* @Description: resolve a domain name to an IP address
* @param domainName
* @return ip
**/
public static String IpConvert(String domainName) {
String ip = domainName;
try {
ip = InetAddress.getByName(domainName).getHostAddress();
} catch (UnknownHostException e) {
e.printStackTrace();
return domainName;
}
return ip;
}
/**
* @Title: isNullParameters
* @Description: null check for API parameters
* @param cameraJson
* @param isNullArr
* @return boolean
**/
public static boolean isNullParameters(JSONObject cameraJson, String[] isNullArr) {
Map<String, Object> checkMap = new HashMap<>();
// 空值校验
for (String key : isNullArr) {
if (null == cameraJson.get(key) || "".equals(cameraJson.get(key))) {
return false;
}
}
return true;
}
/**
* @Title: isTrueIp
* @Description: validate the IP format of an API parameter
* @param ip
* @return boolean
**/
public static boolean isTrueIp(String ip) {
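// matches dotted-quad IPv4 addresses: first octet 1-255, remaining octets 0-255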
return ip.matches("([1-9]|[1-9]\\d|1\\d{2}|2[0-4]\\d|25[0-5])(\\.(\\d|[1-9]\\d|1\\d{2}|2[0-4]\\d|25[0-5])){3}");
}
/**
* @Title: isTrueTime
* @Description: validate the time format of an API parameter
* @param time
* @return boolean
**/
public static boolean isTrueTime(String time) {
try {
new SimpleDateFormat("yyyy-MM-dd HH:ss:mm").parse(time);
return true;
} catch (Exception e) {
logger.error(e.getMessage());
return false;
}
}
/**
* @Title: getTime
* @Description: convert "yyyy-MM-dd HH:mm:ss" to "yyyyMMddHHmmss"
* @param time
* @return String
**/
public static String getTime(String time) {
String timestamp = null;
try {
timestamp = new SimpleDateFormat("yyyyMMddHHmmss")
.format(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(time));
} catch (Exception e) {
logger.error("时间格式化错误");
e.printStackTrace();
}
return timestamp;
}
/**
* @Title: getStarttime
* @Description: get the playback start time (one minute before the given time)
* @param time
* @return starttime
**/
public static String getStarttime(String time) {
String starttime = null;
try {
starttime = new SimpleDateFormat("yyyyMMddHHmmss")
.format(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(time).getTime() - 60 * 1000);
} catch (Exception e) {
logger.error("时间格式化错误");
e.printStackTrace();
}
return starttime;
}
/**
* @Title: getEndtime
* @Description: get the playback end time (one minute after the given time)
* @param time
* @return endString
**/
public static String getEndtime(String time) {
String endString = null;
try {
endString = new SimpleDateFormat("yyyyMMddHHmmss")
.format(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(time).getTime() + 60 * 1000);
} catch (Exception e) {
logger.error("时间格式化错误");
e.printStackTrace();
}
return endString;
}
}
... ...
package org.bytedeco.javacv;
import static org.bytedeco.ffmpeg.global.avcodec.AV_PKT_FLAG_KEY;
import static org.bytedeco.ffmpeg.global.avcodec.av_jni_set_java_vm;
import static org.bytedeco.ffmpeg.global.avcodec.av_packet_unref;
import static org.bytedeco.ffmpeg.global.avcodec.avcodec_alloc_context3;
import static org.bytedeco.ffmpeg.global.avcodec.avcodec_decode_audio4;
import static org.bytedeco.ffmpeg.global.avcodec.avcodec_decode_video2;
import static org.bytedeco.ffmpeg.global.avcodec.avcodec_find_decoder;
import static org.bytedeco.ffmpeg.global.avcodec.avcodec_find_decoder_by_name;
import static org.bytedeco.ffmpeg.global.avcodec.avcodec_flush_buffers;
import static org.bytedeco.ffmpeg.global.avcodec.avcodec_free_context;
import static org.bytedeco.ffmpeg.global.avcodec.avcodec_open2;
import static org.bytedeco.ffmpeg.global.avcodec.avcodec_parameters_to_context;
import static org.bytedeco.ffmpeg.global.avcodec.avcodec_register_all;
import static org.bytedeco.ffmpeg.global.avdevice.avdevice_register_all;
import static org.bytedeco.ffmpeg.global.avformat.AVSEEK_FLAG_BACKWARD;
import static org.bytedeco.ffmpeg.global.avformat.AVSEEK_SIZE;
import static org.bytedeco.ffmpeg.global.avformat.av_dump_format;
import static org.bytedeco.ffmpeg.global.avformat.av_find_input_format;
import static org.bytedeco.ffmpeg.global.avformat.av_guess_sample_aspect_ratio;
import static org.bytedeco.ffmpeg.global.avformat.av_read_frame;
import static org.bytedeco.ffmpeg.global.avformat.av_register_all;
import static org.bytedeco.ffmpeg.global.avformat.avformat_alloc_context;
import static org.bytedeco.ffmpeg.global.avformat.avformat_close_input;
import static org.bytedeco.ffmpeg.global.avformat.avformat_find_stream_info;
import static org.bytedeco.ffmpeg.global.avformat.avformat_free_context;
import static org.bytedeco.ffmpeg.global.avformat.avformat_network_init;
import static org.bytedeco.ffmpeg.global.avformat.avformat_open_input;
import static org.bytedeco.ffmpeg.global.avformat.avformat_seek_file;
import static org.bytedeco.ffmpeg.global.avformat.avio_alloc_context;
import static org.bytedeco.ffmpeg.global.avutil.AVMEDIA_TYPE_AUDIO;
import static org.bytedeco.ffmpeg.global.avutil.AVMEDIA_TYPE_VIDEO;
import static org.bytedeco.ffmpeg.global.avutil.AV_DICT_IGNORE_SUFFIX;
import static org.bytedeco.ffmpeg.global.avutil.AV_LOG_INFO;
import static org.bytedeco.ffmpeg.global.avutil.AV_NOPTS_VALUE;
import static org.bytedeco.ffmpeg.global.avutil.AV_PICTURE_TYPE_I;
import static org.bytedeco.ffmpeg.global.avutil.AV_PIX_FMT_BGR24;
import static org.bytedeco.ffmpeg.global.avutil.AV_PIX_FMT_GRAY8;
import static org.bytedeco.ffmpeg.global.avutil.AV_PIX_FMT_NONE;
import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_DBL;
import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_DBLP;
import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_FLT;
import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_FLTP;
import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_NONE;
import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_S16;
import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_S16P;
import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_S32;
import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_S32P;
import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_U8;
import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_U8P;
import static org.bytedeco.ffmpeg.global.avutil.AV_TIME_BASE;
import static org.bytedeco.ffmpeg.global.avutil.av_d2q;
import static org.bytedeco.ffmpeg.global.avutil.av_dict_free;
import static org.bytedeco.ffmpeg.global.avutil.av_dict_get;
import static org.bytedeco.ffmpeg.global.avutil.av_dict_set;
import static org.bytedeco.ffmpeg.global.avutil.av_frame_alloc;
import static org.bytedeco.ffmpeg.global.avutil.av_frame_free;
import static org.bytedeco.ffmpeg.global.avutil.av_frame_get_best_effort_timestamp;
import static org.bytedeco.ffmpeg.global.avutil.av_frame_unref;
import static org.bytedeco.ffmpeg.global.avutil.av_free;
import static org.bytedeco.ffmpeg.global.avutil.av_get_bytes_per_sample;
import static org.bytedeco.ffmpeg.global.avutil.av_get_default_channel_layout;
import static org.bytedeco.ffmpeg.global.avutil.av_get_pix_fmt_name;
import static org.bytedeco.ffmpeg.global.avutil.av_image_fill_arrays;
import static org.bytedeco.ffmpeg.global.avutil.av_image_fill_linesizes;
import static org.bytedeco.ffmpeg.global.avutil.av_image_get_buffer_size;
import static org.bytedeco.ffmpeg.global.avutil.av_log_get_level;
import static org.bytedeco.ffmpeg.global.avutil.av_malloc;
import static org.bytedeco.ffmpeg.global.avutil.av_sample_fmt_is_planar;
import static org.bytedeco.ffmpeg.global.avutil.av_samples_get_buffer_size;
import static org.bytedeco.ffmpeg.global.swresample.swr_alloc_set_opts;
import static org.bytedeco.ffmpeg.global.swresample.swr_convert;
import static org.bytedeco.ffmpeg.global.swresample.swr_free;
import static org.bytedeco.ffmpeg.global.swresample.swr_get_out_samples;
import static org.bytedeco.ffmpeg.global.swresample.swr_init;
import static org.bytedeco.ffmpeg.global.swscale.SWS_BILINEAR;
import static org.bytedeco.ffmpeg.global.swscale.sws_freeContext;
import static org.bytedeco.ffmpeg.global.swscale.sws_getCachedContext;
import static org.bytedeco.ffmpeg.global.swscale.sws_scale;
import java.io.BufferedInputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.Buffer;
import java.nio.ByteBuffer;
import java.util.Collections;
import java.util.EnumSet;
import java.util.HashMap;
import java.util.Map;
import java.util.Map.Entry;
import org.bytedeco.ffmpeg.avcodec.AVCodec;
import org.bytedeco.ffmpeg.avcodec.AVCodecContext;
import org.bytedeco.ffmpeg.avcodec.AVCodecParameters;
import org.bytedeco.ffmpeg.avcodec.AVPacket;
import org.bytedeco.ffmpeg.avformat.AVFormatContext;
import org.bytedeco.ffmpeg.avformat.AVIOContext;
import org.bytedeco.ffmpeg.avformat.AVInputFormat;
import org.bytedeco.ffmpeg.avformat.AVStream;
import org.bytedeco.ffmpeg.avformat.Read_packet_Pointer_BytePointer_int;
import org.bytedeco.ffmpeg.avformat.Seek_Pointer_long_int;
import org.bytedeco.ffmpeg.avutil.AVDictionary;
import org.bytedeco.ffmpeg.avutil.AVDictionaryEntry;
import org.bytedeco.ffmpeg.avutil.AVFrame;
import org.bytedeco.ffmpeg.avutil.AVRational;
import org.bytedeco.ffmpeg.swresample.SwrContext;
import org.bytedeco.ffmpeg.swscale.SwsContext;
import org.bytedeco.javacpp.BytePointer;
import org.bytedeco.javacpp.DoublePointer;
import org.bytedeco.javacpp.IntPointer;
import org.bytedeco.javacpp.Loader;
import org.bytedeco.javacpp.Pointer;
import org.bytedeco.javacpp.PointerPointer;
import org.bytedeco.javacpp.PointerScope;
public class FFmpegFrameGrabber extends FrameGrabber {
public static String[] getDeviceDescriptions() throws Exception {
tryLoad();
throw new UnsupportedOperationException("Device enumeration not supported by FFmpeg.");
}
public static FFmpegFrameGrabber createDefault(File deviceFile) throws Exception {
return new FFmpegFrameGrabber(deviceFile);
}
public static FFmpegFrameGrabber createDefault(String devicePath) throws Exception {
return new FFmpegFrameGrabber(devicePath);
}
public static FFmpegFrameGrabber createDefault(int deviceNumber) throws Exception {
throw new Exception(FFmpegFrameGrabber.class + " does not support device numbers.");
}
private static Exception loadingException = null;
public static void tryLoad() throws Exception {
if (loadingException != null) {
throw loadingException;
} else {
try {
Loader.load(org.bytedeco.ffmpeg.global.avutil.class);
Loader.load(org.bytedeco.ffmpeg.global.swresample.class);
Loader.load(org.bytedeco.ffmpeg.global.avcodec.class);
Loader.load(org.bytedeco.ffmpeg.global.avformat.class);
Loader.load(org.bytedeco.ffmpeg.global.swscale.class);
// Register all formats and codecs
av_jni_set_java_vm(Loader.getJavaVM(), null);
avcodec_register_all();
av_register_all();
avformat_network_init();
Loader.load(org.bytedeco.ffmpeg.global.avdevice.class);
avdevice_register_all();
} catch (Throwable t) {
if (t instanceof Exception) {
throw loadingException = (Exception) t;
} else {
throw loadingException = new Exception("Failed to load " + FFmpegFrameGrabber.class, t);
}
}
}
}
static {
try {
tryLoad();
FFmpegLockCallback.init();
} catch (Exception ex) {
}
}
public FFmpegFrameGrabber(File file) {
this(file.getAbsolutePath());
}
public FFmpegFrameGrabber(String filename) {
this.filename = filename;
this.pixelFormat = AV_PIX_FMT_NONE;
this.sampleFormat = AV_SAMPLE_FMT_NONE;
}
/**
* Calls {@code FFmpegFrameGrabber(inputStream, Integer.MAX_VALUE - 8)} so that
* the whole input stream is seekable.
*/
public FFmpegFrameGrabber(InputStream inputStream) {
this(inputStream, Integer.MAX_VALUE - 8);
}
public FFmpegFrameGrabber(InputStream inputStream, int maximumSize) {
this.inputStream = inputStream;
this.closeInputStream = true;
this.pixelFormat = AV_PIX_FMT_NONE;
this.sampleFormat = AV_SAMPLE_FMT_NONE;
this.maximumSize = maximumSize;
}
public void release() throws Exception {
synchronized (org.bytedeco.ffmpeg.global.avcodec.class) {
releaseUnsafe();
}
}
public void releaseUnsafe() throws Exception {
if (pkt != null && pkt2 != null) {
if (pkt2.size() > 0) {
av_packet_unref(pkt);
}
pkt = pkt2 = null;
}
// Free the RGB image
if (image_ptr != null) {
for (int i = 0; i < image_ptr.length; i++) {
av_free(image_ptr[i]);
}
image_ptr = null;
}
if (picture_rgb != null) {
av_frame_free(picture_rgb);
picture_rgb = null;
}
// Free the native format picture frame
if (picture != null) {
av_frame_free(picture);
picture = null;
}
// Close the video codec
if (video_c != null) {
avcodec_free_context(video_c);
video_c = null;
}
// Free the audio samples frame
if (samples_frame != null) {
av_frame_free(samples_frame);
samples_frame = null;
}
// Close the audio codec
if (audio_c != null) {
avcodec_free_context(audio_c);
audio_c = null;
}
// Close the video file
if (inputStream == null && oc != null && !oc.isNull()) {
avformat_close_input(oc);
oc = null;
}
if (img_convert_ctx != null) {
sws_freeContext(img_convert_ctx);
img_convert_ctx = null;
}
if (samples_ptr_out != null) {
for (int i = 0; i < samples_ptr_out.length; i++) {
av_free(samples_ptr_out[i].position(0));
}
samples_ptr_out = null;
samples_buf_out = null;
}
if (samples_convert_ctx != null) {
swr_free(samples_convert_ctx);
samples_convert_ctx = null;
}
got_frame = null;
frameGrabbed = false;
frame = null;
timestamp = 0;
frameNumber = 0;
if (inputStream != null) {
try {
if (oc == null) {
// when called a second time
if (closeInputStream) {
inputStream.close();
}
} else {
inputStream.reset();
}
} catch (IOException ex) {
throw new Exception("Error on InputStream.close(): ", ex);
} finally {
inputStreams.remove(oc);
if (avio != null) {
if (avio.buffer() != null) {
av_free(avio.buffer());
avio.buffer(null);
}
av_free(avio);
avio = null;
}
if (oc != null) {
avformat_free_context(oc);
oc = null;
}
}
}
}
@Override
protected void finalize() throws Throwable {
super.finalize();
release();
}
static Map<Pointer, InputStream> inputStreams = Collections.synchronizedMap(new HashMap<Pointer, InputStream>());
static class ReadCallback extends Read_packet_Pointer_BytePointer_int {
@Override
public int call(Pointer opaque, BytePointer buf, int buf_size) {
try {
byte[] b = new byte[buf_size];
InputStream is = inputStreams.get(opaque);
int size = is.read(b, 0, buf_size);
if (size < 0) {
return 0;
} else {
buf.put(b, 0, size);
return size;
}
} catch (Throwable t) {
System.err.println("Error on InputStream.read(): " + t);
return -1;
}
}
}
static class SeekCallback extends Seek_Pointer_long_int {
@Override
public long call(Pointer opaque, long offset, int whence) {
try {
InputStream is = inputStreams.get(opaque);
long size = 0;
switch (whence) {
case 0:
is.reset();
break; // SEEK_SET
case 1:
break; // SEEK_CUR
case 2: // SEEK_END
is.reset();
while (true) {
long n = is.skip(Long.MAX_VALUE);
if (n == 0)
break;
size += n;
}
offset += size;
is.reset();
break;
case AVSEEK_SIZE:
long remaining = 0;
while (true) {
long n = is.skip(Long.MAX_VALUE);
if (n == 0)
break;
remaining += n;
}
is.reset();
while (true) {
long n = is.skip(Long.MAX_VALUE);
if (n == 0)
break;
size += n;
}
offset = size - remaining;
is.reset();
break;
default:
return -1;
}
long remaining = offset;
while (remaining > 0) {
long skipped = is.skip(remaining);
if (skipped == 0)
break; // end of the stream
remaining -= skipped;
}
return whence == AVSEEK_SIZE ? size : 0;
} catch (Throwable t) {
System.err.println("Error on InputStream.reset() or skip(): " + t);
return -1;
}
}
}
static ReadCallback readCallback = new ReadCallback();
static SeekCallback seekCallback = new SeekCallback();
static {
PointerScope s = PointerScope.getInnerScope();
if (s != null) {
s.detach(readCallback);
s.detach(seekCallback);
}
}
private InputStream inputStream;
private boolean closeInputStream;
private int maximumSize;
private AVIOContext avio;
private String filename;
private AVFormatContext oc;
private AVStream video_st, audio_st;
private AVCodecContext video_c, audio_c;
private AVFrame picture, picture_rgb;
private BytePointer[] image_ptr;
private Buffer[] image_buf;
private AVFrame samples_frame;
private BytePointer[] samples_ptr;
private Buffer[] samples_buf;
private BytePointer[] samples_ptr_out;
private Buffer[] samples_buf_out;
private AVPacket pkt, pkt2;
private int sizeof_pkt;
private int[] got_frame;
private SwsContext img_convert_ctx;
private SwrContext samples_convert_ctx;
private int samples_channels, samples_format, samples_rate;
private boolean frameGrabbed;
private Frame frame;
public boolean isCloseInputStream() {
return closeInputStream;
}
public void setCloseInputStream(boolean closeInputStream) {
this.closeInputStream = closeInputStream;
}
/**
* Is there a video stream?
*
* @return {@code video_st!=null;}
*/
public boolean hasVideo() {
return video_st != null;
}
/**
* Is there an audio stream?
*
* @return {@code audio_st!=null;}
*/
public boolean hasAudio() {
return audio_st != null;
}
@Override
public double getGamma() {
// default to a gamma of 2.2 for cheap Webcams, DV cameras, etc.
if (gamma == 0.0) {
return 2.2;
} else {
return gamma;
}
}
@Override
public String getFormat() {
if (oc == null) {
return super.getFormat();
} else {
return oc.iformat().name().getString();
}
}
@Override
public int getImageWidth() {
return imageWidth > 0 || video_c == null ? super.getImageWidth() : video_c.width();
}
@Override
public int getImageHeight() {
return imageHeight > 0 || video_c == null ? super.getImageHeight() : video_c.height();
}
@Override
public int getAudioChannels() {
return audioChannels > 0 || audio_c == null ? super.getAudioChannels() : audio_c.channels();
}
@Override
public int getPixelFormat() {
if (imageMode == ImageMode.COLOR || imageMode == ImageMode.GRAY) {
if (pixelFormat == AV_PIX_FMT_NONE) {
return imageMode == ImageMode.COLOR ? AV_PIX_FMT_BGR24 : AV_PIX_FMT_GRAY8;
} else {
return pixelFormat;
}
} else if (video_c != null) { // RAW
return video_c.pix_fmt();
} else {
return super.getPixelFormat();
}
}
@Override
public int getVideoCodec() {
return video_c == null ? super.getVideoCodec() : video_c.codec_id();
}
@Override
public int getVideoBitrate() {
return video_c == null ? super.getVideoBitrate() : (int) video_c.bit_rate();
}
@Override
public double getAspectRatio() {
if (video_st == null) {
return super.getAspectRatio();
} else {
AVRational r = av_guess_sample_aspect_ratio(oc, video_st, picture);
double a = (double) r.num() / r.den();
return a == 0.0 ? 1.0 : a;
}
}
/** Returns {@link #getVideoFrameRate()} */
@Override
public double getFrameRate() {
return getVideoFrameRate();
}
/**
* Estimation of audio frames per second
*
* @return (double) getSampleRate()) / samples_frame.nb_samples() if
* samples_frame.nb_samples() is not zero, otherwise return 0
*/
public double getAudioFrameRate() {
if (audio_st == null) {
return 0.0;
} else {
if (samples_frame == null || samples_frame.nb_samples() == 0) {
try {
grabFrame(true, false, false, false);
frameGrabbed = true;
} catch (Exception e) {
return 0.0;
}
}
if (samples_frame != null && samples_frame.nb_samples() != 0)
return ((double) getSampleRate()) / samples_frame.nb_samples();
else
return 0.0;
}
}
public double getVideoFrameRate() {
if (video_st == null) {
return super.getFrameRate();
} else {
AVRational r = video_st.avg_frame_rate();
if (r.num() == 0 && r.den() == 0) {
r = video_st.r_frame_rate();
}
return (double) r.num() / r.den();
}
}
@Override
public int getAudioCodec() {
return audio_c == null ? super.getAudioCodec() : audio_c.codec_id();
}
@Override
public int getAudioBitrate() {
return audio_c == null ? super.getAudioBitrate() : (int) audio_c.bit_rate();
}
@Override
public int getSampleFormat() {
if (sampleMode == SampleMode.SHORT || sampleMode == SampleMode.FLOAT) {
if (sampleFormat == AV_SAMPLE_FMT_NONE) {
return sampleMode == SampleMode.SHORT ? AV_SAMPLE_FMT_S16 : AV_SAMPLE_FMT_FLT;
} else {
return sampleFormat;
}
} else if (audio_c != null) { // RAW
return audio_c.sample_fmt();
} else {
return super.getSampleFormat();
}
}
@Override
public int getSampleRate() {
return sampleRate > 0 || audio_c == null ? super.getSampleRate() : audio_c.sample_rate();
}
@Override
public Map<String, String> getMetadata() {
if (oc == null) {
return super.getMetadata();
}
AVDictionaryEntry entry = null;
Map<String, String> metadata = new HashMap<String, String>();
while ((entry = av_dict_get(oc.metadata(), "", entry, AV_DICT_IGNORE_SUFFIX)) != null) {
metadata.put(entry.key().getString(), entry.value().getString());
}
return metadata;
}
@Override
public Map<String, String> getVideoMetadata() {
if (video_st == null) {
return super.getVideoMetadata();
}
AVDictionaryEntry entry = null;
Map<String, String> metadata = new HashMap<String, String>();
while ((entry = av_dict_get(video_st.metadata(), "", entry, AV_DICT_IGNORE_SUFFIX)) != null) {
metadata.put(entry.key().getString(), entry.value().getString());
}
return metadata;
}
@Override
public Map<String, String> getAudioMetadata() {
if (audio_st == null) {
return super.getAudioMetadata();
}
AVDictionaryEntry entry = null;
Map<String, String> metadata = new HashMap<String, String>();
while ((entry = av_dict_get(audio_st.metadata(), "", entry, AV_DICT_IGNORE_SUFFIX)) != null) {
metadata.put(entry.key().getString(), entry.value().getString());
}
return metadata;
}
@Override
public String getMetadata(String key) {
if (oc == null) {
return super.getMetadata(key);
}
AVDictionaryEntry entry = av_dict_get(oc.metadata(), key, null, 0);
return entry == null || entry.value() == null ? null : entry.value().getString();
}
@Override
public String getVideoMetadata(String key) {
if (video_st == null) {
return super.getVideoMetadata(key);
}
AVDictionaryEntry entry = av_dict_get(video_st.metadata(), key, null, 0);
return entry == null || entry.value() == null ? null : entry.value().getString();
}
@Override
public String getAudioMetadata(String key) {
if (audio_st == null) {
return super.getAudioMetadata(key);
}
AVDictionaryEntry entry = av_dict_get(audio_st.metadata(), key, null, 0);
return entry == null || entry.value() == null ? null : entry.value().getString();
}
/**
* default override of super.setFrameNumber implies setting of a frame close to
* a video frame having that number
*/
@Override
public void setFrameNumber(int frameNumber) throws Exception {
if (hasVideo())
setTimestamp(Math.round(1000000L * frameNumber / getFrameRate()));
else
super.frameNumber = frameNumber;
}
/**
* if there is video stream tries to seek to video frame with corresponding
* timestamp otherwise sets super.frameNumber only because frameRate==0 if there
* is no video stream
*/
public void setVideoFrameNumber(int frameNumber) throws Exception {
// best guess, AVSEEK_FLAG_FRAME has not been implemented in FFmpeg...
if (hasVideo())
setVideoTimestamp(Math.round(1000000L * frameNumber / getFrameRate()));
else
super.frameNumber = frameNumber;
}
/**
* if there is audio stream tries to seek to audio frame with corresponding
* timestamp ignoring otherwise
*/
public void setAudioFrameNumber(int frameNumber) throws Exception {
// best guess, AVSEEK_FLAG_FRAME has not been implemented in FFmpeg...
if (hasAudio())
setAudioTimestamp(Math.round(1000000L * frameNumber / getAudioFrameRate()));
}
/**
* setTimestamp without checking frame content (using old code used in JavaCV
* versions prior to 1.4.1)
*/
@Override
public void setTimestamp(long timestamp) throws Exception {
setTimestamp(timestamp, false);
}
/**
* setTimestamp with possibility to select between old quick seek code or new
* code doing check of frame content. The frame check can be useful with
* corrupted files, when seeking may end up with an empty frame not containing
* video nor audio
*/
public void setTimestamp(long timestamp, boolean checkFrame) throws Exception {
setTimestamp(timestamp, checkFrame ? EnumSet.of(Frame.Type.VIDEO, Frame.Type.AUDIO) : null);
}
/** setTimestamp with resulting video frame type if there is a video stream */
public void setVideoTimestamp(long timestamp) throws Exception {
setTimestamp(timestamp, EnumSet.of(Frame.Type.VIDEO));
}
/** setTimestamp with resulting audio frame type if there is an audio stream */
public void setAudioTimestamp(long timestamp) throws Exception {
setTimestamp(timestamp, EnumSet.of(Frame.Type.AUDIO));
}
/**
* setTimestamp with a priority the resulting frame should be: video
* (frameTypesToSeek contains only Frame.Type.VIDEO), audio (frameTypesToSeek
* contains only Frame.Type.AUDIO), or any (frameTypesToSeek contains both)
*/
private void setTimestamp(long timestamp, EnumSet<Frame.Type> frameTypesToSeek) throws Exception {
int ret;
if (oc == null) {
super.setTimestamp(timestamp);
} else {
timestamp = timestamp * AV_TIME_BASE / 1000000L;
/* add the stream start time */
if (oc.start_time() != AV_NOPTS_VALUE) {
timestamp += oc.start_time();
}
if ((ret = avformat_seek_file(oc, -1, Long.MIN_VALUE, timestamp, Long.MAX_VALUE,
AVSEEK_FLAG_BACKWARD)) < 0) {
throw new Exception(
"avformat_seek_file() error " + ret + ": Could not seek file to timestamp " + timestamp + ".");
}
if (video_c != null) {
avcodec_flush_buffers(video_c);
}
if (audio_c != null) {
avcodec_flush_buffers(audio_c);
}
if (pkt2.size() > 0) {
pkt2.size(0);
av_packet_unref(pkt);
}
/*
* After the call of ffmpeg's avformat_seek_file(...) with the flag set to
* AVSEEK_FLAG_BACKWARD the decoding position should be located before the
* requested timestamp in a closest position from which all the active streams
* can be decoded successfully. The following seeking consists of two stages: 1.
* Grab frames till the frame corresponding to that "closest" position (the
* first frame containing decoded data).
*
* 2. Grab frames till the desired timestamp is reached. The number of steps is
* restricted by doubled estimation of frames between that "closest" position
* and the desired position.
*
* frameTypesToSeek parameter sets the preferred type of frames to seek. It can
* be chosen from three possible types: VIDEO, AUDIO or any of them. The setting
* means only a preference in the type. That is, if VIDEO or AUDIO is specified
* but the file does not have video or audio stream - any type will be used
* instead.
*
*
* TODO Sometimes the ffmpeg's avformat_seek_file(...) function brings us not to
* a position before the desired one but a few frames after it. What can be the
* solution in this case if we really need a frame-precision seek? Probably we
* may try to request even earlier timestamp and look if this will bring us
* before the desired position.
*
*/
if (frameTypesToSeek != null) { // new code providing check of frame content while seeking to the timestamp
boolean has_video = hasVideo();
boolean has_audio = hasAudio();
if (has_video || has_audio) {
if ((frameTypesToSeek.contains(Frame.Type.VIDEO) && !has_video)
|| (frameTypesToSeek.contains(Frame.Type.AUDIO) && !has_audio))
frameTypesToSeek = EnumSet.of(Frame.Type.VIDEO, Frame.Type.AUDIO);
long initialSeekPosition = Long.MIN_VALUE;
long maxSeekSteps = 0;
long count = 0;
Frame seekFrame = null;
while (count++ < 1000) { // seek to a first frame containing video or audio after
// avformat_seek_file(...)
seekFrame = grabFrame(true, true, false, false);
if (seekFrame == null)
return; // is it better to throw NullPointerException?
EnumSet<Frame.Type> frameTypes = seekFrame.getTypes();
frameTypes.retainAll(frameTypesToSeek);
if (!frameTypes.isEmpty()) {
initialSeekPosition = seekFrame.timestamp;
// the position closest to the requested timestamp from which it can be reached
// by sequential grabFrame calls
break;
}
}
if (has_video && this.getFrameRate() > 0) {
// estimation of video frame duration
double deltaTimeStamp = 1000000.0 / this.getFrameRate();
if (initialSeekPosition < timestamp - deltaTimeStamp / 2)
maxSeekSteps = (long) (10 * (timestamp - initialSeekPosition) / deltaTimeStamp);
} else if (has_audio && this.getAudioFrameRate() > 0) {
// estimation of audio frame duration
double deltaTimeStamp = 1000000.0 / this.getAudioFrameRate();
if (initialSeekPosition < timestamp - deltaTimeStamp / 2)
maxSeekSteps = (long) (10 * (timestamp - initialSeekPosition) / deltaTimeStamp);
} else
// zero frameRate
if (initialSeekPosition < timestamp - 1L)
maxSeekSteps = 1000;
count = 0;
while (count < maxSeekSteps) {
seekFrame = grabFrame(true, true, false, false);
if (seekFrame == null)
return; // is it better to throw NullPointerException?
EnumSet<Frame.Type> frameTypes = seekFrame.getTypes();
frameTypes.retainAll(frameTypesToSeek);
if (!frameTypes.isEmpty()) {
count++;
if (this.timestamp >= timestamp - 1)
break;
}
}
frameGrabbed = true;
}
} else { // old quick seeking code used in JavaCV versions prior to 1.4.1
/*
* comparing to timestamp +/- 1 avoids rounding issues for framerates which are
* no proper divisors of 1000000, e.g. where av_frame_get_best_effort_timestamp
* in grabFrame sets this.timestamp to ...666 and the given timestamp has been
* rounded to ...667 (or vice versa)
*/
int count = 0; // prevent infinite loops with corrupted files
while (this.timestamp > timestamp + 1 && grabFrame(true, true, false, false) != null
&& count++ < 1000) {
// flush frames if seeking backwards
}
count = 0;
while (this.timestamp < timestamp - 1 && grabFrame(true, true, false, false) != null
&& count++ < 1000) {
// decode up to the desired frame
}
frameGrabbed = true;
}
}
}
/** Returns {@link #getLengthInVideoFrames()} */
@Override
public int getLengthInFrames() {
// best guess...
return getLengthInVideoFrames();
}
@Override
public long getLengthInTime() {
return oc.duration() * 1000000L / AV_TIME_BASE;
}
/**
* Returns
* {@code (int) Math.round(getLengthInTime() * getFrameRate() / 1000000L)},
* which is an approximation in general.
*/
public int getLengthInVideoFrames() {
// best guess...
return (int) Math.round(getLengthInTime() * getFrameRate() / 1000000L);
}
public int getLengthInAudioFrames() {
// best guess...
double afr = getAudioFrameRate();
if (afr > 0)
return (int) (getLengthInTime() * afr / 1000000L);
else
return 0;
}
public AVFormatContext getFormatContext() {
return oc;
}
public void start(String streamCode) throws Exception {
synchronized (org.bytedeco.ffmpeg.global.avcodec.class) {
startUnsafe(streamCode);
}
}
public void startUnsafe(String streamCode) throws Exception {
if (oc != null && !oc.isNull()) {
throw new Exception("start() has already been called: Call stop() before calling start() again.");
}
int ret;
img_convert_ctx = null;
oc = new AVFormatContext(null);
video_c = null;
audio_c = null;
pkt = new AVPacket();
pkt2 = new AVPacket();
sizeof_pkt = pkt.sizeof();
got_frame = new int[1];
frameGrabbed = false;
frame = new Frame();
timestamp = 0;
frameNumber = 0;
pkt2.size(0);
// Open video file
AVInputFormat f = null;
if (format != null && format.length() > 0) {
if ((f = av_find_input_format(format)) == null) {
throw new Exception("av_find_input_format() error: Could not find input format \"" + format + "\".");
}
}
AVDictionary options = new AVDictionary(null);
if (frameRate > 0) {
AVRational r = av_d2q(frameRate, 1001000);
av_dict_set(options, "framerate", r.num() + "/" + r.den(), 0);
}
if (pixelFormat >= 0) {
av_dict_set(options, "pixel_format", av_get_pix_fmt_name(pixelFormat).getString(), 0);
} else if (imageMode != ImageMode.RAW) {
av_dict_set(options, "pixel_format", imageMode == ImageMode.COLOR ? "bgr24" : "gray8", 0);
}
if (imageWidth > 0 && imageHeight > 0) {
av_dict_set(options, "video_size", imageWidth + "x" + imageHeight, 0);
}
if (sampleRate > 0) {
av_dict_set(options, "sample_rate", "" + sampleRate, 0);
}
if (audioChannels > 0) {
av_dict_set(options, "channels", "" + audioChannels, 0);
}
for (Entry<String, String> e : this.options.entrySet()) {
av_dict_set(options, e.getKey(), e.getValue(), 0);
}
if (inputStream != null) {
if (!inputStream.markSupported()) {
inputStream = new BufferedInputStream(inputStream);
}
inputStream.mark(maximumSize);
oc = avformat_alloc_context();
avio = avio_alloc_context(new BytePointer(av_malloc(4096)), 4096, 0, oc, readCallback, null, seekCallback);
oc.pb(avio);
filename = inputStream.toString();
inputStreams.put(oc, inputStream);
}
if ((ret = avformat_open_input(oc, filename, f, options)) < 0) {
av_dict_set(options, "pixel_format", null, 0);
if ((ret = avformat_open_input(oc, filename, f, options)) < 0) {
throw new Exception("avformat_open_input() error " + ret + ": Could not open input \"" + filename
+ "\". (Has setFormat() been called?)");
}
}
av_dict_free(options);
oc.max_delay(maxDelay);
// Retrieve stream information
// limit how much data avformat_find_stream_info reads internally
oc.probesize(Integer.parseInt(streamCode));
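// note: streamCode here is the configured main_code/sub_code value (e.g. 5120), so probing is capped at that many bytes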
// cap how long avformat_find_stream_info may run; it is forced to return once this duration is exceeded
oc.max_analyze_duration(5 * AV_TIME_BASE);
// keep the packets read by avformat_find_stream_info out of the AVFormatContext packet_buffer
// oc.flags(AVFormatContext.AVFMT_FLAG_NOBUFFER);
AVDictionary optionOut = new AVDictionary(null);
if ((ret = avformat_find_stream_info(oc, (PointerPointer) null)) < 0) {
throw new Exception("avformat_find_stream_info() error " + ret + ": Could not find stream information.");
}
if (av_log_get_level() >= AV_LOG_INFO) {
// Dump information about file onto standard error
av_dump_format(oc, 0, filename, 0);
}
// Find the first video and audio stream, unless the user specified otherwise
video_st = audio_st = null;
AVCodecParameters video_par = null, audio_par = null;
int nb_streams = oc.nb_streams();
for (int i = 0; i < nb_streams; i++) {
AVStream st = oc.streams(i);
// Get a pointer to the codec context for the video or audio stream
AVCodecParameters par = st.codecpar();
if (video_st == null && par.codec_type() == AVMEDIA_TYPE_VIDEO && (videoStream < 0 || videoStream == i)) {
video_st = st;
video_par = par;
videoStream = i;
} else if (audio_st == null && par.codec_type() == AVMEDIA_TYPE_AUDIO
&& (audioStream < 0 || audioStream == i)) {
audio_st = st;
audio_par = par;
audioStream = i;
}
}
if (video_st == null && audio_st == null) {
throw new Exception("Did not find a video or audio stream inside \"" + filename + "\" for videoStream == "
+ videoStream + " and audioStream == " + audioStream + ".");
}
if (video_st != null) {
// Find the decoder for the video stream
AVCodec codec = avcodec_find_decoder_by_name(videoCodecName);
if (codec == null) {
codec = avcodec_find_decoder(video_par.codec_id());
}
if (codec == null) {
throw new Exception("avcodec_find_decoder() error: Unsupported video format or codec not found: "
+ video_par.codec_id() + ".");
}
/* Allocate a codec context for the decoder */
if ((video_c = avcodec_alloc_context3(codec)) == null) {
throw new Exception("avcodec_alloc_context3() error: Could not allocate video decoding context.");
}
/* copy the stream parameters from the muxer */
if ((ret = avcodec_parameters_to_context(video_c, video_st.codecpar())) < 0) {
releaseUnsafe();
throw new Exception(
"avcodec_parameters_to_context() error: Could not copy the video stream parameters.");
}
options = new AVDictionary(null);
for (Entry<String, String> e : videoOptions.entrySet()) {
av_dict_set(options, e.getKey(), e.getValue(), 0);
}
// Enable multithreading when available
video_c.thread_count(0);
// Open video codec
if ((ret = avcodec_open2(video_c, codec, options)) < 0) {
throw new Exception("avcodec_open2() error " + ret + ": Could not open video codec.");
}
av_dict_free(options);
// Hack to correct wrong frame rates that seem to be generated by some codecs
if (video_c.time_base().num() > 1000 && video_c.time_base().den() == 1) {
video_c.time_base().den(1000);
}
// Allocate video frame and an AVFrame structure for the RGB image
if ((picture = av_frame_alloc()) == null) {
throw new Exception("av_frame_alloc() error: Could not allocate raw picture frame.");
}
if ((picture_rgb = av_frame_alloc()) == null) {
throw new Exception("av_frame_alloc() error: Could not allocate RGB picture frame.");
}
initPictureRGB();
}
if (audio_st != null) {
// Find the decoder for the audio stream
AVCodec codec = avcodec_find_decoder_by_name(audioCodecName);
if (codec == null) {
codec = avcodec_find_decoder(audio_par.codec_id());
}
if (codec == null) {
// throw new Exception("avcodec_find_decoder() error: Unsupported audio format or codec not found: "
// + audio_par.codec_id() + ".");
} else {
/* Allocate a codec context for the decoder */
if ((audio_c = avcodec_alloc_context3(codec)) == null) {
throw new Exception("avcodec_alloc_context3() error: Could not allocate audio decoding context.");
}
/* copy the stream parameters from the muxer */
if ((ret = avcodec_parameters_to_context(audio_c, audio_st.codecpar())) < 0) {
releaseUnsafe();
throw new Exception(
"avcodec_parameters_to_context() error: Could not copy the audio stream parameters.");
}
options = new AVDictionary(null);
for (Entry<String, String> e : audioOptions.entrySet()) {
av_dict_set(options, e.getKey(), e.getValue(), 0);
}
// Enable multithreading when available
audio_c.thread_count(0);
// Open audio codec
if ((ret = avcodec_open2(audio_c, codec, options)) < 0) {
throw new Exception("avcodec_open2() error " + ret + ": Could not open audio codec.");
}
av_dict_free(options);
// Allocate audio samples frame
if ((samples_frame = av_frame_alloc()) == null) {
throw new Exception("av_frame_alloc() error: Could not allocate audio frame.");
}
samples_ptr = new BytePointer[] { null };
samples_buf = new Buffer[] { null };
}
}
}
private void initPictureRGB() {
int width = imageWidth > 0 ? imageWidth : video_c.width();
int height = imageHeight > 0 ? imageHeight : video_c.height();
switch (imageMode) {
case COLOR:
case GRAY:
// If the size changes, a new allocation is needed -> free the old one.
if (image_ptr != null) {
// First kill all references, then free it.
image_buf = null;
BytePointer[] temp = image_ptr;
image_ptr = null;
av_free(temp[0]);
}
int fmt = getPixelFormat();
// work around bug in swscale: https://trac.ffmpeg.org/ticket/1031
int align = 32;
int stride = width;
for (int i = 1; i <= align; i += i) {
stride = (width + (i - 1)) & ~(i - 1);
av_image_fill_linesizes(picture_rgb.linesize(), fmt, stride);
if ((picture_rgb.linesize(0) & (align - 1)) == 0) {
break;
}
}
// Determine required buffer size and allocate buffer
int size = av_image_get_buffer_size(fmt, stride, height, 1);
image_ptr = new BytePointer[] { new BytePointer(av_malloc(size)).capacity(size) };
image_buf = new Buffer[] { image_ptr[0].asBuffer() };
// Assign appropriate parts of buffer to image planes in picture_rgb
// Note that picture_rgb is an AVFrame, but AVFrame is a superset of AVPicture
av_image_fill_arrays(new PointerPointer(picture_rgb), picture_rgb.linesize(), image_ptr[0], fmt, stride,
height, 1);
picture_rgb.format(fmt);
picture_rgb.width(width);
picture_rgb.height(height);
break;
case RAW:
image_ptr = new BytePointer[] { null };
image_buf = new Buffer[] { null };
break;
default:
assert false;
}
}
public void stop() throws Exception {
release();
}
public void trigger() throws Exception {
if (oc == null || oc.isNull()) {
throw new Exception("Could not trigger: No AVFormatContext. (Has start() been called?)");
}
if (pkt2.size() > 0) {
pkt2.size(0);
av_packet_unref(pkt);
}
for (int i = 0; i < numBuffers + 1; i++) {
if (av_read_frame(oc, pkt) < 0) {
return;
}
av_packet_unref(pkt);
}
}
private void processImage() throws Exception {
frame.imageWidth = imageWidth > 0 ? imageWidth : video_c.width();
frame.imageHeight = imageHeight > 0 ? imageHeight : video_c.height();
frame.imageDepth = Frame.DEPTH_UBYTE;
switch (imageMode) {
case COLOR:
case GRAY:
// Deinterlace Picture
if (deinterlace) {
throw new Exception("Cannot deinterlace: Functionality moved to FFmpegFrameFilter.");
}
// Has the size changed?
if (frame.imageWidth != picture_rgb.width() || frame.imageHeight != picture_rgb.height()) {
initPictureRGB();
}
// Convert the image into BGR or GRAY format that OpenCV uses
img_convert_ctx = sws_getCachedContext(img_convert_ctx, video_c.width(), video_c.height(),
video_c.pix_fmt(), frame.imageWidth, frame.imageHeight, getPixelFormat(),
imageScalingFlags != 0 ? imageScalingFlags : SWS_BILINEAR, null, null, (DoublePointer) null);
if (img_convert_ctx == null) {
throw new Exception("sws_getCachedContext() error: Cannot initialize the conversion context.");
}
// Convert the image from its native format to RGB or GRAY
sws_scale(img_convert_ctx, new PointerPointer(picture), picture.linesize(), 0, video_c.height(),
new PointerPointer(picture_rgb), picture_rgb.linesize());
frame.imageStride = picture_rgb.linesize(0);
frame.image = image_buf;
frame.opaque = picture_rgb;
break;
case RAW:
frame.imageStride = picture.linesize(0);
BytePointer ptr = picture.data(0);
if (ptr != null && !ptr.equals(image_ptr[0])) {
image_ptr[0] = ptr.capacity(frame.imageHeight * frame.imageStride);
image_buf[0] = ptr.asBuffer();
}
frame.image = image_buf;
frame.opaque = picture;
break;
default:
assert false;
}
frame.image[0].limit(frame.imageHeight * frame.imageStride);
frame.imageChannels = frame.imageStride / frame.imageWidth;
}
private void processSamples() throws Exception {
int ret;
int sample_format = samples_frame.format();
int planes = av_sample_fmt_is_planar(sample_format) != 0 ? (int) samples_frame.channels() : 1;
int data_size = av_samples_get_buffer_size((IntPointer) null, audio_c.channels(), samples_frame.nb_samples(),
audio_c.sample_fmt(), 1) / planes;
if (samples_buf == null || samples_buf.length != planes) {
samples_ptr = new BytePointer[planes];
samples_buf = new Buffer[planes];
}
frame.sampleRate = audio_c.sample_rate();
frame.audioChannels = audio_c.channels();
frame.samples = samples_buf;
frame.opaque = samples_frame;
int sample_size = data_size / av_get_bytes_per_sample(sample_format);
for (int i = 0; i < planes; i++) {
BytePointer p = samples_frame.data(i);
if (!p.equals(samples_ptr[i]) || samples_ptr[i].capacity() < data_size) {
samples_ptr[i] = p.capacity(data_size);
ByteBuffer b = p.asBuffer();
switch (sample_format) {
case AV_SAMPLE_FMT_U8:
case AV_SAMPLE_FMT_U8P:
samples_buf[i] = b;
break;
case AV_SAMPLE_FMT_S16:
case AV_SAMPLE_FMT_S16P:
samples_buf[i] = b.asShortBuffer();
break;
case AV_SAMPLE_FMT_S32:
case AV_SAMPLE_FMT_S32P:
samples_buf[i] = b.asIntBuffer();
break;
case AV_SAMPLE_FMT_FLT:
case AV_SAMPLE_FMT_FLTP:
samples_buf[i] = b.asFloatBuffer();
break;
case AV_SAMPLE_FMT_DBL:
case AV_SAMPLE_FMT_DBLP:
samples_buf[i] = b.asDoubleBuffer();
break;
default:
assert false;
}
}
samples_buf[i].position(0).limit(sample_size);
}
if (audio_c.channels() != getAudioChannels() || audio_c.sample_fmt() != getSampleFormat()
|| audio_c.sample_rate() != getSampleRate()) {
if (samples_convert_ctx == null || samples_channels != getAudioChannels()
|| samples_format != getSampleFormat() || samples_rate != getSampleRate()) {
samples_convert_ctx = swr_alloc_set_opts(samples_convert_ctx,
av_get_default_channel_layout(getAudioChannels()), getSampleFormat(), getSampleRate(),
av_get_default_channel_layout(audio_c.channels()), audio_c.sample_fmt(), audio_c.sample_rate(),
0, null);
if (samples_convert_ctx == null) {
throw new Exception("swr_alloc_set_opts() error: Cannot allocate the conversion context.");
} else if ((ret = swr_init(samples_convert_ctx)) < 0) {
throw new Exception("swr_init() error " + ret + ": Cannot initialize the conversion context.");
}
samples_channels = getAudioChannels();
samples_format = getSampleFormat();
samples_rate = getSampleRate();
}
int sample_size_in = samples_frame.nb_samples();
int planes_out = av_sample_fmt_is_planar(samples_format) != 0 ? (int) samples_frame.channels() : 1;
int sample_size_out = swr_get_out_samples(samples_convert_ctx, sample_size_in);
int sample_bytes_out = av_get_bytes_per_sample(samples_format);
int buffer_size_out = sample_size_out * sample_bytes_out * (planes_out > 1 ? 1 : samples_channels);
if (samples_buf_out == null || samples_buf.length != planes_out
|| samples_ptr_out[0].capacity() < buffer_size_out) {
for (int i = 0; samples_ptr_out != null && i < samples_ptr_out.length; i++) {
av_free(samples_ptr_out[i].position(0));
}
samples_ptr_out = new BytePointer[planes_out];
samples_buf_out = new Buffer[planes_out];
for (int i = 0; i < planes_out; i++) {
samples_ptr_out[i] = new BytePointer(av_malloc(buffer_size_out)).capacity(buffer_size_out);
ByteBuffer b = samples_ptr_out[i].asBuffer();
switch (samples_format) {
case AV_SAMPLE_FMT_U8:
case AV_SAMPLE_FMT_U8P:
samples_buf_out[i] = b;
break;
case AV_SAMPLE_FMT_S16:
case AV_SAMPLE_FMT_S16P:
samples_buf_out[i] = b.asShortBuffer();
break;
case AV_SAMPLE_FMT_S32:
case AV_SAMPLE_FMT_S32P:
samples_buf_out[i] = b.asIntBuffer();
break;
case AV_SAMPLE_FMT_FLT:
case AV_SAMPLE_FMT_FLTP:
samples_buf_out[i] = b.asFloatBuffer();
break;
case AV_SAMPLE_FMT_DBL:
case AV_SAMPLE_FMT_DBLP:
samples_buf_out[i] = b.asDoubleBuffer();
break;
default:
assert false;
}
}
}
frame.sampleRate = samples_rate;
frame.audioChannels = samples_channels;
frame.samples = samples_buf_out;
if ((ret = swr_convert(samples_convert_ctx, new PointerPointer(samples_ptr_out), sample_size_out,
new PointerPointer(samples_ptr), sample_size_in)) < 0) {
throw new Exception("swr_convert() error " + ret + ": Cannot convert audio samples.");
}
for (int i = 0; i < planes_out; i++) {
samples_ptr_out[i].position(0).limit(ret * (planes_out > 1 ? 1 : samples_channels));
samples_buf_out[i].position(0).limit(ret * (planes_out > 1 ? 1 : samples_channels));
}
}
}
public Frame grab() throws Exception {
return grabFrame(true, true, true, false);
}
public Frame grabImage() throws Exception {
return grabFrame(false, true, true, false);
}
public Frame grabSamples() throws Exception {
return grabFrame(true, false, true, false);
}
public Frame grabKeyFrame() throws Exception {
return grabFrame(false, true, true, true);
}
public Frame grabFrame(boolean doAudio, boolean doVideo, boolean doProcessing, boolean keyFrames) throws Exception {
if (oc == null || oc.isNull()) {
throw new Exception("Could not grab: No AVFormatContext. (Has start() been called?)");
} else if ((!doVideo || video_st == null) && (!doAudio || audio_st == null)) {
return null;
}
boolean videoFrameGrabbed = frameGrabbed && frame.image != null;
boolean audioFrameGrabbed = frameGrabbed && frame.samples != null;
frameGrabbed = false;
frame.keyFrame = false;
frame.imageWidth = 0;
frame.imageHeight = 0;
frame.imageDepth = 0;
frame.imageChannels = 0;
frame.imageStride = 0;
frame.image = null;
frame.sampleRate = 0;
frame.audioChannels = 0;
frame.samples = null;
frame.opaque = null;
if (doVideo && videoFrameGrabbed) {
if (doProcessing) {
processImage();
}
frame.keyFrame = picture.key_frame() != 0;
return frame;
} else if (doAudio && audioFrameGrabbed) {
if (doProcessing) {
processSamples();
}
frame.keyFrame = samples_frame.key_frame() != 0;
return frame;
}
boolean done = false;
while (!done) {
if (pkt2.size() <= 0) {
if (av_read_frame(oc, pkt) < 0) {
if (doVideo && video_st != null) {
// The video codec may have buffered some frames
pkt.stream_index(video_st.index());
pkt.flags(AV_PKT_FLAG_KEY);
pkt.data(null);
pkt.size(0);
} else {
return null;
}
}
}
// Is this a packet from the video stream?
if (doVideo && video_st != null && pkt.stream_index() == video_st.index()
&& (!keyFrames || pkt.flags() == AV_PKT_FLAG_KEY)) {
// Decode video frame
int len = avcodec_decode_video2(video_c, picture, got_frame, pkt);
// Did we get a video frame?
if (len >= 0 && got_frame[0] != 0 && (!keyFrames || picture.pict_type() == AV_PICTURE_TYPE_I)) {
long pts = av_frame_get_best_effort_timestamp(picture);
AVRational time_base = video_st.time_base();
timestamp = 1000000L * pts * time_base.num() / time_base.den();
// best guess, AVCodecContext.frame_number = number of decoded frames...
frameNumber = (int) Math.round(timestamp * getFrameRate() / 1000000L);
frame.image = image_buf;
if (doProcessing) {
processImage();
}
done = true;
frame.timestamp = timestamp;
frame.keyFrame = picture.key_frame() != 0;
} else if (pkt.data() == null && pkt.size() == 0) {
return null;
}
} else if (doAudio && audio_st != null && pkt.stream_index() == audio_st.index()) {
if (pkt2.size() <= 0) {
// HashMap is unacceptably slow on Android
// pkt2.put(pkt);
BytePointer.memcpy(pkt2, pkt, sizeof_pkt);
}
av_frame_unref(samples_frame);
// Decode audio frame
int len = avcodec_decode_audio4(audio_c, samples_frame, got_frame, pkt2);
if (len <= 0) {
// On error, trash the whole packet
pkt2.size(0);
} else {
pkt2.data(pkt2.data().position(len));
pkt2.size(pkt2.size() - len);
if (got_frame[0] != 0) {
long pts = av_frame_get_best_effort_timestamp(samples_frame);
AVRational time_base = audio_st.time_base();
timestamp = 1000000L * pts * time_base.num() / time_base.den();
frame.samples = samples_buf;
/* if a frame has been decoded, output it */
if (doProcessing) {
processSamples();
}
done = true;
frame.timestamp = timestamp;
frame.keyFrame = samples_frame.key_frame() != 0;
}
}
}
if (pkt2.size() <= 0) {
// Free the packet that was allocated by av_read_frame
av_packet_unref(pkt);
}
}
return frame;
}
public AVPacket grabPacket() throws Exception {
if (oc == null || oc.isNull()) {
throw new Exception("Could not trigger: No AVFormatContext. (Has start() been called?)");
}
// Return the next frame of a stream.
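// The caller owns the returned packet and must release it with av_packet_unref() when done.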
if (av_read_frame(oc, pkt) < 0) {
return null;
}
return pkt;
}
@Override
public void start() throws Exception {
}
}
... ...
server:
  port: ${CAMERASERVER_SERVER_PORT:8083}
  servlet:
    context-path: /camera
config:
  #live-stream keepalive time (minutes)
  keepalive: ${CAMERASERVER_KEEPALIVE:1}
  #nginx push host
  push_host: ${CAMERASERVER_PUSH_HOST:127.0.0.1}
  #extra push host
  host_extra: ${CAMERASERVER_HOST_EXTRA:127.0.0.1}
  #nginx push port
  push_port: ${CAMERASERVER_PUSH_PORT:1935}
  #main stream max bitrate
  main_code: ${CAMERASERVER_MAIN_CODE:5120}
  #sub stream max bitrate
  sub_code: ${CAMERASERVER_SUB_CODE:1024}
  #build version info
  version: '@COMMIT_REV@.@BUILD_DATE@'
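  #(@COMMIT_REV@ and @BUILD_DATE@ are build-time placeholders, presumably substituted by the build script)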
#logback
logging:
  level:
    com.junction: debug
  #write logs to a file
  config: classpath:camera-log.xml
... ...
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<property name="logDir" value="./logs" />
<!-- 控制台 appender -->
<appender name="STDOUT"
class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>[%d{yyyy-MM-dd HH:mm:ss.SSS}] [%thread] [%-5level] [%logger{50}] : %msg%n</pattern>
</encoder>
</appender>
<!-- 按照每天生成日志文件 -->
<appender name="info-file"
class="ch.qos.logback.core.rolling.RollingFileAppender">
<rollingPolicy
class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<!--日志文件输出的文件名 -->
<FileNamePattern>${logDir}/camera-info-%d{yyyy-MM-dd}.log</FileNamePattern>
<!--日志文件保留天数 -->
<MaxHistory>30</MaxHistory>
</rollingPolicy>
<encoder
class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
<!--格式化输出:%d表示日期,%thread表示线程名,%-5level:级别从左显示5个字符宽度%msg:日志消息,%n是换行符 -->
<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{50} - %msg%n</pattern>
</encoder>
<filter class="ch.qos.logback.classic.filter.LevelFilter"><!-- 只打印WARN日志 -->
<level>INFO</level>
<onMatch>ACCEPT</onMatch>
<onMismatch>DENY</onMismatch>
</filter>
<!--日志文件最大的大小 -->
<triggeringPolicy
class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
<MaxFileSize>10MB</MaxFileSize>
</triggeringPolicy>
</appender>
<appender name="debug-file"
class="ch.qos.logback.core.rolling.RollingFileAppender">
<rollingPolicy
class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<!--日志文件输出的文件名 -->
<FileNamePattern>${logDir}/camera-debug-%d{yyyy-MM-dd}.log</FileNamePattern>
<!--日志文件保留天数 -->
<MaxHistory>30</MaxHistory>
</rollingPolicy>
<encoder
class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
<!--格式化输出:%d表示日期,%thread表示线程名,%-5level:级别从左显示5个字符宽度%msg:日志消息,%n是换行符 -->
<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{50} - %msg%n</pattern>
</encoder>
<filter class="ch.qos.logback.classic.filter.LevelFilter"><!-- 只打印WARN日志 -->
<level>DEBUG</level>
<onMatch>ACCEPT</onMatch>
<onMismatch>DENY</onMismatch>
</filter>
<!--日志文件最大的大小 -->
<triggeringPolicy
class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
<MaxFileSize>10MB</MaxFileSize>
</triggeringPolicy>
</appender>
<appender name="error-file"
class="ch.qos.logback.core.rolling.RollingFileAppender">
<rollingPolicy
class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<!--日志文件输出的文件名 -->
<FileNamePattern>${logDir}/camera-error-%d{yyyy-MM-dd}.log</FileNamePattern>
<!--日志文件保留天数 -->
<MaxHistory>30</MaxHistory>
</rollingPolicy>
<encoder
class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
<!--格式化输出:%d表示日期,%thread表示线程名,%-5level:级别从左显示5个字符宽度%msg:日志消息,%n是换行符 -->
<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{50} - %msg%n</pattern>
</encoder>
<filter class="ch.qos.logback.classic.filter.LevelFilter"><!-- 只打印WARN日志 -->
<level>ERROR</level>
<onMatch>ACCEPT</onMatch>
<onMismatch>DENY</onMismatch>
</filter>
<!--日志文件最大的大小 -->
<triggeringPolicy
class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
<MaxFileSize>10MB</MaxFileSize>
</triggeringPolicy>
</appender>
<logger name="com.junction" level="debug" />
<root level="info">
<appender-ref ref="STDOUT" />
<appender-ref ref="info-file" />
<appender-ref ref="debug-file" />
<appender-ref ref="error-file" />
</root>
</configuration>
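With this configuration, any SLF4J logger under com.junction logs at DEBUG and above, and the level filters fan the messages out so each rolling file receives exactly one level (everything also reaches the console through the root logger). A small usage sketch, with a hypothetical class name:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Hypothetical service under com.junction, so the "com.junction" debug level applies
public class StreamKeepaliveTask {
    private static final Logger log = LoggerFactory.getLogger(StreamKeepaliveTask.class);

    public void check(String streamId) {
        log.debug("checking keepalive for {}", streamId);        // camera-debug-*.log
        log.info("stream {} is still being watched", streamId);  // camera-info-*.log
        log.error("stream {} expired, stopping push", streamId); // camera-error-*.log
    }
}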
... ...
<!DOCTYPE html>
<html lang="en">
<head>
<title>Video.js | HTML5 Video Player</title>
<link href="http://vjs.zencdn.net/5.20.1/video-js.css" rel="stylesheet">
<script src="http://vjs.zencdn.net/5.20.1/videojs-ie8.min.js"></script>
</head>
<body width="640px" height="360px">
<div style="margin-top: 100px; margin-left: 70px">
<h1>Live Preview</h1>
<video id="example_video_1" class="video-js vjs-default-skin" controls
preload="auto" width="352px" height="198px" data-setup="{}"
style="float: left">
<source src="rtmp://mower.kalman-navigation.com:1935/live/1a25e0e7-ca49-4d15-af57-8a858fc8a88a_1" type="rtmp/flv">
<p class="vjs-no-js">
To view this video please enable JavaScript, and consider upgrading
to a web browser that <a
href="http://videojs.com/html5-video-support/" target="_blank">supports
HTML5 video</a>
</p>
</video>
<!-- <video id="example_video_1" class="video-js vjs-default-skin" controls-->
<!-- preload="auto" width="352px" height="198px" data-setup="{}"-->
<!-- style="float: left">-->
<!-- <source src="rtmp://127.0.0.1:1935/live/stream33" type="rtmp/flv">-->
<!-- <p class="vjs-no-js">-->
<!-- To view this video please enable JavaScript, and consider upgrading-->
<!-- to a web browser that <a-->
<!-- href="http://videojs.com/html5-video-support/" target="_blank">supports-->
<!-- HTML5 video</a>-->
<!-- </p>-->
<!-- </video>-->
<!-- <video id="example_video_2" class="video-js vjs-default-skin" controls-->
<!-- preload="auto" width="352px" height="198px" data-setup="{}"-->
<!-- style="float: left">-->
<!-- <source src="rtmp://127.0.0.1:1935/live/stream34" type="rtmp/flv">-->
<!-- <p class="vjs-no-js">-->
<!-- To view this video please enable JavaScript, and consider upgrading-->
<!-- to a web browser that <a-->
<!-- href="http://videojs.com/html5-video-support/" target="_blank">supports-->
<!-- HTML5 video</a>-->
<!-- </p>-->
<!-- </video>-->
<!-- <video id="example_video_3" class="video-js vjs-default-skin" controls-->
<!-- preload="auto" width="352px" height="198px" data-setup="{}"-->
<!-- style="float: left">-->
<!-- <source src="rtmp://127.0.0.1:1935/live/stream35" type="rtmp/flv">-->
<!-- <p class="vjs-no-js">-->
<!-- To view this video please enable JavaScript, and consider upgrading-->
<!-- to a web browser that <a-->
<!-- href="http://videojs.com/html5-video-support/" target="_blank">supports-->
<!-- HTML5 video</a>-->
<!-- </p>-->
<!-- </video>-->
<!-- <video id="example_video_4" class="video-js vjs-default-skin" controls-->
<!-- preload="auto" width="352px" height="198px" data-setup="{}"-->
<!-- style="float: left">-->
<!-- <source src="rtmp://127.0.0.1:1935/live/stream36" type="rtmp/flv">-->
<!-- <p class="vjs-no-js">-->
<!-- To view this video please enable JavaScript, and consider upgrading-->
<!-- to a web browser that <a-->
<!-- href="http://videojs.com/html5-video-support/" target="_blank">supports-->
<!-- HTML5 video</a>-->
<!-- </p>-->
<!-- </video>-->
<!-- <video id="example_video_5" class="video-js vjs-default-skin" controls-->
<!-- preload="auto" width="352px" height="198px" data-setup="{}"-->
<!-- style="float: left">-->
<!-- <source src="rtmp://127.0.0.1:1935/live/stream37" type="rtmp/flv">-->
<!-- <p class="vjs-no-js">-->
<!-- To view this video please enable JavaScript, and consider upgrading-->
<!-- to a web browser that <a-->
<!-- href="http://videojs.com/html5-video-support/" target="_blank">supports-->
<!-- HTML5 video</a>-->
<!-- </p>-->
<!-- </video>-->
<!-- </div>-->
<!-- <div style="margin-left: 70px">-->
<!-- <h1 style="clear: left; padding-top: 100px">��ʷ�ط�</h1>-->
<!-- <video id="example_video_6" class="video-js vjs-default-skin" controls-->
<!-- preload="auto" width="352px" height="198px" data-setup="{}"-->
<!-- style="float: left">-->
<!-- <source src="rtmp://127.0.0.1:1935/history/stream33" type="rtmp/flv">-->
<!-- <p class="vjs-no-js">-->
<!-- To view this video please enable JavaScript, and consider upgrading-->
<!-- to a web browser that <a-->
<!-- href="http://videojs.com/html5-video-support/" target="_blank">supports-->
<!-- HTML5 video</a>-->
<!-- </p>-->
<!-- </video>-->
<!-- <video id="example_video_7" class="video-js vjs-default-skin" controls-->
<!-- preload="auto" width="352px" height="198px" data-setup="{}"-->
<!-- style="float: left">-->
<!-- <source src="rtmp://127.0.0.1:1935/history/stream34" type="rtmp/flv">-->
<!-- <p class="vjs-no-js">-->
<!-- To view this video please enable JavaScript, and consider upgrading-->
<!-- to a web browser that <a-->
<!-- href="http://videojs.com/html5-video-support/" target="_blank">supports-->
<!-- HTML5 video</a>-->
<!-- </p>-->
<!-- </video>-->
<!-- <video id="example_video_8" class="video-js vjs-default-skin" controls-->
<!-- preload="auto" width="352px" height="198px" data-setup="{}"-->
<!-- style="float: left">-->
<!-- <source src="rtmp://127.0.0.1:1935/history/stream35" type="rtmp/flv">-->
<!-- <p class="vjs-no-js">-->
<!-- To view this video please enable JavaScript, and consider upgrading-->
<!-- to a web browser that <a-->
<!-- href="http://videojs.com/html5-video-support/" target="_blank">supports-->
<!-- HTML5 video</a>-->
<!-- </p>-->
<!-- </video>-->
<!-- <video id="example_video_9" class="video-js vjs-default-skin"-->
<!-- controls="false" preload="auto" width="352px" height="198px"-->
<!-- data-setup="{}" style="float: left">-->
<!-- <source src="rtmp://127.0.0.1:1935/history/stream36" type="rtmp/flv">-->
<!-- <p class="vjs-no-js">-->
<!-- To view this video please enable JavaScript, and consider upgrading-->
<!-- to a web browser that <a-->
<!-- href="http://videojs.com/html5-video-support/" target="_blank">supports-->
<!-- HTML5 video</a>-->
<!-- </p>-->
<!-- </video>-->
<!-- <video id="example_video10" class="video-js vjs-default-skin" controls-->
<!-- preload="auto" width="352px" height="198px" data-setup="{}"-->
<!-- style="float: left">-->
<!-- <source src="rtmp://127.0.0.1:1935/history/stream36" type="rtmp/flv">-->
<!-- <p class="vjs-no-js">-->
<!-- To view this video please enable JavaScript, and consider upgrading-->
<!-- to a web browser that <a-->
<!-- href="http://videojs.com/html5-video-support/" target="_blank">supports-->
<!-- HTML5 video</a>-->
<!-- </p>-->
<!-- </video>-->
</div>
<script src="http://vjs.zencdn.net/5.20.1/video.js"></script>
</body>
</html>
... ...
package com.junction;
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
@SpringBootTest
class CameraServerApplicationTests {
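/** Smoke test: fails if the Spring application context cannot start. */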
@Test
void contextLoads() {
}
}
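The empty contextLoads() test only verifies that the application context starts. If the CameraConfigProperties sketch from earlier were part of the project, binding of the yaml defaults could be checked in the same style (hypothetical, since it depends on that class existing):

package com.junction;

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;

import com.junction.config.CameraConfigProperties;

@SpringBootTest
class CameraConfigPropertiesTests {

    @Autowired
    private CameraConfigProperties config; // hypothetical binding class from the sketch above

    @Test
    void bindsYamlDefaults() {
        // Defaults come from application.yml when the CAMERASERVER_* env vars are unset
        assertEquals(1, config.getKeepalive());
        assertEquals("127.0.0.1", config.getPushHost());
        assertEquals(1935, config.getPushPort());
    }
}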
... ...