Author: 钟来

Initial commit

                    GNU GENERAL PUBLIC LICENSE
                       Version 2, June 1991

 Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

 The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users. This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it. (Some other Free Software Foundation software is covered by
the GNU Lesser General Public License instead.) You can apply it to
your programs, too.

 When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.

 To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.

 For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have. You must make sure that they, too, receive or can get the
source code. And you must show them these terms so they know their
rights.

 We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.

 Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.

 Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary. To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.

 The precise terms and conditions for copying, distribution and
modification follow.

                    GNU GENERAL PUBLIC LICENSE
   TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION

 0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License. The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language. (Hereinafter, translation is included without limitation in
the term "modification".) Each licensee is addressed as "you".

Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.

 1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.

You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.

 2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:

 a) You must cause the modified files to carry prominent notices
 stating that you changed the files and the date of any change.

 b) You must cause any work that you distribute or publish, that in
 whole or in part contains or is derived from the Program or any
 part thereof, to be licensed as a whole at no charge to all third
 parties under the terms of this License.

 c) If the modified program normally reads commands interactively
 when run, you must cause it, when started running for such
 interactive use in the most ordinary way, to print or display an
 announcement including an appropriate copyright notice and a
 notice that there is no warranty (or else, saying that you provide
 a warranty) and that users may redistribute the program under
 these conditions, and telling the user how to view a copy of this
 License. (Exception: if the Program itself is interactive but
 does not normally print such an announcement, your work based on
 the Program is not required to print an announcement.)

These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.

Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.

In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.

 3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:

 a) Accompany it with the complete corresponding machine-readable
 source code, which must be distributed under the terms of Sections
 1 and 2 above on a medium customarily used for software interchange; or,

 b) Accompany it with a written offer, valid for at least three
 years, to give any third party, for a charge no more than your
 cost of physically performing source distribution, a complete
 machine-readable copy of the corresponding source code, to be
 distributed under the terms of Sections 1 and 2 above on a medium
 customarily used for software interchange; or,

 c) Accompany it with the information you received as to the offer
 to distribute corresponding source code. (This alternative is
 allowed only for noncommercial distribution and only if you
 received the program in object code or executable form with such
 an offer, in accord with Subsection b above.)

The source code for a work means the preferred form of the work for
making modifications to it. For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable. However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.

If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.

 4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License. Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.

 5. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.

 6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.

 7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all. For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.

If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.

It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.

This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.

 8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded. In such case, this License incorporates
the limitation as if written in the body of this License.

 9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

Each version is given a distinguishing version number. If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation. If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.

 10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission. For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this. Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.

                            NO WARRANTY

 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.

 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.

                     END OF TERMS AND CONDITIONS

            How to Apply These Terms to Your New Programs

 If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

 To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

 <one line to give the program's name and a brief idea of what it does.>
 Copyright (C) <year> <name of author>

 This program is free software; you can redistribute it and/or modify
 it under the terms of the GNU General Public License as published by
 the Free Software Foundation; either version 2 of the License, or
 (at your option) any later version.

 This program is distributed in the hope that it will be useful,
 but WITHOUT ANY WARRANTY; without even the implied warranty of
 MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 GNU General Public License for more details.

 You should have received a copy of the GNU General Public License along
 with this program; if not, write to the Free Software Foundation, Inc.,
 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

Also add information on how to contact you by electronic and paper mail.

If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:

 Gnomovision version 69, Copyright (C) year name of author
 Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
 This is free software, and you are welcome to redistribute it
 under certain conditions; type `show c' for details.

The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, the commands you use may
be called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.

You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary. Here is a sample; alter the names:

 Yoyodyne, Inc., hereby disclaims all copyright interest in the program
 `Gnomovision' (which makes passes at compilers) written by James Hacker.

 <signature of Ty Coon>, 1 April 1989
 Ty Coon, President of Vice

This General Public License does not permit incorporating your program into
proprietary programs. If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License.
## RTSPtoHTTP-FLV: a streaming media service, built with JavaCV, that converts rtsp streams to http-flv (rtmp is no longer recommended) and pushes them

**Please star this project!**

#### For questions and help, please open an issue first, so that others who run into the same problem can find the solution easily; try to avoid asking directly over WeChat/QQ. For business cooperation, email banmajio@163.com or get in touch via WeChat/QQ.

### Major browsers no longer support Flash, so http-flv is recommended as a replacement for rtmp.
>[Reference](https://blog.csdn.net/weixin_40777510/article/details/106693408)
>Simply change the place in this project's controller where the rtmp address is generated so that it produces an http-flv address instead. Streaming servers may differ in their http-flv address rules, so build the http-flv address according to the server you choose.

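The switch from rtmp to http-flv only touches address generation. A minimal sketch of the idea, assuming an nginx-http-flv-module style server where the playback path is `<app>/<streamKey>.flv` (the host, ports, and the `live` application name below are illustrative assumptions, not values from this project):

```java
// Illustrative sketch only: host, ports, and the "live" application name are
// assumptions; adapt the http-flv path rule to your streaming server.
public class StreamUrlBuilder {

    // rtmp push address, e.g. rtmp://127.0.0.1:1935/live/token123
    public static String rtmpUrl(String host, int rtmpPort, String app, String streamKey) {
        return "rtmp://" + host + ":" + rtmpPort + "/" + app + "/" + streamKey;
    }

    // http-flv playback address, e.g. http://127.0.0.1:80/live/token123.flv
    public static String httpFlvUrl(String host, int httpPort, String app, String streamKey) {
        return "http://" + host + ":" + httpPort + "/" + app + "/" + streamKey + ".flv";
    }

    public static void main(String[] args) {
        System.out.println(rtmpUrl("127.0.0.1", 1935, "live", "token123"));
        System.out.println(httpFlvUrl("127.0.0.1", 80, "live", "token123"));
    }
}
```

The push side keeps publishing rtmp to the streaming server; only the playback address handed back to the frontend changes.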
>**Personal blog: [banmajio's blog](https://www.banmajio.com/)**
>**CSDN blog: [banmajio's csdn](https://blog.csdn.net/weixin_40777510)**
>**Gitee repository: [RTSPtoRTMP](https://gitee.com/banmajio/RTSPtoRTMP)**

### Converts the rtsp stream of any h264-encoded surveillance device to rtmp (only the rtsp URL concatenation format in the controller needs to be changed)

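For example, a Hikvision-style live rtsp URL can be concatenated as sketched below. The `/h264/ch<channel>/<stream>/av_stream` path is the common Hikvision convention and is only an assumption here; other vendors use different path rules, which is exactly the part of the controller you would change:

```java
// Illustrative sketch: rtsp URL concatenation for an h264 device.
// The path convention below is Hikvision-style; change it per vendor.
public class RtspUrlBuilder {

    public static String liveUrl(String username, String password, String ip,
                                 String channel, String stream) {
        // stream is typically "main" or "sub"
        return "rtsp://" + username + ":" + password + "@" + ip
                + ":554/h264/ch" + channel + "/" + stream + "/av_stream";
    }

    public static void main(String[] args) {
        System.out.println(liveUrl("admin", "12345", "192.168.1.64", "1", "main"));
    }
}
```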
**How to call the API: [API documentation](https://github.com/banmajio/RTSPtoRTMP/wiki/%E6%8E%A5%E5%8F%A3%E6%96%87%E6%A1%A3)**

#### Note:
Some of the handling in this project was added to meet the needs of an in-house project; to adapt or extend it, simply build on top of, or strip away, the existing code. The core logic lives in the CameraPush.java class.

#### Download link for the nginx server this project is meant to be used with:
[http://cdn.banmajio.com/nginx.rar](http://cdn.banmajio.com/nginx.rar)
After downloading, unpack the archive and run nginx.exe to start nginx (the console window flashing and closing is normal; check Task Manager for an nginx process to confirm it started). The nginx configuration file is conf/nginx.conf; modify it as needed. The rtmp addresses used by this project are derived from this configuration file.

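The rtmp-related part of nginx.conf looks roughly like the sketch below (the port number and the `live` application name are illustrative assumptions; http-flv playback additionally requires an nginx build with an http-flv capable module such as nginx-http-flv-module):

```nginx
# Illustrative sketch of the rtmp section of nginx.conf.
rtmp {
    server {
        listen 1935;              # port used in the project's rtmp addresses
        chunk_size 4096;
        application live {        # application name, part of the rtmp URL path
            live on;
            record off;
        }
    }
}
```

Whatever `listen` port and application name you configure here must match the rtmp address the project generates.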
### Known issues:
1. Some devices or NVRs report insufficient bandwidth during historical playback; the exact cause is unknown. If an rtsp address with timestamp parameters fails or refuses to play during historical playback, consider secondary development against the vendor's SDK: capture the stream data yourself and push it as rtmp.
>**Background on this problem:** [rtsp historical playback with starttime and endtime fails with 453 Not Enough Bandwidth](https://blog.csdn.net/weixin_40777510/article/details/106802234)

2. For the historical playback problem above, the project now integrates the Hikvision SDK: the stream data delivered by the SDK callback is processed and pushed to rtmp.
>**Implementation notes:** [Pushing stream data captured by the Hikvision SDK as rtmp with JavaCV (PS stream remuxed to RTMP)](https://blog.csdn.net/weixin_40777510/article/details/105840823)

>**For how to set up the project, see my blog post: [Remuxing rtsp to rtmp with FFmpeg (no transcoding, low resource consumption)](https://www.banmajio.com/post/638986b0.html#more)**

>**Problems encountered during development, together with their solutions, are published on my CSDN blog: [banmajio csdn](https://blog.csdn.net/weixin_40777510)**

**Thanks to [nn200433](https://github.com/nn200433) for supporting this project; see the commits on the rp branch for the detailed changes.**

### Donations are welcome
<img src="https://images.gitee.com/uploads/images/2020/0421/174552_a862b4ed_5186477.jpeg" width="200px" />

<img src="https://images.gitee.com/uploads/images/2020/0421/174726_cb99c1d6_5186477.jpeg" width="200px" />

 <h1 class="curproject-name"> camera-rtmp </h1>



# Common APIs

## Get service info
<a id=获取服务信息> </a>
### Basic information

**Path:** /status

**Method:** GET

**Description:**
<p>Returns the service uptime together with its configuration: the keepalive duration, push IP, push port, and the maximum bitrates of the device main and sub streams.</p>


### Request parameters

### Response

| Name | Type | Required | Default | Notes |
| ------------ | ------------ | ------------ | ------------ | ------------ |
| uptime | string | yes | | service uptime |
| config | object | yes | | configuration |
| ├─ keepalive | string | yes | | keepalive duration (minutes) |
| ├─ push_host | string | yes | | push IP |
| ├─ host_extra | string | yes | | |
| ├─ push_port | string | yes | | push port |
| ├─ main_code | string | yes | | maximum bitrate of the device main stream |
| ├─ sub_code | string | yes | | maximum bitrate of the device sub stream |

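An illustrative response matching the table above (all values are made-up examples):

```json
{
  "uptime": "1 day 2 hours 30 minutes",
  "config": {
    "keepalive": "5",
    "push_host": "127.0.0.1",
    "host_extra": "",
    "push_port": "1935",
    "main_code": "2048",
    "sub_code": "1024"
  }
}
```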
## Get active streams
<a id=获取视频流> </a>
### Basic information

**Path:** /cameras

**Method:** GET

**Description:**
<p>Returns the devices that are currently being pushed.</p>


### Request parameters

### Response

| Name | Type | Required | Default | Notes |
| ------------ | ------------ | ------------ | ------------ | ------------ |
|  | object [] | no | | item type: object |
| ├─ ip | string | yes | | device IP |
| ├─ username | string | yes | | device username |
| ├─ password | string | yes | | device password |
| ├─ channel | string | yes | | channel number |
| ├─ stream | string | yes | | stream type (not returned for historical playback streams) |
| ├─ rtsp | string | yes | | rtsp source URL |
| ├─ rtmp | string | yes | | push URL |
| ├─ url | string | yes | | playback URL |
| ├─ startTime | string | no | | start time (live streams have no start time) |
| ├─ endTime | string | no | | end time (live streams have no end time) |
| ├─ openTime | string | yes | | time the stream was opened |
| ├─ count | string | yes | | number of users watching |
| ├─ token | string | yes | | token |

  58 +## 开启视频流
  59 +<a id=开启视频流> </a>
  60 +### 基本信息
  61 +
  62 +**Path:** /cameras
  63 +
  64 +**Method:** POST
  65 +
  66 +**接口描述:**
  67 +<p>通过传入参数将rtsp流转为rtmp流进行推送。(历史流推送时,如果该设备正在推流则返回“当前视频正在使用中...”)</p>
  68 +
  69 +
  70 +### 请求参数
  71 +**Headers**
  72 +
  73 +| 参数名称 | 参数值 | 是否必须 | 示例 | 备注 |
  74 +| ------------ | ------------ | ------------ | ------------ | ------------ |
  75 +| Content-Type | application/json | 是 | | |
  76 +**Body**
  77 +
  78 +<table>
  79 + <thead class="ant-table-thead">
  80 + <tr>
  81 + <th key=name>名称</th><th key=type>类型</th><th key=required>是否必须</th><th key=default>默认值</th><th key=desc>备注</th><th key=sub>其他信息</th>
  82 + </tr>
  83 + </thead><tbody className="ant-table-tbody"><tr key=0-0><td key=0><span style="padding-left: 0px"><span style="color: #8c8a8a"></span> ip</span></td><td key=1><span>string</span></td><td key=2>必须</td><td key=3></td><td key=4><span style="white-space: pre-wrap">设备IP</span></td><td key=5></td></tr><tr key=0-1><td key=0><span style="padding-left: 0px"><span style="color: #8c8a8a"></span> username</span></td><td key=1><span>string</span></td><td key=2>必须</td><td key=3></td><td key=4><span style="white-space: pre-wrap">设备用户名</span></td><td key=5></td></tr><tr key=0-2><td key=0><span style="padding-left: 0px"><span style="color: #8c8a8a"></span> password</span></td><td key=1><span>string</span></td><td key=2>必须</td><td key=3></td><td key=4><span style="white-space: pre-wrap">设备密码</span></td><td key=5></td></tr><tr key=0-3><td key=0><span style="padding-left: 0px"><span style="color: #8c8a8a"></span> channel</span></td><td key=1><span>string</span></td><td key=2>必须</td><td key=3></td><td key=4><span style="white-space: pre-wrap">通道号</span></td><td key=5></td></tr><tr key=0-4><td key=0><span style="padding-left: 0px"><span style="color: #8c8a8a"></span> stream</span></td><td key=1><span>string</span></td><td key=2>必须</td><td key=3></td><td key=4><span style="white-space: pre-wrap">码流(直播流需要指定码流;历史流不需要指定码流)</span></td><td key=5></td></tr><tr key=0-5><td key=0><span style="padding-left: 0px"><span style="color: #8c8a8a"></span> startTime</span></td><td key=1><span>string</span></td><td key=2>非必须</td><td key=3></td><td key=4><span style="white-space: pre-wrap">开始时间(直播流没有开始时间)</span></td><td key=5></td></tr><tr key=0-6><td key=0><span style="padding-left: 0px"><span style="color: #8c8a8a"></span> endTime</span></td><td key=1><span>string</span></td><td key=2>非必须</td><td key=3></td><td key=4><span style="white-space: pre-wrap">结束时间(直播流没有结束时间)</span></td><td key=5></td></tr>
  84 + </tbody>
  85 + </table>
  86 +
  87 +### 返回数据
  88 +
  89 +<table>
  90 + <thead class="ant-table-thead">
  91 + <tr>
  92 + <th key=name>Name</th><th key=type>Type</th><th key=required>Required</th><th key=default>Default</th><th key=desc>Notes</th><th key=sub>Other</th>
  93 + </tr>
  94 + </thead><tbody className="ant-table-tbody"><tr key=0-0><td key=0><span style="padding-left: 0px"><span style="color: #8c8a8a"></span> token</span></td><td key=1><span>string</span></td><td key=2>Required</td><td key=3></td><td key=4><span style="white-space: pre-wrap">Token</span></td><td key=5></td></tr><tr key=0-1><td key=0><span style="padding-left: 0px"><span style="color: #8c8a8a"></span> uri</span></td><td key=1><span>string</span></td><td key=2>Required</td><td key=3></td><td key=4><span style="white-space: pre-wrap">Push URL</span></td><td key=5></td></tr>
  95 + </tbody>
  96 + </table>
  97 +
  98 +## Close Video Stream
  99 +<a id=关闭视频流> </a>
  100 +### Basic Info
  101 +
  102 +**Path:** /cameras/:tokens
  103 +
  104 +**Method:** DELETE
  105 +
  106 +**Description:**
  107 +<p>Stops a push-stream task that is currently running.</p>
  108 +
  109 +
  110 +### Request Parameters
  111 +**Path Parameters**
  112 +
  113 +| Parameter | Example | Notes |
  114 +| ------------ | ------------ | ------------ |
  115 +| tokens | | Token (multiple tokens may be comma-separated) |
  116 +
  117 +## Video Stream Keep-Alive
  118 +<a id=视频流保活> </a>
  119 +### Basic Info
  120 +
  121 +**Path:** /cameras/:tokens
  122 +
  123 +**Method:** PUT
  124 +
  125 +**Description:**
  126 +<p>Refreshes the keep-alive timer of a video stream that is currently being pushed.</p>
  127 +
  128 +
  129 +### Request Parameters
  130 +**Path Parameters**
  131 +
  132 +| Parameter | Example | Notes |
  133 +| ------------ | ------------ | ------------ |
  134 +| tokens | | Token (multiple tokens may be comma-separated) |
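To make the parameter tables above concrete, here is an illustrative request body for the open-stream endpoint (`POST /cameras`). Every value is a placeholder, not a default of this service:

```json
{
  "ip": "192.168.1.64",
  "username": "admin",
  "password": "admin123",
  "channel": "1",
  "stream": "main"
}
```

For a playback stream, `startTime` and `endTime` would be supplied instead of `stream`, per the table above.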
  1 +<?xml version="1.0" encoding="UTF-8"?>
  2 +<project xmlns="http://maven.apache.org/POM/4.0.0"
  3 + xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  4 + xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  5 + <modelVersion>4.0.0</modelVersion>
  6 + <parent>
  7 + <groupId>org.springframework.boot</groupId>
  8 + <artifactId>spring-boot-starter-parent</artifactId>
  9 + <version>2.2.2.RELEASE</version>
  10 + <relativePath /> <!-- lookup parent from repository -->
  11 + </parent>
  12 + <groupId>com.junction</groupId>
  13 + <artifactId>CameraServer</artifactId>
  14 + <version>${COMMIT_REV}.${BUILD_DATE}</version>
  15 + <name>CameraServer</name>
  16 + <description>Demo project for Spring Boot</description>
  17 +
  18 + <properties>
  19 + <java.version>1.8</java.version>
  20 + <version>${COMMIT_REV}.${BUILD_DATE}</version>
  21 + </properties>
  22 +
  23 + <dependencies>
  24 + <dependency>
  25 + <groupId>org.springframework.boot</groupId>
  26 + <artifactId>spring-boot-starter-web</artifactId>
  27 + </dependency>
  28 +
  29 + <dependency>
  30 + <groupId>org.springframework.boot</groupId>
  31 + <artifactId>spring-boot-starter-test</artifactId>
  32 + <scope>test</scope>
  33 + <exclusions>
  34 + <exclusion>
  35 + <groupId>org.junit.vintage</groupId>
  36 + <artifactId>junit-vintage-engine</artifactId>
  37 + </exclusion>
  38 + </exclusions>
  39 + </dependency>
  50 + <dependency>
  51 + <groupId>org.bytedeco</groupId>
  52 + <artifactId>javacv</artifactId>
  53 + <version>1.5.3</version>
  54 + </dependency>
  55 +
  56 + <dependency>
  57 + <groupId>org.bytedeco</groupId>
  58 + <artifactId>ffmpeg-platform</artifactId>
  59 + <version>4.2.2-1.5.3</version>
  60 + </dependency>
  61 + <!-- Enables support for the @ConfigurationProperties annotation -->
  62 + <dependency>
  63 + <groupId>org.springframework.boot</groupId>
  64 + <artifactId>spring-boot-configuration-processor</artifactId>
  65 + <optional>true</optional>
  66 + </dependency>
  67 + <dependency>
  68 + <groupId>com.alibaba</groupId>
  69 + <artifactId>fastjson</artifactId>
  70 + <version>1.2.68</version>
  71 + </dependency>
  72 + </dependencies>
  73 +
  74 + <build>
  75 + <finalName>rtsp_rtmp</finalName><!-- Names the packaged jar rtsp_rtmp.jar -->
  76 + <plugins>
  77 + <plugin>
  78 + <groupId>org.springframework.boot</groupId>
  79 + <artifactId>spring-boot-maven-plugin</artifactId>
  80 + </plugin>
  81 + </plugins>
  82 + </build>
  83 +
  84 +</project>
  1 +package com.junction;
  2 +
  3 +import java.util.Date;
  4 +import java.util.Set;
  5 +
  6 +import javax.annotation.PreDestroy;
  7 +
  8 +import org.bytedeco.javacv.FFmpegFrameGrabber;
  9 +import org.bytedeco.javacv.FFmpegFrameRecorder;
  10 +import org.slf4j.Logger;
  11 +import org.slf4j.LoggerFactory;
  12 +import org.springframework.boot.SpringApplication;
  13 +import org.springframework.boot.autoconfigure.SpringBootApplication;
  14 +import org.springframework.context.ApplicationContext;
  15 +
  16 +import com.junction.cache.CacheUtil;
  17 +import com.junction.controller.CameraController;
  18 +import com.junction.push.CameraPush;
  19 +import com.junction.thread.CameraThread;
  20 +import com.junction.timer.CameraTimer;
  21 +
  22 +@SpringBootApplication
  23 +public class CameraServerApplication {
  24 +
  25 + private final static Logger logger = LoggerFactory.getLogger(CameraServerApplication.class);
  26 +
  27 + public static void main(String[] args) {
  28 + // Run FFmpegFrameGrabber.tryLoad() and FFmpegFrameRecorder.tryLoad() at startup,
  29 + // so the first push request does not pay the cost of loading the native libraries.
  30 + try {
  31 + FFmpegFrameGrabber.tryLoad();
  32 + FFmpegFrameRecorder.tryLoad();
  33 + } catch (Exception e) {
  34 + logger.error("Failed to preload the FFmpeg native libraries", e);
  35 + }
  37 + // Cache the service start time
  38 + CacheUtil.STARTTIME = new Date().getTime();
  39 + final ApplicationContext applicationContext = SpringApplication.run(CameraServerApplication.class, args);
  40 + // Hand the application context to CameraPush so it can use the variables from config
  41 + CameraPush.setApplicationContext(applicationContext);
  42 + }
  43 +
  44 + @PreDestroy
  45 + public void destroy() {
  46 + logger.info("Service shutting down, releasing resources...");
  47 + // Stop the jobs that are still running
  48 + Set<String> keys = CameraController.JOBMAP.keySet();
  49 + for (String key : keys) {
  50 + CameraController.JOBMAP.get(key).setInterrupted(key);
  51 + }
  52 + // Shut down the thread pool
  53 + CameraThread.MyRunnable.es.shutdown();
  54 + // Cancel the timer
  55 + CameraTimer.timer.cancel();
  56 + }
  57 +}
  1 +package com.junction;
  2 +
  3 +import org.springframework.boot.builder.SpringApplicationBuilder;
  4 +import org.springframework.boot.web.servlet.support.SpringBootServletInitializer;
  5 +
  6 +public class ServletInitializer extends SpringBootServletInitializer {
  7 +
  8 + @Override
  9 + protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
  10 + return application.sources(CameraServerApplication.class);
  11 + }
  12 +}
  1 +package com.junction.cache;
  2 +
  3 +import java.util.Map;
  4 +import java.util.concurrent.ConcurrentHashMap;
  5 +
  6 +import com.junction.pojo.CameraPojo;
  7 +import com.junction.push.CameraPush;
  8 +
  9 +/**
  10 + * @Title CacheUtil.java
  11 + * @description Cached push-stream state
  12 + * @time 2019-12-17 15:12:45
  13 + * @author wuguodong
  14 + **/
  15 +public final class CacheUtil {
  16 + /*
  17 + * Streams that have already started pushing (token -> camera info)
  18 + */
  19 + public static Map<String, CameraPojo> STREATMAP = new ConcurrentHashMap<String, CameraPojo>();
  20 +
  21 + /*
  22 + * Active CameraPush instances (token -> pusher)
  23 + */
  24 + public static Map<String, CameraPush> PUSHMAP = new ConcurrentHashMap<>();
  25 + /*
  26 + * Service start time (epoch milliseconds)
  27 + */
  28 + public static long STARTTIME;
  29 +
  30 +}
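CacheUtil's `STREATMAP` entries carry a plain `int count` of how many clients share a stream, which the controller increments and decrements without synchronization. A minimal sketch of a thread-safe per-token reference counter (the `StreamRefCount` helper and its method names are hypothetical, not part of this project):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

public class StreamRefCount {
    // token -> number of clients currently watching that stream
    static final ConcurrentHashMap<String, AtomicInteger> COUNTS = new ConcurrentHashMap<>();

    // A viewer joins: create the counter on first use, then increment atomically.
    static int open(String token) {
        return COUNTS.computeIfAbsent(token, t -> new AtomicInteger()).incrementAndGet();
    }

    // A viewer leaves: decrement; a result of 0 signals the pusher may stop.
    static int close(String token) {
        AtomicInteger c = COUNTS.get(token);
        return (c == null) ? 0 : c.decrementAndGet();
    }

    public static void main(String[] args) {
        open("abc");
        open("abc");
        System.out.println(close("abc")); // prints 1
    }
}
```

`computeIfAbsent` plus `AtomicInteger` keeps the check-then-act sequence atomic, which a bare `int` field in a shared map does not.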
  1 +package com.junction.controller;
  2 +
  3 +import java.io.IOException;
  4 +import java.net.InetSocketAddress;
  5 +import java.net.Socket;
  6 +import java.text.ParseException;
  7 +import java.text.SimpleDateFormat;
  8 +import java.util.Date;
  9 +import java.util.HashMap;
  10 +import java.util.LinkedHashMap;
  11 +import java.util.Map;
  12 +import java.util.Set;
  13 +import java.util.UUID;
  14 +
  15 +import org.slf4j.Logger;
  16 +import org.slf4j.LoggerFactory;
  17 +import org.springframework.beans.factory.annotation.Autowired;
  18 +import org.springframework.web.bind.annotation.PathVariable;
  19 +import org.springframework.web.bind.annotation.RequestBody;
  20 +import org.springframework.web.bind.annotation.RequestMapping;
  21 +import org.springframework.web.bind.annotation.RequestMethod;
  22 +import org.springframework.web.bind.annotation.RestController;
  23 +
  24 +import com.alibaba.fastjson.JSONObject;
  25 +import com.junction.cache.CacheUtil;
  26 +import com.junction.pojo.CameraPojo;
  27 +import com.junction.pojo.Config;
  28 +import com.junction.thread.CameraThread;
  29 +import com.junction.util.Utils;
  30 +
  31 +/**
  32 + * @Title CameraController.java
  33 + * @description REST controller for opening, closing and keeping alive video streams
  34 + * @time 2019-12-16 09:00:27
  35 + * @author wuguodong
  36 + **/
  37 +
  38 +@RestController
  39 +public class CameraController {
  40 +
  41 + private final static Logger logger = LoggerFactory.getLogger(CameraController.class);
  42 +
  43 + @Autowired
  44 + public Config config;// configuration bean
  45 +
  46 + // Running jobs (token -> worker thread)
  47 + public static Map<String, CameraThread.MyRunnable> JOBMAP = new HashMap<String, CameraThread.MyRunnable>();
  48 +
  49 + /**
  50 + * @Title: openCamera
  51 + * @Description: Opens a video stream
  52 + * @param ip
  53 + * @param username
  54 + * @param password
  55 + * @param channel channel number
  56 + * @param stream stream type
  57 + * @param starttime
  58 + * @param endtime
  59 + * @return Map<String,String>
  60 + **/
  61 + @RequestMapping(value = "/cameras", method = RequestMethod.POST)
  62 + public Map<String, Object> openCamera(@RequestBody CameraPojo pojo) {
  63 + // Result to return
  64 + Map<String, Object> map = new LinkedHashMap<String, Object>();
  65 + // Result of openStream
  66 + Map<String, Object> openMap = new HashMap<>();
  67 + JSONObject cameraJson = JSONObject.parseObject(JSONObject.toJSON(pojo).toString());
  68 + // Parameters that must be non-empty
  69 + String[] isNullArr = { "ip", "username", "password", "channel", "stream" };
  70 + // Empty-value check
  71 + if (!Utils.isNullParameters(cameraJson, isNullArr)) {
  72 + map.put("msg", "Missing required parameters");
  73 + map.put("code", 1);
  74 + return map;
  75 + }
  76 + // Validate the IP format
  77 + if (!Utils.isTrueIp(pojo.getIp())) {
  78 + map.put("msg", "Invalid ip format");
  79 + map.put("code", 2);
  80 + return map;
  81 + }
  82 + if (null != pojo.getStarttime() && !"".equals(pojo.getStarttime())) {
  83 + // Validate the start time format
  84 + if (!Utils.isTrueTime(pojo.getStarttime())) {
  85 + map.put("msg", "Invalid starttime format");
  86 + map.put("code", 3);
  87 + return map;
  88 + }
  89 + if (null != pojo.getEndtime() && !"".equals(pojo.getEndtime())) {
  90 + if (!Utils.isTrueTime(pojo.getEndtime())) {
  91 + map.put("msg", "Invalid endtime format");
  92 + map.put("code", 4);
  93 + return map;
  94 + }
  95 + // The end time must be later than the start time
  96 + try {
  97 + long starttime = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(pojo.getStarttime()).getTime();
  98 + long endtime = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(pojo.getEndtime()).getTime();
  99 + if (endtime - starttime < 0) {
  100 + map.put("msg", "endtime must be later than starttime");
  101 + map.put("code", 5);
  102 + return map;
  103 + }
  104 + } catch (ParseException e) {
  105 + logger.error(e.getMessage());
  106 + }
  107 + }
  108 + }
  109 +
  110 + CameraPojo cameraPojo = new CameraPojo();
  111 + // Get the current time
  112 + String opentime = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date().getTime());
  113 + Set<String> keys = CacheUtil.STREATMAP.keySet();
  114 + // Is the cache empty?
  115 + if (0 == keys.size()) {
  116 + // Start pushing
  117 + openMap = openStream(pojo.getIp(), pojo.getUsername(), pojo.getPassword(), pojo.getChannel(),
  118 + pojo.getStream(), pojo.getStarttime(), pojo.getEndtime(), opentime);
  119 + if (Integer.parseInt(openMap.get("errorcode").toString()) == 0) {
  120 + map.put("url", ((CameraPojo) openMap.get("pojo")).getUrl());
  121 + map.put("token", ((CameraPojo) openMap.get("pojo")).getToken());
  122 + map.put("msg", "Video stream opened successfully");
  123 + map.put("code", 0);
  124 + } else {
  125 + map.put("msg", openMap.get("message"));
  126 + map.put("code", openMap.get("errorcode"));
  127 + }
  128 + } else {
  129 + // Flag for whether the stream already exists; false: no; true: yes
  130 + boolean sign = false;
  131 + if (null == pojo.getStarttime()) {// live stream
  132 + for (String key : keys) {
  133 + if (pojo.getIp().equals(CacheUtil.STREATMAP.get(key).getIp())
  134 + && pojo.getChannel().equals(CacheUtil.STREATMAP.get(key).getChannel())
  135 + && null == CacheUtil.STREATMAP.get(key).getStarttime()) {// a live stream already exists
  136 + cameraPojo = CacheUtil.STREATMAP.get(key);
  137 + sign = true;
  138 + break;
  139 + }
  140 + }
  141 + if (sign) {// already exists
  142 + cameraPojo.setCount(cameraPojo.getCount() + 1);
  143 + cameraPojo.setOpentime(opentime);
  144 + map.put("url", cameraPojo.getUrl());
  145 + map.put("token", cameraPojo.getToken());
  146 + map.put("msg", "Video stream opened successfully");
  147 + map.put("code", 0);
  148 + } else {
  149 + openMap = openStream(pojo.getIp(), pojo.getUsername(), pojo.getPassword(), pojo.getChannel(),
  150 + pojo.getStream(), pojo.getStarttime(), pojo.getEndtime(), opentime);
  151 + if (Integer.parseInt(openMap.get("errorcode").toString()) == 0) {
  152 + map.put("url", ((CameraPojo) openMap.get("pojo")).getUrl());
  153 + map.put("token", ((CameraPojo) openMap.get("pojo")).getToken());
  154 + map.put("msg", "Video stream opened successfully");
  155 + map.put("code", 0);
  156 + } else {
  157 + map.put("msg", openMap.get("message"));
  158 + map.put("code", openMap.get("errorcode"));
  159 + }
  160 + }
  161 +
  162 + } else {// playback (history) stream
  163 + for (String key : keys) {
  164 + if (pojo.getIp().equals(CacheUtil.STREATMAP.get(key).getIp())
  165 + && CacheUtil.STREATMAP.get(key).getStarttime() != null) {// a playback stream already exists
  166 + sign = true;
  167 + cameraPojo = CacheUtil.STREATMAP.get(key);
  168 + break;
  169 + }
  170 + }
  171 + if (sign && cameraPojo.getCount() == 0) {
  172 + map.put("msg", "The device is finishing a playback session, please try again later");
  173 + map.put("code", 9);
  174 + } else if (sign && cameraPojo.getCount() != 0) {
  175 + map.put("msg", "The device is currently playing back, please try again later");
  176 + map.put("code", 8);
  177 + } else {
  178 + openMap = openStream(pojo.getIp(), pojo.getUsername(), pojo.getPassword(), pojo.getChannel(),
  179 + pojo.getStream(), pojo.getStarttime(), pojo.getEndtime(), opentime);
  180 + if (Integer.parseInt(openMap.get("errorcode").toString()) == 0) {
  181 + map.put("url", ((CameraPojo) openMap.get("pojo")).getUrl());
  182 + map.put("token", ((CameraPojo) openMap.get("pojo")).getToken());
  183 + map.put("msg", "Video stream opened successfully");
  184 + map.put("code", 0);
  185 + } else {
  186 + map.put("msg", openMap.get("message"));
  187 + map.put("code", openMap.get("errorcode"));
  188 + }
  189 + }
  190 + }
  191 + }
  192 + return map;
  193 + }
  194 +
  195 + /**
  196 + * @Title: openStream
  197 + * @Description: Starts a push job
  198 + * @param ip
  199 + * @param username
  200 + * @param password
  201 + * @param channel
  202 + * @param stream
  203 + * @param starttime
  204 + * @param endtime
  205 + * @param opentime
  206 + * @return Map<String, Object>
  207 + **/
  210 + private Map<String, Object> openStream(String ip, String username, String password, String channel, String stream,
  211 + String starttime, String endtime, String opentime) {
  212 + Map<String, Object> map = new HashMap<>();
  213 + CameraPojo cameraPojo = new CameraPojo();
  214 + // Generate a token
  215 + String token = UUID.randomUUID().toString();
  216 + String rtsp = "";
  217 + String rtmp = "";
  218 + String IP = Utils.IpConvert(ip);
  219 + String url = "";
  221 + // Playback (history) stream
  222 + if (null != starttime && !"".equals(starttime)) {
  223 + if (null != endtime && !"".equals(endtime)) {
  224 + rtsp = "rtsp://" + username + ":" + password + "@" + IP + ":554/Streaming/tracks/"
  225 + + (Integer.valueOf(channel) - 32) + "01?starttime=" + Utils.getTime(starttime).substring(0, 8)
  226 + + "t" + Utils.getTime(starttime).substring(8) + "z'&'endtime="
  227 + + Utils.getTime(endtime).substring(0, 8) + "t" + Utils.getTime(endtime).substring(8) + "z";
  228 + cameraPojo.setStarttime(Utils.getTime(starttime));
  229 + cameraPojo.setEndTime(Utils.getTime(endtime));
  230 + } else {
  231 + String startTime = Utils.getStarttime(starttime);
  232 + String endTime = Utils.getEndtime(starttime);
  233 + rtsp = "rtsp://" + username + ":" + password + "@" + IP + ":554/Streaming/tracks/"
  234 + + (Integer.valueOf(channel) - 32) + "01?starttime=" + startTime.substring(0, 8) + "t"
  235 + + startTime.substring(8) + "z'&'endtime=" + endTime.substring(0, 8) + "t" + endTime.substring(8)
  236 + + "z";
  237 + cameraPojo.setStarttime(Utils.getStarttime(starttime));
  238 + cameraPojo.setEndTime(Utils.getEndtime(starttime));
  239 + }
  240 + rtmp = "rtmp://" + Utils.IpConvert(config.getPush_host()) + ":" + config.getPush_port() + "/history/"
  241 + + token;
  243 + if (config.getHost_extra().equals("127.0.0.1")) {
  244 + url = rtmp;
  245 + } else {
  246 + url = "rtmp://" + Utils.IpConvert(config.getHost_extra()) + ":" + config.getPush_port() + "/history/"
  247 + + token;
  248 + }
  249 + } else {// live stream
  250 + rtsp = "rtsp://" + username + ":" + password + "@" + IP + ":554/h264/ch" + channel + "/" + stream
  251 + + "/av_stream";
  252 + rtmp = "rtmp://" + Utils.IpConvert(config.getPush_host()) + ":" + config.getPush_port() + "/live/" + token;
  253 + if (config.getHost_extra().equals("127.0.0.1")) {
  254 + url = rtmp;
  255 + } else {
  256 + url = "rtmp://" + Utils.IpConvert(config.getHost_extra()) + ":" + config.getPush_port() + "/live/"
  257 + + token;
  258 + }
  259 + }
  260 +
  261 + cameraPojo.setUsername(username);
  262 + cameraPojo.setPassword(password);
  263 + cameraPojo.setIp(IP);
  264 + cameraPojo.setChannel(channel);
  265 + cameraPojo.setStream(stream);
  266 + cameraPojo.setRtsp(rtsp);
  267 + cameraPojo.setRtmp(rtmp);
  268 + cameraPojo.setUrl(url);
  269 + cameraPojo.setOpentime(opentime);
  270 + cameraPojo.setCount(1);
  271 + cameraPojo.setToken(token);
  272 +
  273 + // If the IP is wrong, grabber.start() can block without ever releasing the grabber,
  274 + // which would stall all later pushes; probe TCP connectivity first.
  275 + Socket rtspSocket = new Socket();
  276 + Socket rtmpSocket = new Socket();
  277 +
  278 + // Open TCP connections with a 1s timeout; continue on success, otherwise return
  278 + try {
  279 + rtspSocket.connect(new InetSocketAddress(cameraPojo.getIp(), 554), 1000);
  280 + } catch (IOException e) {
  281 + logger.error("Failed to open a TCP connection to pull IP: " + cameraPojo.getIp() + " port: 554");
  282 + map.put("pojo", cameraPojo);
  283 + map.put("errorcode", 6);
  284 + map.put("message", "Failed to open a TCP connection to pull IP: " + cameraPojo.getIp() + " port: 554");
  285 + return map;
  286 + }
  287 + try {
  288 + rtmpSocket.connect(new InetSocketAddress(Utils.IpConvert(config.getPush_host()),
  289 + Integer.parseInt(config.getPush_port())), 1000);
  290 + } catch (IOException e) {
  291 + logger.error("Failed to open a TCP connection to push IP: " + config.getPush_host() + " port: " + config.getPush_port());
  292 + map.put("pojo", cameraPojo);
  293 + map.put("errorcode", 7);
  294 + map.put("message",
  295 + "Failed to connect to push IP: " + config.getPush_host() + " port: " + config.getPush_port() + ", please check the nginx service");
  296 + return map;
  297 + }
  298 + // Close the probe sockets; connectivity is confirmed
  299 + try {
  300 + rtspSocket.close();
  301 + rtmpSocket.close();
  302 + } catch (IOException e) {
  303 + logger.error(e.getMessage());
  304 + }
  305 + // Start the push job
  306 + CameraThread.MyRunnable job = new CameraThread.MyRunnable(cameraPojo);
  307 + CameraThread.MyRunnable.es.execute(job);
  308 + JOBMAP.put(token, job);
  309 + map.put("pojo", cameraPojo);
  310 + map.put("errorcode", 0);
  311 + map.put("message", "Video stream opened successfully");
  305 + return map;
  306 + }
  307 +
  308 + /**
  309 + * @Title: closeCamera
  310 + * @Description: Closes a video stream
  311 + * @param tokens
  312 + * @return void
  313 + **/
  314 + @RequestMapping(value = "/cameras/{tokens}", method = RequestMethod.DELETE)
  315 + public void closeCamera(@PathVariable("tokens") String tokens) {
  316 + if (null != tokens && !"".equals(tokens)) {
  317 + String[] tokenArr = tokens.split(",");
  318 + for (String token : tokenArr) {
  319 + if (JOBMAP.containsKey(token) && CacheUtil.STREATMAP.containsKey(token)) {
  320 + // Manual close of a playback stream
  321 + if (null != CacheUtil.STREATMAP.get(token).getStarttime()) {
  322 + if (0 == CacheUtil.STREATMAP.get(token).getCount() - 1) {
  323 + CacheUtil.PUSHMAP.get(token).setExitcode(1);
  324 + CacheUtil.STREATMAP.get(token).setCount(CacheUtil.STREATMAP.get(token).getCount() - 1);
  325 + } else {
  326 + CacheUtil.STREATMAP.get(token).setCount(CacheUtil.STREATMAP.get(token).getCount() - 1);
  327 + logger.info("The device is still playing back; current viewer count: " + CacheUtil.STREATMAP.get(token).getCount() + " device info: [ip:"
  328 + + CacheUtil.STREATMAP.get(token).getIp() + " channel:"
  329 + + CacheUtil.STREATMAP.get(token).getChannel() + " stream:"
  330 + + CacheUtil.STREATMAP.get(token).getStream() + " starttime:"
  331 + + CacheUtil.STREATMAP.get(token).getStarttime() + " endtime:"
  332 + + CacheUtil.STREATMAP.get(token).getEndtime() + " url:"
  333 + + CacheUtil.STREATMAP.get(token).getUrl() + "]");
  334 + }
  335 + } else {
  336 + if (0 < CacheUtil.STREATMAP.get(token).getCount()) {
  337 + // Decrement the viewer count
  338 + CacheUtil.STREATMAP.get(token).setCount(CacheUtil.STREATMAP.get(token).getCount() - 1);
  339 + logger.info("Closed successfully; current viewer count: " + CacheUtil.STREATMAP.get(token).getCount() + " device info: [ip:"
  340 + + CacheUtil.STREATMAP.get(token).getIp() + " channel:"
  341 + + CacheUtil.STREATMAP.get(token).getChannel() + " stream:"
  342 + + CacheUtil.STREATMAP.get(token).getStream() + " starttime:"
  343 + + CacheUtil.STREATMAP.get(token).getStarttime() + " endtime:"
  344 + + CacheUtil.STREATMAP.get(token).getEndtime() + " url:"
  345 + + CacheUtil.STREATMAP.get(token).getUrl() + "]");
  346 + }
  347 + }
  348 +
  349 + }
  350 + }
  351 + }
  352 + }
  353 +
  354 + /**
  355 + * @Title: getCameras
  356 + * @Description: Lists the active video streams
  357 + * @return Map<String, CameraPojo>
  358 + **/
  359 + @RequestMapping(value = "/cameras", method = RequestMethod.GET)
  360 + public Map<String, CameraPojo> getCameras() {
  361 + logger.info("Fetching stream info: " + CacheUtil.STREATMAP.toString());
  362 + return CacheUtil.STREATMAP;
  363 + }
  364 +
  365 + /**
  366 + * @Title: keepAlive
  367 + * @Description: Keeps a video stream alive
  368 + * @param tokens
  369 + * @return void
  370 + **/
  371 + @RequestMapping(value = "/cameras/{tokens}", method = RequestMethod.PUT)
  372 + public void keepAlive(@PathVariable("tokens") String tokens) {
  373 + // Validate the parameter
  374 + if (null != tokens && !"".equals(tokens)) {
  375 + String[] tokenArr = tokens.split(",");
  376 + for (String token : tokenArr) {
  377 + CameraPojo cameraPojo = new CameraPojo();
  378 + // The token exists in the stream cache
  379 + if (null != CacheUtil.STREATMAP.get(token)) {
  380 + cameraPojo = CacheUtil.STREATMAP.get(token);
  381 + // Refresh the open time with the current system time
  382 + cameraPojo.setOpentime(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date().getTime()));
  383 + logger.info("Keep-alive succeeded; device info: [ip:" + cameraPojo.getIp() + " channel:" + cameraPojo.getChannel()
  384 + + " stream:" + cameraPojo.getStream() + " starttime:" + cameraPojo.getStarttime()
  385 + + " endtime:" + cameraPojo.getEndtime() + " url:" + cameraPojo.getUrl() + "]");
  386 + }
  387 + }
  388 + }
  389 + }
  390 +
  391 + /**
  392 + * @Title: getConfig
  393 + * @Description: Returns service status information
  394 + * @return Map<String, Object>
  395 + **/
  396 + @RequestMapping(value = "/status", method = RequestMethod.GET)
  397 + public Map<String, Object> getConfig() {
  398 + // Get the current time
  399 + long nowTime = new Date().getTime();
  400 + String upTime = (nowTime - CacheUtil.STARTTIME) / (1000 * 60 * 60) + "h"
  401 + + (nowTime - CacheUtil.STARTTIME) % (1000 * 60 * 60) / (1000 * 60) + "m"
  402 + + (nowTime - CacheUtil.STARTTIME) % (1000 * 60) / 1000 + "s";
  403 + logger.info("Service info: " + config.toString() + "; uptime: " + upTime);
  404 + Map<String, Object> status = new HashMap<String, Object>();
  405 + status.put("config", config);
  406 + status.put("uptime", upTime);
  407 + return status;
  408 + }
  409 +
  410 +}
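The uptime string in `getConfig()` is built with modulo arithmetic by hand, where the seconds term must be taken modulo one minute (not one hour) to avoid double-counting. A self-contained sketch of the same "XhYmZs" formatting, using a hypothetical `UptimeFormat.format` helper:

```java
import java.util.concurrent.TimeUnit;

public class UptimeFormat {
    // Formats a millisecond duration as "XhYmZs" (helper name is illustrative)
    static String format(long millis) {
        long h = TimeUnit.MILLISECONDS.toHours(millis);
        long m = TimeUnit.MILLISECONDS.toMinutes(millis) % 60;  // minutes within the hour
        long s = TimeUnit.MILLISECONDS.toSeconds(millis) % 60;  // seconds within the minute
        return h + "h" + m + "m" + s + "s";
    }

    public static void main(String[] args) {
        System.out.println(format(3_725_000L)); // prints 1h2m5s
    }
}
```

`TimeUnit` keeps the unit conversions explicit, which is harder to get wrong than chained integer division.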
  1 +package com.junction.pojo;
  2 +
  3 +import java.io.Serializable;
  4 +
  5 +public class CameraPojo implements Serializable {
  6 + private static final long serialVersionUID = 8183688502930584159L;
  7 + private String username;// camera account
  8 + private String password;// camera password
  9 + private String ip;// camera ip
  10 + private String channel;// camera channel
  11 + private String stream;// camera stream type
  12 + private String rtsp;// rtsp address
  13 + private String rtmp;// rtmp address
  14 + private String url;// playback address
  15 + private String starttime;// playback start time
  16 + private String endtime;// playback end time
  17 + private String opentime;// time the stream was opened
  18 + private int count = 0;// number of viewers
  19 + private String token;
  20 +
  21 + public String getUsername() {
  22 + return username;
  23 + }
  24 +
  25 + public void setUsername(String username) {
  26 + this.username = username;
  27 + }
  28 +
  29 + public String getPassword() {
  30 + return password;
  31 + }
  32 +
  33 + public void setPassword(String password) {
  34 + this.password = password;
  35 + }
  36 +
  37 + public String getIp() {
  38 + return ip;
  39 + }
  40 +
  41 + public void setIp(String ip) {
  42 + this.ip = ip;
  43 + }
  44 +
  45 + public String getChannel() {
  46 + return channel;
  47 + }
  48 +
  49 + public void setChannel(String channel) {
  50 + this.channel = channel;
  51 + }
  52 +
  53 + public String getStream() {
  54 + return stream;
  55 + }
  56 +
  57 + public void setStream(String stream) {
  58 + this.stream = stream;
  59 + }
  60 +
  61 + public String getRtsp() {
  62 + return rtsp;
  63 + }
  64 +
  65 + public void setRtsp(String rtsp) {
  66 + this.rtsp = rtsp;
  67 + }
  68 +
  69 + public String getRtmp() {
  70 + return rtmp;
  71 + }
  72 +
  73 + public void setRtmp(String rtmp) {
  74 + this.rtmp = rtmp;
  75 + }
  76 +
  77 + public String getStarttime() {
  78 + return starttime;
  79 + }
  80 +
  81 + public void setStarttime(String starttime) {
  82 + this.starttime = starttime;
  83 + }
  84 +
  85 + public String getEndtime() {
  86 + return endtime;
  87 + }
  88 +
  89 + public void setEndTime(String endtime) {
  90 + this.endtime = endtime;
  91 + }
  92 +
  93 + public String getOpentime() {
  94 + return opentime;
  95 + }
  96 +
  97 + public void setOpentime(String opentime) {
  98 + this.opentime = opentime;
  99 + }
  100 +
  101 + public int getCount() {
  102 + return count;
  103 + }
  104 +
  105 + public void setCount(int count) {
  106 + this.count = count;
  107 + }
  108 +
  109 + public String getToken() {
  110 + return token;
  111 + }
  112 +
  113 + public void setToken(String token) {
  114 + this.token = token;
  115 + }
  116 +
  117 + public String getUrl() {
  118 + return url;
  119 + }
  120 +
  121 + public void setUrl(String url) {
  122 + this.url = url;
  123 + }
  124 +
  125 + @Override
  126 + public String toString() {
  127 + return "CameraPojo [username=" + username + ", password=" + password + ", ip=" + ip + ", channel=" + channel
  128 + + ", stream=" + stream + ", rtsp=" + rtsp + ", rtmp=" + rtmp + ", url=" + url + ", starttime="
  129 + + starttime + ", endtime=" + endtime + ", opentime=" + opentime + ", count=" + count + ", token="
  130 + + token + "]";
  131 + }
  132 +
  133 +}
  1 +package com.junction.pojo;
  2 +
  3 +import org.springframework.boot.context.properties.ConfigurationProperties;
  4 +import org.springframework.stereotype.Component;
  5 +
  6 +/**
  7 + * @Title ConfigPojo.java
  8 + * @description Bean that reads the configuration file
  9 + * @time 2019-12-25 17:11:21
  10 + * @author wuguodong
  11 + **/
  12 +@Component
  13 +@ConfigurationProperties(prefix = "config")
  14 +public class Config {
  15 + private String keepalive;// keep-alive duration (minutes)
  16 + private String push_host;// push host
  17 + private String host_extra;// extra (external) host
  18 + private String push_port;// push port
  19 + private String main_code;// maximum bitrate of the main stream
  20 + private String sub_code;// maximum bitrate of the sub stream
  21 + private String version;// version info
  22 +
  23 + public String getHost_extra() {
  24 + return host_extra;
  25 + }
  26 +
  27 + public void setHost_extra(String host_extra) {
  28 + this.host_extra = host_extra;
  29 + }
  30 +
  31 + public String getKeepalive() {
  32 + return keepalive;
  33 + }
  34 +
  35 + public void setKeepalive(String keepalive) {
  36 + this.keepalive = keepalive;
  37 + }
  38 +
  39 + public String getPush_host() {
  40 + return push_host;
  41 + }
  42 +
  43 + public void setPush_host(String push_host) {
  44 + this.push_host = push_host;
  45 + }
  46 +
  47 + public String getPush_port() {
  48 + return push_port;
  49 + }
  50 +
  51 + public void setPush_port(String push_port) {
  52 + this.push_port = push_port;
  53 + }
  54 +
  55 + public String getMain_code() {
  56 + return main_code;
  57 + }
  58 +
  59 + public void setMain_code(String main_code) {
  60 + this.main_code = main_code;
  61 + }
  62 +
  63 + public String getSub_code() {
  64 + return sub_code;
  65 + }
  66 +
  67 + public void setSub_code(String sub_code) {
  68 + this.sub_code = sub_code;
  69 + }
  70 +
  71 + public String getVersion() {
  72 + return version;
  73 + }
  74 +
  75 + public void setVersion(String version) {
  76 + this.version = version;
  77 + }
  78 +
  79 + @Override
  80 + public String toString() {
  81 + return "Config [keepalive=" + keepalive + ", push_host=" + push_host + ", host_extra=" + host_extra
  82 + + ", push_port=" + push_port + ", main_code=" + main_code + ", sub_code=" + sub_code + ", version="
  83 + + version + "]";
  84 + }
  85 +
  86 +}
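`Config` binds properties under the `config` prefix via `@ConfigurationProperties`, so a matching `application.yml` might look like the sketch below. Every value shown is an illustrative assumption, not a shipped default of this project:

```yaml
config:
  keepalive: "5"           # keep-alive window in minutes (assumed)
  push_host: "127.0.0.1"   # host of the nginx-rtmp server receiving the push
  host_extra: "127.0.0.1"  # external address returned to clients; 127.0.0.1 means same as push_host
  push_port: "1935"        # RTMP port
  main_code: "2048"        # max bitrate of the main stream (assumed)
  sub_code: "1024"         # max bitrate of the sub stream (assumed)
  version: "1.0"
```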
  1 +package com.junction.push;
  2 +
  3 +import static org.bytedeco.ffmpeg.global.avcodec.av_packet_unref;
  4 +
  5 +import java.util.HashMap;
  6 +import java.util.Map;
  7 +
  8 +import org.bytedeco.ffmpeg.avcodec.AVPacket;
  9 +import org.bytedeco.ffmpeg.avformat.AVFormatContext;
  10 +import org.bytedeco.ffmpeg.global.avcodec;
  11 +import org.bytedeco.ffmpeg.global.avutil;
  12 +import org.bytedeco.javacv.FFmpegFrameGrabber;
  13 +import org.bytedeco.javacv.FFmpegFrameRecorder;
  14 +import org.bytedeco.javacv.FFmpegLogCallback;
  15 +import org.slf4j.Logger;
  16 +import org.slf4j.LoggerFactory;
  17 +import org.springframework.context.ApplicationContext;
  18 +
  19 +import com.junction.pojo.CameraPojo;
  20 +import com.junction.pojo.Config;
  21 +
  22 +/**
  23 + * @Title RtmpPush.java
  24 + * @description Pushes data frames with JavaCV
  25 + * @time 2020-03-17 14:32:42
  26 + * @author wuguodong
  27 + **/
  28 +public class CameraPush {
  29 + private final static Logger logger = LoggerFactory.getLogger(CameraPush.class);
  30 + // Configuration class
  31 + private static Config config;
  32 +
  33 + // Obtain the Config bean through the application context
  34 + public static void setApplicationContext(ApplicationContext applicationContext) {
  35 + config = applicationContext.getBean(Config.class);
  36 + }
  37 +
  38 + private CameraPojo pojo;// device info
  39 + private FFmpegFrameRecorder recorder;// recorder (encoder)
  40 + private FFmpegFrameGrabber grabber;// grabber (decoder)
  41 + private int err_index = 0;// number of errors during pushing
  42 + private int exitcode = 0;// exit code: 0 - normal exit; 1 - manual interrupt
  43 + private double framerate = 0;// frame rate
  44 +
  45 + public void setExitcode(int exitcode) {
  46 + this.exitcode = exitcode;
  47 + }
  48 +
  49 + public int getExitcode() {
  50 + return exitcode;
  51 + }
  52 +
  53 + public CameraPush(CameraPojo cameraPojo) {
  54 + this.pojo = cameraPojo;
  55 + }
  56 +
  57 + /**
  58 + * @Title: release
  59 + * @Description: Releases resources
  60 + * @return void
  61 + **/
  62 + public void release() {
  63 + try {
  64 + if (grabber != null) {
  65 + grabber.stop();
  66 + grabber.close();
  67 + }
  68 + if (recorder != null) {
  67 + recorder.stop();
  68 + recorder.release();
  69 + }
  70 + } catch (Exception e) {
  71 + e.printStackTrace();
  72 + }
  73 + }
  74 +
  75 + /**
  76 + * @Title: push
  77 + * @Description: pushes video stream packets
  78 + * @return void
  79 + **/
  80 + public void push() {
  81 + try {
  82 + avutil.av_log_set_level(avutil.AV_LOG_INFO);
  83 + FFmpegLogCallback.set();
  84 + grabber = new FFmpegFrameGrabber(pojo.getRtsp());
  85 + grabber.setOption("rtsp_transport", "tcp");
  86 + // open/read timeout for the grabber, in microseconds
  87 + grabber.setOption("stimeout", "2000000");
  88 + if ("sub".equals(pojo.getStream())) {
  89 + grabber.start(config.getSub_code());
  90 + } else {
  91 + // "main" or any unrecognized stream value falls back
  92 + // to the main stream
  93 + grabber.start(config.getMain_code());
  94 + }
  95 +
  96 + // Some cameras report a bogus frame rate (e.g. 9000) in the stream info; that breaks dts/pts computation and makes the player fail, so fall back to 25 fps whenever the reported rate is implausible
  97 + if (grabber.getFrameRate() > 0 && grabber.getFrameRate() < 100) {
  98 + framerate = grabber.getFrameRate();
  99 + } else {
  100 + framerate = 25.0;
  101 + }
  102 + int width = grabber.getImageWidth();
  103 + int height = grabber.getImageHeight();
  104 + // a width/height of 0 means pulling the stream failed; clean up and abort
  105 + if (width == 0 && height == 0) {
  106 + logger.error(pojo.getRtsp() + " failed to pull the stream!");
  107 + release();
  108 + return;
  109 + }
  112 + recorder = new FFmpegFrameRecorder(pojo.getRtmp(), grabber.getImageWidth(), grabber.getImageHeight());
  113 + recorder.setInterleaved(true);
  114 + // GOP (keyframe interval): usually the frame rate or twice the frame rate
  115 + recorder.setGopSize((int) framerate * 2);
  116 + // video frame rate (keep at least 25; lower rates can cause flicker)
  117 + recorder.setFrameRate(framerate);
  118 + // bitrate, copied from the source
  119 + recorder.setVideoBitrate(grabber.getVideoBitrate());
  120 + // mux as FLV
  121 + recorder.setFormat("flv");
  122 + // H.264 encoder
  123 + recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
  124 + recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
  125 + Map<String, String> videoOption = new HashMap<>();
  126 +
  127 + // reduces latency
  128 + videoOption.put("tune", "zerolatency");
  129 + /**
  130 + * Trades quality against encoding speed. Values, fastest to slowest:
  131 + * ultrafast, superfast, veryfast, faster, fast,
  132 + * medium, slow, slower, veryslow.
  133 + * ultrafast gives the least compression (low encoder CPU) and the largest stream; veryslow gives the best compression (high encoder CPU) and the smallest stream
  134 + */
  135 + videoOption.put("preset", "ultrafast");
  136 + // quality parameter (CRF), 0-51; 18-28 is a reasonable range
  137 + videoOption.put("crf", "28");
  138 + recorder.setOptions(videoOption);
  139 + AVFormatContext fc = grabber.getFormatContext();
  140 + recorder.start(fc);
  141 + logger.debug("Start pushing. Device info: [ip:" + pojo.getIp() + " channel:" + pojo.getChannel() + " stream:"
  142 + + pojo.getStream() + " starttime:" + pojo.getStarttime() + " endtime:" + pojo.getEndtime()
  143 + + " rtsp:" + pojo.getRtsp() + " url:" + pojo.getUrl() + "]");
  144 + // flush buffers left over from stream probing
  145 + grabber.flush();
  146 +
  147 + AVPacket pkt = null;
  148 + long dts = 0;
  149 + long pts = 0;
  150 + int timebase = 0;
  151 + for (int no_frame_index = 0; no_frame_index < 5 && err_index < 5;) {
  152 + long time1 = System.currentTimeMillis();
  153 + if (exitcode == 1) {
  154 + break;
  155 + }
  156 + pkt = grabber.grabPacket();
  157 + if (pkt == null || pkt.size() == 0 || pkt.data() == null) {
  158 + // empty packet: count it and skip
  159 + logger.warn("JavaCV got an empty packet. Device info: [ip:" + pojo.getIp() + " channel:" + pojo.getChannel() + " stream:"
  160 + + pojo.getStream() + " starttime:" + pojo.getStarttime() + " endtime:" + pojo.getEndtime()
  161 + + " rtsp:" + pojo.getRtsp() + " url:" + pojo.getUrl() + "]");
  162 + no_frame_index++;
  163 + continue;
  164 + }
  165 + // drop audio packets (assumes the audio stream has index 1)
  166 + if (pkt.stream_index() == 1) {
  167 + av_packet_unref(pkt);
  168 + continue;
  169 + }
  170 +
  171 + // rewrite dts/pts so timestamps accumulate from 0; the SDK callback does not restart them at 0, which breaks resumed playback
  172 + pkt.pts(pts);
  173 + pkt.dts(dts);
  174 + err_index += (recorder.recordPacket(pkt) ? 0 : 1);
  175 + // advance pts/dts by one frame duration
  176 + timebase = grabber.getFormatContext().streams(pkt.stream_index()).time_base().den();
  177 + pts += timebase / (int) framerate;
  178 + dts += timebase / (int) framerate;
  179 + // decrement the buffer's reference count and reset the packet's other fields; the buffer is freed automatically once the count reaches 0
  180 + av_packet_unref(pkt);
  181 +
  182 + long endtime = System.currentTimeMillis();
  183 + if ((long) (1000 / framerate) - (endtime - time1) > 0) {
  184 + Thread.sleep((long) (1000 / framerate) - (endtime - time1));
  185 + }
  186 + }
  187 + } catch (Exception e) {
  188 + // log the full stack trace through the logger instead of stderr
  189 + logger.error(e.getMessage(), e);
  190 + } finally {
  191 + release();
  192 + logger.info("Push finished. Device info: [ip:" + pojo.getIp() + " channel:" + pojo.getChannel() + " stream:"
  193 + + pojo.getStream() + " starttime:" + pojo.getStarttime() + " endtime:" + pojo.getEndtime()
  194 + + " rtsp:" + pojo.getRtsp() + " url:" + pojo.getUrl() + "]");
  195 + }
  196 + }
  197 +}
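The timestamp correction in push() above advances pts and dts by time_base.den / framerate ticks per packet. A minimal standalone sketch of that arithmetic (the 90 kHz time base and 25 fps below are illustrative assumptions, not values read from a real stream):

```java
public class TimestampPacer {
    // Per-frame timestamp increment, mirroring the push() loop:
    // timebase / (int) framerate ticks per video frame.
    static long frameDuration(int timebaseDen, double framerate) {
        return timebaseDen / (int) framerate;
    }

    public static void main(String[] args) {
        int timebaseDen = 90000; // RTSP video streams commonly use a 90 kHz time base
        double framerate = 25.0;
        long pts = 0;
        long step = frameDuration(timebaseDen, framerate); // 90000 / 25 = 3600
        for (int frame = 0; frame < 3; frame++) {
            System.out.println("frame " + frame + " pts=" + pts);
            pts += step;
        }
    }
}
```

Monotonically increasing, evenly spaced timestamps are what let the player keep playing even though the camera SDK restarts its own counters.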
  1 +package com.junction.thread;
  2 +
  3 +import java.util.concurrent.ExecutorService;
  4 +import java.util.concurrent.Executors;
  5 +
  6 +import org.slf4j.Logger;
  7 +import org.slf4j.LoggerFactory;
  8 +
  9 +import com.junction.cache.CacheUtil;
  10 +import com.junction.controller.CameraController;
  11 +import com.junction.pojo.CameraPojo;
  12 +import com.junction.push.CameraPush;
  13 +
  14 +/**
  15 + * @Title CameraThread.java
  16 + * @description TODO
  17 + * @time 2019-12-16 09:32:43
  18 + * @author wuguodong
  19 + **/
  20 +public class CameraThread {
  21 +
  22 + private final static Logger logger = LoggerFactory.getLogger(CameraThread.class);
  23 +
  24 + public static class MyRunnable implements Runnable {
  25 +
  26 + // thread pool for push tasks
  27 + public static ExecutorService es = Executors.newCachedThreadPool();
  28 +
  29 + private CameraPojo cameraPojo;
  30 + private Thread nowThread;
  31 +
  32 + public MyRunnable(CameraPojo cameraPojo) {
  33 + this.cameraPojo = cameraPojo;
  34 + }
  35 +
  36 + // signal the push loop to stop
  37 + public void setInterrupted(String key) {
  38 + CacheUtil.PUSHMAP.get(key).setExitcode(1);
  39 + }
  40 +
  41 + @Override
  42 + public void run() {
  43 + // live stream
  44 + try {
  45 + // record the current thread and cache the stream info
  46 + nowThread = Thread.currentThread();
  47 + CacheUtil.STREATMAP.put(cameraPojo.getToken(), cameraPojo);
  48 + // run the pull/push task
  49 + CameraPush push = new CameraPush(cameraPojo);
  50 + CacheUtil.PUSHMAP.put(cameraPojo.getToken(), push);
  51 + push.push();
  52 + // clean up caches
  53 + CacheUtil.STREATMAP.remove(cameraPojo.getToken());
  54 + CameraController.JOBMAP.remove(cameraPojo.getToken());
  55 + CacheUtil.PUSHMAP.remove(cameraPojo.getToken());
  56 + } catch (Exception e) {
  57 + CacheUtil.STREATMAP.remove(cameraPojo.getToken());
  58 + CameraController.JOBMAP.remove(cameraPojo.getToken());
  59 + CacheUtil.PUSHMAP.remove(cameraPojo.getToken());
  60 + }
  61 + }
  62 + }
  63 +}
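MyRunnable stops the push loop cooperatively: setInterrupted flips the exitcode flag that CameraPush polls on each iteration, rather than calling Thread.interrupt(). A self-contained sketch of that pattern (the AtomicInteger here stands in for CameraPush's exitcode field; all names are illustrative):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class FlagStopDemo {
    // Cooperative-stop pattern: the worker polls a shared exit flag on each
    // loop iteration instead of relying on Thread.interrupt().
    static final AtomicInteger exitcode = new AtomicInteger(0);

    // Runs the "push loop" until the flag is flipped; returns the number of
    // iterations completed, so callers can verify the worker really ran.
    static int runWorker() throws InterruptedException {
        int iterations = 0;
        while (exitcode.get() == 0) {
            Thread.sleep(5); // simulate one unit of streaming work
            iterations++;
        }
        return iterations;
    }

    public static void main(String[] args) throws Exception {
        ExecutorService es = Executors.newCachedThreadPool();
        es.execute(() -> {
            try {
                System.out.println("worker did " + runWorker() + " iterations");
            } catch (InterruptedException ignored) {
            }
        });
        Thread.sleep(50);
        exitcode.set(1); // the equivalent of setInterrupted(key)
        es.shutdown();
        es.awaitTermination(1, TimeUnit.SECONDS);
    }
}
```

Polling a flag lets the loop finish the current packet and release FFmpeg resources cleanly, which a hard interrupt would not guarantee.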
  1 +package com.junction.timer;
  2 +
  3 +import java.text.ParseException;
  4 +import java.text.SimpleDateFormat;
  5 +import java.util.Date;
  6 +import java.util.Set;
  7 +import java.util.Timer;
  8 +import java.util.TimerTask;
  9 +
  10 +import org.slf4j.Logger;
  11 +import org.slf4j.LoggerFactory;
  12 +import org.springframework.beans.factory.annotation.Autowired;
  13 +import org.springframework.boot.CommandLineRunner;
  14 +import org.springframework.stereotype.Component;
  15 +
  16 +import com.junction.cache.CacheUtil;
  17 +import com.junction.controller.CameraController;
  18 +import com.junction.pojo.Config;
  19 +
  20 +/**
  21 + * @Title CameraTimer.java
  22 + * @description scheduled tasks
  23 + * @time 2019-12-16 15:10:08
  24 + * @author wuguodong
  25 + **/
  26 +@Component
  27 +public class CameraTimer implements CommandLineRunner {
  28 +
  29 + private final static Logger logger = LoggerFactory.getLogger(CameraTimer.class);
  30 +
  31 + @Autowired
  32 + private Config config;// configuration bean
  33 +
  34 + public static Timer timer;
  35 +
  36 + @Override
  37 + public void run(String... args) throws Exception {
  38 + // stop a push once it has exceeded the keepalive window (5 minutes)
  39 + timer = new Timer("timeTimer");
  40 + timer.schedule(new TimerTask() {
  41 + @Override
  42 + public void run() {
  43 + logger.info("Scheduled task: " + CameraController.JOBMAP.size() + " push task(s) currently running");
  44 + // manage the cache
  45 + if (null != CacheUtil.STREATMAP && 0 != CacheUtil.STREATMAP.size()) {
  46 + Set<String> keys = CacheUtil.STREATMAP.keySet();
  47 + for (String key : keys) {
  48 + try {
  49 + // time the stream was last opened
  50 + long openTime = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
  51 + .parse(CacheUtil.STREATMAP.get(key).getOpentime()).getTime();
  52 + // current system time
  53 + long newTime = new Date().getTime();
  54 +
  55 + // if no one is watching the channel, stop the push
  56 + if (CacheUtil.STREATMAP.get(key).getCount() == 0) {
  57 + // stop the push thread
  58 + CameraController.JOBMAP.get(key).setInterrupted(key);
  59 + logger.info("Scheduled task: viewer count is 0, stopping push. Device info: [ip:" + CacheUtil.STREATMAP.get(key).getIp()
  60 + + " channel:" + CacheUtil.STREATMAP.get(key).getChannel() + " stream:"
  61 + + CacheUtil.STREATMAP.get(key).getStream() + " starttime:"
  62 + + CacheUtil.STREATMAP.get(key).getStarttime() + " endtime:"
  63 + + CacheUtil.STREATMAP.get(key).getEndtime() + " rtsp:"
  64 + + CacheUtil.STREATMAP.get(key).getRtsp() + " url:"
  65 + + CacheUtil.STREATMAP.get(key).getUrl() + "]");
  66 + } else if (null == CacheUtil.STREATMAP.get(key).getStarttime()
  67 + && (newTime - openTime) / 1000 / 60 >= Integer.valueOf(config.getKeepalive())) {
  68 + CameraController.JOBMAP.get(key).setInterrupted(key);
  69 + logger.info("Scheduled task: stream exceeded the keepalive window, stopping push. Device info: [ip:" + CacheUtil.STREATMAP.get(key).getIp()
  70 + + " channel:" + CacheUtil.STREATMAP.get(key).getChannel() + " stream:"
  71 + + CacheUtil.STREATMAP.get(key).getStream() + " starttime:"
  72 + + CacheUtil.STREATMAP.get(key).getStarttime() + " endtime:"
  73 + + CacheUtil.STREATMAP.get(key).getEndtime() + " rtsp:"
  74 + + CacheUtil.STREATMAP.get(key).getRtsp() + " url:"
  75 + + CacheUtil.STREATMAP.get(key).getUrl() + "]");
  76 + }
  77 + } catch (ParseException e) {
  78 + e.printStackTrace();
  79 + }
  80 + }
  81 + }
  82 + }
  83 + }, 1, 1000 * 60);
  84 + }
  85 +}
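The keepalive test above closes a live stream once (now − opentime) reaches the configured number of minutes. A standalone sketch of that expiry arithmetic (the timestamps and the 5-minute keepalive are illustrative values, not read from any config):

```java
import java.text.SimpleDateFormat;

public class KeepaliveCheck {
    // Mirrors the timer's check: elapsed minutes since the stream was
    // opened, compared against the keepalive threshold.
    static boolean isExpired(String opentime, long nowMillis, int keepaliveMinutes) throws Exception {
        long openMillis = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(opentime).getTime();
        return (nowMillis - openMillis) / 1000 / 60 >= keepaliveMinutes;
    }

    public static void main(String[] args) throws Exception {
        long now = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2020-03-17 14:10:00").getTime();
        System.out.println(isExpired("2020-03-17 14:00:00", now, 5)); // 10 minutes elapsed -> true
        System.out.println(isExpired("2020-03-17 14:08:00", now, 5)); // 2 minutes elapsed -> false
    }
}
```

Integer division by 1000 and 60 truncates toward zero, so a stream expires only after a full keepalive interval has passed, matching the timer's behavior.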
  1 +package com.junction.util;
  2 +
  3 +import java.net.InetAddress;
  4 +import java.net.UnknownHostException;
  5 +import java.text.SimpleDateFormat;
  6 +import java.util.Date;
  7 +import java.util.HashMap;
  8 +import java.util.Map;
  9 +
  10 +import org.slf4j.Logger;
  11 +import org.slf4j.LoggerFactory;
  12 +
  13 +import com.alibaba.fastjson.JSONObject;
  14 +
  15 +/**
  16 + * @Title Utils.java
  17 + * @description utility class
  18 + * @time 2020-10-27 09:15:56
  19 + * @author wuguodong
  20 + **/
  21 +public class Utils {
  22 + private final static Logger logger = LoggerFactory.getLogger(Utils.class);
  23 +
  24 + /**
  25 + * @Title: IpConvert
  26 + * @Description: resolves a domain name to an IP address
  27 + * @param domainName
  28 + * @return ip
  29 + **/
  30 + public static String IpConvert(String domainName) {
  31 + String ip = domainName;
  32 + try {
  33 + ip = InetAddress.getByName(domainName).getHostAddress();
  34 + } catch (UnknownHostException e) {
  35 + e.printStackTrace();
  36 + return domainName;
  37 + }
  38 + return ip;
  39 + }
  40 +
  41 + /**
  42 + * @Title: isNullParameters
  43 + * @Description: null check for request parameters
  44 + * @param cameraJson
  45 + * @param isNullArr
  46 + * @return boolean
  47 + **/
  48 + public static boolean isNullParameters(JSONObject cameraJson, String[] isNullArr) {
  50 + // null/empty check
  51 + for (String key : isNullArr) {
  52 + if (null == cameraJson.get(key) || "".equals(cameraJson.get(key))) {
  53 + return false;
  54 + }
  55 + }
  56 + return true;
  57 + }
  58 +
  59 + /**
  60 + * @Title: isTrueIp
  61 + * @Description: IP format validation for request parameters
  62 + * @param ip
  63 + * @return boolean
  64 + **/
  65 + public static boolean isTrueIp(String ip) {
  66 + return ip.matches("([1-9]|[1-9]\\d|1\\d{2}|2[0-4]\\d|25[0-5])(\\.(\\d|[1-9]\\d|1\\d{2}|2[0-4]\\d|25[0-5])){3}");
  67 + }
  68 +
  69 + /**
  70 + * @Title: isTrueTime
  71 + * @Description: time format validation for request parameters
  72 + * @param time
  73 + * @return boolean
  74 + **/
  75 + public static boolean isTrueTime(String time) {
  76 + try {
  77 + new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(time);
  78 + return true;
  79 + } catch (Exception e) {
  80 + logger.error(e.getMessage());
  81 + return false;
  82 + }
  83 + }
  84 +
  85 + /**
  86 + * @Title: getTime
  87 + * @Description: converts a time string to the compact yyyyMMddHHmmss format
  88 + * @param time
  89 + * @return String
  90 + **/
  91 + public static String getTime(String time) {
  92 + String timestamp = null;
  93 + try {
  94 + timestamp = new SimpleDateFormat("yyyyMMddHHmmss")
  95 + .format(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(time));
  96 + } catch (Exception e) {
  97 + logger.error("Time format error");
  98 + e.printStackTrace();
  99 + }
  100 + return timestamp;
  101 + }
  102 +
  103 + /**
  104 + * @Title: getStarttime
  105 + * @Description: playback start time (one minute before the given time)
  106 + * @param time
  107 + * @return starttime
  108 + **/
  109 + public static String getStarttime(String time) {
  110 + String starttime = null;
  111 + try {
  112 + starttime = new SimpleDateFormat("yyyyMMddHHmmss")
  113 + .format(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(time).getTime() - 60 * 1000);
  114 + } catch (Exception e) {
  115 + logger.error("Time format error");
  116 + e.printStackTrace();
  117 + }
  118 + return starttime;
  119 + }
  120 +
  121 + /**
  122 + * @Title: getEndtime
  123 + * @Description: playback end time (one minute after the given time)
  124 + * @param time
  125 + * @return endString
  126 + **/
  127 + public static String getEndtime(String time) {
  128 + String endString = null;
  129 + try {
  130 + endString = new SimpleDateFormat("yyyyMMddHHmmss")
  131 + .format(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(time).getTime() + 60 * 1000);
  132 + } catch (Exception e) {
  133 + logger.error("Time format error");
  134 + e.printStackTrace();
  135 + }
  136 + return endString;
  137 + }
  138 +}
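getStarttime and getEndtime widen the requested playback instant by one minute on each side and reformat it into the compact device format. A condensed sketch of that conversion (the shift helper below is a hypothetical rewrite of both methods, not part of Utils; DateFormat.format accepts a Number and treats it as epoch milliseconds, which is what makes this one-liner work):

```java
import java.text.SimpleDateFormat;

public class PlaybackWindow {
    // Mirrors Utils.getStarttime/getEndtime: parse the human-readable time,
    // shift it by offsetMillis, and reformat to the compact device format.
    static String shift(String time, long offsetMillis) throws Exception {
        return new SimpleDateFormat("yyyyMMddHHmmss")
                .format(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(time).getTime() + offsetMillis);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(shift("2020-10-27 09:15:56", -60 * 1000L)); // start: 20201027091456
        System.out.println(shift("2020-10-27 09:15:56", 60 * 1000L));  // end:   20201027091656
    }
}
```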
  1 +package org.bytedeco.javacv;
  2 +
  3 +import static org.bytedeco.ffmpeg.global.avcodec.AV_PKT_FLAG_KEY;
  4 +import static org.bytedeco.ffmpeg.global.avcodec.av_jni_set_java_vm;
  5 +import static org.bytedeco.ffmpeg.global.avcodec.av_packet_unref;
  6 +import static org.bytedeco.ffmpeg.global.avcodec.avcodec_alloc_context3;
  7 +import static org.bytedeco.ffmpeg.global.avcodec.avcodec_decode_audio4;
  8 +import static org.bytedeco.ffmpeg.global.avcodec.avcodec_decode_video2;
  9 +import static org.bytedeco.ffmpeg.global.avcodec.avcodec_find_decoder;
  10 +import static org.bytedeco.ffmpeg.global.avcodec.avcodec_find_decoder_by_name;
  11 +import static org.bytedeco.ffmpeg.global.avcodec.avcodec_flush_buffers;
  12 +import static org.bytedeco.ffmpeg.global.avcodec.avcodec_free_context;
  13 +import static org.bytedeco.ffmpeg.global.avcodec.avcodec_open2;
  14 +import static org.bytedeco.ffmpeg.global.avcodec.avcodec_parameters_to_context;
  15 +import static org.bytedeco.ffmpeg.global.avcodec.avcodec_register_all;
  16 +import static org.bytedeco.ffmpeg.global.avdevice.avdevice_register_all;
  17 +import static org.bytedeco.ffmpeg.global.avformat.AVSEEK_FLAG_BACKWARD;
  18 +import static org.bytedeco.ffmpeg.global.avformat.AVSEEK_SIZE;
  19 +import static org.bytedeco.ffmpeg.global.avformat.av_dump_format;
  20 +import static org.bytedeco.ffmpeg.global.avformat.av_find_input_format;
  21 +import static org.bytedeco.ffmpeg.global.avformat.av_guess_sample_aspect_ratio;
  22 +import static org.bytedeco.ffmpeg.global.avformat.av_read_frame;
  23 +import static org.bytedeco.ffmpeg.global.avformat.av_register_all;
  24 +import static org.bytedeco.ffmpeg.global.avformat.avformat_alloc_context;
  25 +import static org.bytedeco.ffmpeg.global.avformat.avformat_close_input;
  26 +import static org.bytedeco.ffmpeg.global.avformat.avformat_find_stream_info;
  27 +import static org.bytedeco.ffmpeg.global.avformat.avformat_free_context;
  28 +import static org.bytedeco.ffmpeg.global.avformat.avformat_network_init;
  29 +import static org.bytedeco.ffmpeg.global.avformat.avformat_open_input;
  30 +import static org.bytedeco.ffmpeg.global.avformat.avformat_seek_file;
  31 +import static org.bytedeco.ffmpeg.global.avformat.avio_alloc_context;
  32 +import static org.bytedeco.ffmpeg.global.avutil.AVMEDIA_TYPE_AUDIO;
  33 +import static org.bytedeco.ffmpeg.global.avutil.AVMEDIA_TYPE_VIDEO;
  34 +import static org.bytedeco.ffmpeg.global.avutil.AV_DICT_IGNORE_SUFFIX;
  35 +import static org.bytedeco.ffmpeg.global.avutil.AV_LOG_INFO;
  36 +import static org.bytedeco.ffmpeg.global.avutil.AV_NOPTS_VALUE;
  37 +import static org.bytedeco.ffmpeg.global.avutil.AV_PICTURE_TYPE_I;
  38 +import static org.bytedeco.ffmpeg.global.avutil.AV_PIX_FMT_BGR24;
  39 +import static org.bytedeco.ffmpeg.global.avutil.AV_PIX_FMT_GRAY8;
  40 +import static org.bytedeco.ffmpeg.global.avutil.AV_PIX_FMT_NONE;
  41 +import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_DBL;
  42 +import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_DBLP;
  43 +import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_FLT;
  44 +import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_FLTP;
  45 +import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_NONE;
  46 +import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_S16;
  47 +import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_S16P;
  48 +import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_S32;
  49 +import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_S32P;
  50 +import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_U8;
  51 +import static org.bytedeco.ffmpeg.global.avutil.AV_SAMPLE_FMT_U8P;
  52 +import static org.bytedeco.ffmpeg.global.avutil.AV_TIME_BASE;
  53 +import static org.bytedeco.ffmpeg.global.avutil.av_d2q;
  54 +import static org.bytedeco.ffmpeg.global.avutil.av_dict_free;
  55 +import static org.bytedeco.ffmpeg.global.avutil.av_dict_get;
  56 +import static org.bytedeco.ffmpeg.global.avutil.av_dict_set;
  57 +import static org.bytedeco.ffmpeg.global.avutil.av_frame_alloc;
  58 +import static org.bytedeco.ffmpeg.global.avutil.av_frame_free;
  59 +import static org.bytedeco.ffmpeg.global.avutil.av_frame_get_best_effort_timestamp;
  60 +import static org.bytedeco.ffmpeg.global.avutil.av_frame_unref;
  61 +import static org.bytedeco.ffmpeg.global.avutil.av_free;
  62 +import static org.bytedeco.ffmpeg.global.avutil.av_get_bytes_per_sample;
  63 +import static org.bytedeco.ffmpeg.global.avutil.av_get_default_channel_layout;
  64 +import static org.bytedeco.ffmpeg.global.avutil.av_get_pix_fmt_name;
  65 +import static org.bytedeco.ffmpeg.global.avutil.av_image_fill_arrays;
  66 +import static org.bytedeco.ffmpeg.global.avutil.av_image_fill_linesizes;
  67 +import static org.bytedeco.ffmpeg.global.avutil.av_image_get_buffer_size;
  68 +import static org.bytedeco.ffmpeg.global.avutil.av_log_get_level;
  69 +import static org.bytedeco.ffmpeg.global.avutil.av_malloc;
  70 +import static org.bytedeco.ffmpeg.global.avutil.av_sample_fmt_is_planar;
  71 +import static org.bytedeco.ffmpeg.global.avutil.av_samples_get_buffer_size;
  72 +import static org.bytedeco.ffmpeg.global.swresample.swr_alloc_set_opts;
  73 +import static org.bytedeco.ffmpeg.global.swresample.swr_convert;
  74 +import static org.bytedeco.ffmpeg.global.swresample.swr_free;
  75 +import static org.bytedeco.ffmpeg.global.swresample.swr_get_out_samples;
  76 +import static org.bytedeco.ffmpeg.global.swresample.swr_init;
  77 +import static org.bytedeco.ffmpeg.global.swscale.SWS_BILINEAR;
  78 +import static org.bytedeco.ffmpeg.global.swscale.sws_freeContext;
  79 +import static org.bytedeco.ffmpeg.global.swscale.sws_getCachedContext;
  80 +import static org.bytedeco.ffmpeg.global.swscale.sws_scale;
  81 +
  82 +import java.io.BufferedInputStream;
  83 +import java.io.File;
  84 +import java.io.IOException;
  85 +import java.io.InputStream;
  86 +import java.nio.Buffer;
  87 +import java.nio.ByteBuffer;
  88 +import java.util.Collections;
  89 +import java.util.EnumSet;
  90 +import java.util.HashMap;
  91 +import java.util.Map;
  92 +import java.util.Map.Entry;
  93 +
  94 +import org.bytedeco.ffmpeg.avcodec.AVCodec;
  95 +import org.bytedeco.ffmpeg.avcodec.AVCodecContext;
  96 +import org.bytedeco.ffmpeg.avcodec.AVCodecParameters;
  97 +import org.bytedeco.ffmpeg.avcodec.AVPacket;
  98 +import org.bytedeco.ffmpeg.avformat.AVFormatContext;
  99 +import org.bytedeco.ffmpeg.avformat.AVIOContext;
  100 +import org.bytedeco.ffmpeg.avformat.AVInputFormat;
  101 +import org.bytedeco.ffmpeg.avformat.AVStream;
  102 +import org.bytedeco.ffmpeg.avformat.Read_packet_Pointer_BytePointer_int;
  103 +import org.bytedeco.ffmpeg.avformat.Seek_Pointer_long_int;
  104 +import org.bytedeco.ffmpeg.avutil.AVDictionary;
  105 +import org.bytedeco.ffmpeg.avutil.AVDictionaryEntry;
  106 +import org.bytedeco.ffmpeg.avutil.AVFrame;
  107 +import org.bytedeco.ffmpeg.avutil.AVRational;
  108 +import org.bytedeco.ffmpeg.swresample.SwrContext;
  109 +import org.bytedeco.ffmpeg.swscale.SwsContext;
  110 +import org.bytedeco.javacpp.BytePointer;
  111 +import org.bytedeco.javacpp.DoublePointer;
  112 +import org.bytedeco.javacpp.IntPointer;
  113 +import org.bytedeco.javacpp.Loader;
  114 +import org.bytedeco.javacpp.Pointer;
  115 +import org.bytedeco.javacpp.PointerPointer;
  116 +import org.bytedeco.javacpp.PointerScope;
  117 +
  118 +public class FFmpegFrameGrabber extends FrameGrabber {
  119 +
  120 + public static String[] getDeviceDescriptions() throws Exception {
  121 + tryLoad();
  122 + throw new UnsupportedOperationException("Device enumeration not supported by FFmpeg.");
  123 + }
  124 +
  125 + public static FFmpegFrameGrabber createDefault(File deviceFile) throws Exception {
  126 + return new FFmpegFrameGrabber(deviceFile);
  127 + }
  128 +
  129 + public static FFmpegFrameGrabber createDefault(String devicePath) throws Exception {
  130 + return new FFmpegFrameGrabber(devicePath);
  131 + }
  132 +
  133 + public static FFmpegFrameGrabber createDefault(int deviceNumber) throws Exception {
  134 + throw new Exception(FFmpegFrameGrabber.class + " does not support device numbers.");
  135 + }
  136 +
  137 + private static Exception loadingException = null;
  138 +
  139 + public static void tryLoad() throws Exception {
  140 + if (loadingException != null) {
  141 + throw loadingException;
  142 + } else {
  143 + try {
  144 + Loader.load(org.bytedeco.ffmpeg.global.avutil.class);
  145 + Loader.load(org.bytedeco.ffmpeg.global.swresample.class);
  146 + Loader.load(org.bytedeco.ffmpeg.global.avcodec.class);
  147 + Loader.load(org.bytedeco.ffmpeg.global.avformat.class);
  148 + Loader.load(org.bytedeco.ffmpeg.global.swscale.class);
  149 +
  150 + // Register all formats and codecs
  151 + av_jni_set_java_vm(Loader.getJavaVM(), null);
  152 + avcodec_register_all();
  153 + av_register_all();
  154 + avformat_network_init();
  155 +
  156 + Loader.load(org.bytedeco.ffmpeg.global.avdevice.class);
  157 + avdevice_register_all();
  158 + } catch (Throwable t) {
  159 + if (t instanceof Exception) {
  160 + throw loadingException = (Exception) t;
  161 + } else {
  162 + throw loadingException = new Exception("Failed to load " + FFmpegFrameGrabber.class, t);
  163 + }
  164 + }
  165 + }
  166 + }
  167 +
  168 + static {
  169 + try {
  170 + tryLoad();
  171 + FFmpegLockCallback.init();
  172 + } catch (Exception ex) {
  173 + }
  174 + }
  175 +
  176 + public FFmpegFrameGrabber(File file) {
  177 + this(file.getAbsolutePath());
  178 + }
  179 +
  180 + public FFmpegFrameGrabber(String filename) {
  181 + this.filename = filename;
  182 + this.pixelFormat = AV_PIX_FMT_NONE;
  183 + this.sampleFormat = AV_SAMPLE_FMT_NONE;
  184 + }
  185 +
  186 + /**
  187 + * Calls {@code FFmpegFrameGrabber(inputStream, Integer.MAX_VALUE - 8)} so that
  188 + * the whole input stream is seekable.
  189 + */
  190 + public FFmpegFrameGrabber(InputStream inputStream) {
  191 + this(inputStream, Integer.MAX_VALUE - 8);
  192 + }
  193 +
  194 + public FFmpegFrameGrabber(InputStream inputStream, int maximumSize) {
  195 + this.inputStream = inputStream;
  196 + this.closeInputStream = true;
  197 + this.pixelFormat = AV_PIX_FMT_NONE;
  198 + this.sampleFormat = AV_SAMPLE_FMT_NONE;
  199 + this.maximumSize = maximumSize;
  200 + }
  201 +
  202 + public void release() throws Exception {
  203 + synchronized (org.bytedeco.ffmpeg.global.avcodec.class) {
  204 + releaseUnsafe();
  205 + }
  206 + }
  207 +
  208 + public void releaseUnsafe() throws Exception {
  209 + if (pkt != null && pkt2 != null) {
  210 + if (pkt2.size() > 0) {
  211 + av_packet_unref(pkt);
  212 + }
  213 + pkt = pkt2 = null;
  214 + }
  215 +
  216 + // Free the RGB image
  217 + if (image_ptr != null) {
  218 + for (int i = 0; i < image_ptr.length; i++) {
  219 + av_free(image_ptr[i]);
  220 + }
  221 + image_ptr = null;
  222 + }
  223 + if (picture_rgb != null) {
  224 + av_frame_free(picture_rgb);
  225 + picture_rgb = null;
  226 + }
  227 +
  228 + // Free the native format picture frame
  229 + if (picture != null) {
  230 + av_frame_free(picture);
  231 + picture = null;
  232 + }
  233 +
  234 + // Close the video codec
  235 + if (video_c != null) {
  236 + avcodec_free_context(video_c);
  237 + video_c = null;
  238 + }
  239 +
  240 + // Free the audio samples frame
  241 + if (samples_frame != null) {
  242 + av_frame_free(samples_frame);
  243 + samples_frame = null;
  244 + }
  245 +
  246 + // Close the audio codec
  247 + if (audio_c != null) {
  248 + avcodec_free_context(audio_c);
  249 + audio_c = null;
  250 + }
  251 +
  252 + // Close the video file
  253 + if (inputStream == null && oc != null && !oc.isNull()) {
  254 + avformat_close_input(oc);
  255 + oc = null;
  256 + }
  257 +
  258 + if (img_convert_ctx != null) {
  259 + sws_freeContext(img_convert_ctx);
  260 + img_convert_ctx = null;
  261 + }
  262 +
  263 + if (samples_ptr_out != null) {
  264 + for (int i = 0; i < samples_ptr_out.length; i++) {
  265 + av_free(samples_ptr_out[i].position(0));
  266 + }
  267 + samples_ptr_out = null;
  268 + samples_buf_out = null;
  269 + }
  270 +
  271 + if (samples_convert_ctx != null) {
  272 + swr_free(samples_convert_ctx);
  273 + samples_convert_ctx = null;
  274 + }
  275 +
  276 + got_frame = null;
  277 + frameGrabbed = false;
  278 + frame = null;
  279 + timestamp = 0;
  280 + frameNumber = 0;
  281 +
  282 + if (inputStream != null) {
  283 + try {
  284 + if (oc == null) {
  285 + // when called a second time
  286 + if (closeInputStream) {
  287 + inputStream.close();
  288 + }
  289 + } else {
  290 + inputStream.reset();
  291 + }
  292 + } catch (IOException ex) {
  293 + throw new Exception("Error on InputStream.close(): ", ex);
  294 + } finally {
  295 + inputStreams.remove(oc);
  296 + if (avio != null) {
  297 + if (avio.buffer() != null) {
  298 + av_free(avio.buffer());
  299 + avio.buffer(null);
  300 + }
  301 + av_free(avio);
  302 + avio = null;
  303 + }
  304 + if (oc != null) {
  305 + avformat_free_context(oc);
  306 + oc = null;
  307 + }
  308 + }
  309 + }
  310 + }
  311 +
  312 + @Override
  313 + protected void finalize() throws Throwable {
  314 + super.finalize();
  315 + release();
  316 + }
  317 +
  318 + static Map<Pointer, InputStream> inputStreams = Collections.synchronizedMap(new HashMap<Pointer, InputStream>());
  319 +
  320 + static class ReadCallback extends Read_packet_Pointer_BytePointer_int {
  321 + @Override
  322 + public int call(Pointer opaque, BytePointer buf, int buf_size) {
  323 + try {
  324 + byte[] b = new byte[buf_size];
  325 + InputStream is = inputStreams.get(opaque);
  326 + int size = is.read(b, 0, buf_size);
  327 + if (size < 0) {
  328 + return 0;
  329 + } else {
  330 + buf.put(b, 0, size);
  331 + return size;
  332 + }
  333 + } catch (Throwable t) {
  334 + System.err.println("Error on InputStream.read(): " + t);
  335 + return -1;
  336 + }
  337 + }
  338 + }
  339 +
  340 + static class SeekCallback extends Seek_Pointer_long_int {
  341 + @Override
  342 + public long call(Pointer opaque, long offset, int whence) {
  343 + try {
  344 + InputStream is = inputStreams.get(opaque);
  345 + long size = 0;
  346 + switch (whence) {
  347 + case 0:
  348 + is.reset();
  349 + break; // SEEK_SET
  350 + case 1:
  351 + break; // SEEK_CUR
  352 + case 2: // SEEK_END
  353 + is.reset();
  354 + while (true) {
  355 + long n = is.skip(Long.MAX_VALUE);
  356 + if (n == 0)
  357 + break;
  358 + size += n;
  359 + }
  360 + offset += size;
  361 + is.reset();
  362 + break;
  363 + case AVSEEK_SIZE:
  364 + long remaining = 0;
  365 + while (true) {
  366 + long n = is.skip(Long.MAX_VALUE);
  367 + if (n == 0)
  368 + break;
  369 + remaining += n;
  370 + }
  371 + is.reset();
  372 + while (true) {
  373 + long n = is.skip(Long.MAX_VALUE);
  374 + if (n == 0)
  375 + break;
  376 + size += n;
  377 + }
  378 + offset = size - remaining;
  379 + is.reset();
  380 + break;
  381 + default:
  382 + return -1;
  383 + }
  384 + long remaining = offset;
  385 + while (remaining > 0) {
  386 + long skipped = is.skip(remaining);
  387 + if (skipped == 0)
  388 + break; // end of the stream
  389 + remaining -= skipped;
  390 + }
  391 + return whence == AVSEEK_SIZE ? size : 0;
  392 + } catch (Throwable t) {
  393 + System.err.println("Error on InputStream.reset() or skip(): " + t);
  394 + return -1;
  395 + }
  396 + }
  397 + }
  398 +
  399 + static ReadCallback readCallback = new ReadCallback();
  400 + static SeekCallback seekCallback = new SeekCallback();
  401 + static {
  402 + PointerScope s = PointerScope.getInnerScope();
  403 + if (s != null) {
  404 + s.detach(readCallback);
  405 + s.detach(seekCallback);
  406 + }
  407 + }
  408 +
  409 + private InputStream inputStream;
  410 + private boolean closeInputStream;
  411 + private int maximumSize;
  412 + private AVIOContext avio;
  413 + private String filename;
  414 + private AVFormatContext oc;
  415 + private AVStream video_st, audio_st;
  416 + private AVCodecContext video_c, audio_c;
  417 + private AVFrame picture, picture_rgb;
  418 + private BytePointer[] image_ptr;
  419 + private Buffer[] image_buf;
  420 + private AVFrame samples_frame;
  421 + private BytePointer[] samples_ptr;
  422 + private Buffer[] samples_buf;
  423 + private BytePointer[] samples_ptr_out;
  424 + private Buffer[] samples_buf_out;
  425 + private AVPacket pkt, pkt2;
  426 + private int sizeof_pkt;
  427 + private int[] got_frame;
  428 + private SwsContext img_convert_ctx;
  429 + private SwrContext samples_convert_ctx;
  430 + private int samples_channels, samples_format, samples_rate;
  431 + private boolean frameGrabbed;
  432 + private Frame frame;
  433 +
  434 + public boolean isCloseInputStream() {
  435 + return closeInputStream;
  436 + }
  437 +
  438 + public void setCloseInputStream(boolean closeInputStream) {
  439 + this.closeInputStream = closeInputStream;
  440 + }
  441 +
  442 + /**
  443 + * Is there a video stream?
  444 + *
  445 + * @return {@code video_st!=null;}
  446 + */
  447 + public boolean hasVideo() {
  448 + return video_st != null;
  449 + }
  450 +
  451 + /**
  452 + * Is there an audio stream?
  453 + *
  454 + * @return {@code audio_st!=null;}
  455 + */
  456 + public boolean hasAudio() {
  457 + return audio_st != null;
  458 + }
  459 +
  460 + @Override
  461 + public double getGamma() {
  462 + // default to a gamma of 2.2 for cheap Webcams, DV cameras, etc.
  463 + if (gamma == 0.0) {
  464 + return 2.2;
  465 + } else {
  466 + return gamma;
  467 + }
  468 + }
  469 +
  470 + @Override
  471 + public String getFormat() {
  472 + if (oc == null) {
  473 + return super.getFormat();
  474 + } else {
  475 + return oc.iformat().name().getString();
  476 + }
  477 + }
  478 +
  479 + @Override
  480 + public int getImageWidth() {
  481 + return imageWidth > 0 || video_c == null ? super.getImageWidth() : video_c.width();
  482 + }
  483 +
  484 + @Override
  485 + public int getImageHeight() {
  486 + return imageHeight > 0 || video_c == null ? super.getImageHeight() : video_c.height();
  487 + }
  488 +
  489 + @Override
  490 + public int getAudioChannels() {
  491 + return audioChannels > 0 || audio_c == null ? super.getAudioChannels() : audio_c.channels();
  492 + }
  493 +
  494 + @Override
  495 + public int getPixelFormat() {
  496 + if (imageMode == ImageMode.COLOR || imageMode == ImageMode.GRAY) {
  497 + if (pixelFormat == AV_PIX_FMT_NONE) {
  498 + return imageMode == ImageMode.COLOR ? AV_PIX_FMT_BGR24 : AV_PIX_FMT_GRAY8;
  499 + } else {
  500 + return pixelFormat;
  501 + }
  502 + } else if (video_c != null) { // RAW
  503 + return video_c.pix_fmt();
  504 + } else {
  505 + return super.getPixelFormat();
  506 + }
  507 + }
  508 +
  509 + @Override
  510 + public int getVideoCodec() {
  511 + return video_c == null ? super.getVideoCodec() : video_c.codec_id();
  512 + }
  513 +
  514 + @Override
  515 + public int getVideoBitrate() {
  516 + return video_c == null ? super.getVideoBitrate() : (int) video_c.bit_rate();
  517 + }
  518 +
  519 + @Override
  520 + public double getAspectRatio() {
  521 + if (video_st == null) {
  522 + return super.getAspectRatio();
  523 + } else {
  524 + AVRational r = av_guess_sample_aspect_ratio(oc, video_st, picture);
  525 + double a = (double) r.num() / r.den();
  526 + return a == 0.0 ? 1.0 : a;
  527 + }
  528 + }
  529 +
  530 + /** Returns {@link #getVideoFrameRate()} */
  531 + @Override
  532 + public double getFrameRate() {
  533 + return getVideoFrameRate();
  534 + }
  535 +
   536 + /**
   537 + * Estimates the number of audio frames per second.
   538 + *
   539 + * @return {@code (double) getSampleRate() / samples_frame.nb_samples()} if
   540 + * samples_frame.nb_samples() is not zero, otherwise 0
   541 + */
  542 + public double getAudioFrameRate() {
  543 + if (audio_st == null) {
  544 + return 0.0;
  545 + } else {
  546 + if (samples_frame == null || samples_frame.nb_samples() == 0) {
  547 + try {
  548 + grabFrame(true, false, false, false);
  549 + frameGrabbed = true;
  550 + } catch (Exception e) {
  551 + return 0.0;
  552 + }
  553 + }
   554 + if (samples_frame != null && samples_frame.nb_samples() != 0)
  555 + return ((double) getSampleRate()) / samples_frame.nb_samples();
  556 + else
  557 + return 0.0;
  558 +
  559 + }
  560 + }
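The estimate above is simply the sample rate divided by the number of samples per decoded audio frame. A minimal standalone sketch of that arithmetic (the class and method names here are illustrative, not part of this grabber):

```java
public class AudioFrameRateSketch {
    // Frames per second = samples per second divided by samples per decoded
    // audio frame; 0 when no frame has been decoded yet.
    static double audioFrameRate(int sampleRate, int nbSamplesPerFrame) {
        return nbSamplesPerFrame == 0 ? 0.0 : (double) sampleRate / nbSamplesPerFrame;
    }

    public static void main(String[] args) {
        // AAC typically decodes 1024 samples per frame
        System.out.println(audioFrameRate(48000, 1024)); // 46.875
    }
}
```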
  561 +
  562 + public double getVideoFrameRate() {
  563 + if (video_st == null) {
  564 + return super.getFrameRate();
  565 + } else {
  566 + AVRational r = video_st.avg_frame_rate();
  567 + if (r.num() == 0 && r.den() == 0) {
  568 + r = video_st.r_frame_rate();
  569 + }
  570 + return (double) r.num() / r.den();
  571 + }
  572 + }
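getVideoFrameRate() prefers the stream's average frame rate and falls back to r_frame_rate only when the average is 0/0. A self-contained sketch of that selection logic (names are illustrative; the real values come from AVRational fields):

```java
public class VideoFrameRateSketch {
    // Use the average frame rate unless it is 0/0, in which case fall back
    // to the "real base" frame rate, then convert the rational to a double.
    static double frameRate(int avgNum, int avgDen, int rNum, int rDen) {
        int num = avgNum, den = avgDen;
        if (num == 0 && den == 0) {
            num = rNum;
            den = rDen;
        }
        return (double) num / den;
    }

    public static void main(String[] args) {
        // NTSC video is commonly 30000/1001, i.e. about 29.97 fps
        System.out.println(frameRate(0, 0, 30000, 1001));
    }
}
```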
  573 +
  574 + @Override
  575 + public int getAudioCodec() {
  576 + return audio_c == null ? super.getAudioCodec() : audio_c.codec_id();
  577 + }
  578 +
  579 + @Override
  580 + public int getAudioBitrate() {
  581 + return audio_c == null ? super.getAudioBitrate() : (int) audio_c.bit_rate();
  582 + }
  583 +
  584 + @Override
  585 + public int getSampleFormat() {
  586 + if (sampleMode == SampleMode.SHORT || sampleMode == SampleMode.FLOAT) {
  587 + if (sampleFormat == AV_SAMPLE_FMT_NONE) {
  588 + return sampleMode == SampleMode.SHORT ? AV_SAMPLE_FMT_S16 : AV_SAMPLE_FMT_FLT;
  589 + } else {
  590 + return sampleFormat;
  591 + }
  592 + } else if (audio_c != null) { // RAW
  593 + return audio_c.sample_fmt();
  594 + } else {
  595 + return super.getSampleFormat();
  596 + }
  597 + }
  598 +
  599 + @Override
  600 + public int getSampleRate() {
  601 + return sampleRate > 0 || audio_c == null ? super.getSampleRate() : audio_c.sample_rate();
  602 + }
  603 +
  604 + @Override
  605 + public Map<String, String> getMetadata() {
  606 + if (oc == null) {
  607 + return super.getMetadata();
  608 + }
  609 + AVDictionaryEntry entry = null;
  610 + Map<String, String> metadata = new HashMap<String, String>();
  611 + while ((entry = av_dict_get(oc.metadata(), "", entry, AV_DICT_IGNORE_SUFFIX)) != null) {
  612 + metadata.put(entry.key().getString(), entry.value().getString());
  613 + }
  614 + return metadata;
  615 + }
  616 +
  617 + @Override
  618 + public Map<String, String> getVideoMetadata() {
  619 + if (video_st == null) {
  620 + return super.getVideoMetadata();
  621 + }
  622 + AVDictionaryEntry entry = null;
  623 + Map<String, String> metadata = new HashMap<String, String>();
  624 + while ((entry = av_dict_get(video_st.metadata(), "", entry, AV_DICT_IGNORE_SUFFIX)) != null) {
  625 + metadata.put(entry.key().getString(), entry.value().getString());
  626 + }
  627 + return metadata;
  628 + }
  629 +
  630 + @Override
  631 + public Map<String, String> getAudioMetadata() {
  632 + if (audio_st == null) {
  633 + return super.getAudioMetadata();
  634 + }
  635 + AVDictionaryEntry entry = null;
  636 + Map<String, String> metadata = new HashMap<String, String>();
  637 + while ((entry = av_dict_get(audio_st.metadata(), "", entry, AV_DICT_IGNORE_SUFFIX)) != null) {
  638 + metadata.put(entry.key().getString(), entry.value().getString());
  639 + }
  640 + return metadata;
  641 + }
  642 +
  643 + @Override
  644 + public String getMetadata(String key) {
  645 + if (oc == null) {
  646 + return super.getMetadata(key);
  647 + }
  648 + AVDictionaryEntry entry = av_dict_get(oc.metadata(), key, null, 0);
  649 + return entry == null || entry.value() == null ? null : entry.value().getString();
  650 + }
  651 +
  652 + @Override
  653 + public String getVideoMetadata(String key) {
  654 + if (video_st == null) {
  655 + return super.getVideoMetadata(key);
  656 + }
  657 + AVDictionaryEntry entry = av_dict_get(video_st.metadata(), key, null, 0);
  658 + return entry == null || entry.value() == null ? null : entry.value().getString();
  659 + }
  660 +
  661 + @Override
  662 + public String getAudioMetadata(String key) {
  663 + if (audio_st == null) {
  664 + return super.getAudioMetadata(key);
  665 + }
  666 + AVDictionaryEntry entry = av_dict_get(audio_st.metadata(), key, null, 0);
  667 + return entry == null || entry.value() == null ? null : entry.value().getString();
  668 + }
  669 +
  670 + /**
   671 + * Overrides super.setFrameNumber: seeks to a position close to the video
   672 + * frame with that number.
  673 + */
  674 + @Override
  675 + public void setFrameNumber(int frameNumber) throws Exception {
  676 + if (hasVideo())
  677 + setTimestamp(Math.round(1000000L * frameNumber / getFrameRate()));
  678 + else
  679 + super.frameNumber = frameNumber;
  680 + }
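Both setFrameNumber and setVideoFrameNumber convert a frame number into a microsecond timestamp with the same rounding. A minimal sketch of that conversion (the class and method names are hypothetical):

```java
public class FrameNumberSketch {
    // Frame n at the given fps starts at round(1e6 * n / fps) microseconds,
    // matching Math.round(1000000L * frameNumber / getFrameRate()) above.
    static long frameToTimestampMicros(int frameNumber, double frameRate) {
        return Math.round(1000000L * frameNumber / frameRate);
    }

    public static void main(String[] args) {
        System.out.println(frameToTimestampMicros(50, 25.0)); // 2000000
        System.out.println(frameToTimestampMicros(1, 30000.0 / 1001.0)); // 33367
    }
}
```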
  681 +
  682 + /**
   683 + * If there is a video stream, tries to seek to the video frame with the
   684 + * corresponding timestamp; otherwise only sets super.frameNumber, because
   685 + * frameRate == 0 when there is no video stream.
  686 + */
  687 + public void setVideoFrameNumber(int frameNumber) throws Exception {
  688 + // best guess, AVSEEK_FLAG_FRAME has not been implemented in FFmpeg...
  689 + if (hasVideo())
  690 + setVideoTimestamp(Math.round(1000000L * frameNumber / getFrameRate()));
  691 + else
  692 + super.frameNumber = frameNumber;
  693 + }
  694 +
  695 + /**
   696 + * If there is an audio stream, tries to seek to the audio frame with the
   697 + * corresponding timestamp; otherwise the call is ignored.
  698 + */
  699 + public void setAudioFrameNumber(int frameNumber) throws Exception {
  700 + // best guess, AVSEEK_FLAG_FRAME has not been implemented in FFmpeg...
  701 + if (hasAudio())
  702 + setAudioTimestamp(Math.round(1000000L * frameNumber / getAudioFrameRate()));
  703 +
  704 + }
  705 +
  706 + /**
  707 + * setTimestamp without checking frame content (using old code used in JavaCV
  708 + * versions prior to 1.4.1)
  709 + */
  710 + @Override
  711 + public void setTimestamp(long timestamp) throws Exception {
  712 + setTimestamp(timestamp, false);
  713 + }
  714 +
  715 + /**
   716 + * setTimestamp with the possibility to choose between the old quick seek code
   717 + * and new code that checks frame content. The frame check can be useful with
   718 + * corrupted files, where seeking may end up at an empty frame containing
   719 + * neither video nor audio.
  720 + */
  721 + public void setTimestamp(long timestamp, boolean checkFrame) throws Exception {
  722 + setTimestamp(timestamp, checkFrame ? EnumSet.of(Frame.Type.VIDEO, Frame.Type.AUDIO) : null);
  723 + }
  724 +
  725 + /** setTimestamp with resulting video frame type if there is a video stream */
  726 + public void setVideoTimestamp(long timestamp) throws Exception {
  727 + setTimestamp(timestamp, EnumSet.of(Frame.Type.VIDEO));
  728 + }
  729 +
  730 + /** setTimestamp with resulting audio frame type if there is an audio stream */
  731 + public void setAudioTimestamp(long timestamp) throws Exception {
  732 + setTimestamp(timestamp, EnumSet.of(Frame.Type.AUDIO));
  733 + }
  734 +
  735 + /**
   736 + * setTimestamp with a preference for the type of the resulting frame: video
   737 + * (frameTypesToSeek contains only Frame.Type.VIDEO), audio (frameTypesToSeek
   738 + * contains only Frame.Type.AUDIO), or either (frameTypesToSeek contains both)
  739 + */
  740 + private void setTimestamp(long timestamp, EnumSet<Frame.Type> frameTypesToSeek) throws Exception {
  741 + int ret;
  742 + if (oc == null) {
  743 + super.setTimestamp(timestamp);
  744 + } else {
  745 + timestamp = timestamp * AV_TIME_BASE / 1000000L;
  746 + /* add the stream start time */
  747 + if (oc.start_time() != AV_NOPTS_VALUE) {
  748 + timestamp += oc.start_time();
  749 + }
  750 + if ((ret = avformat_seek_file(oc, -1, Long.MIN_VALUE, timestamp, Long.MAX_VALUE,
  751 + AVSEEK_FLAG_BACKWARD)) < 0) {
  752 + throw new Exception(
  753 + "avformat_seek_file() error " + ret + ": Could not seek file to timestamp " + timestamp + ".");
  754 + }
  755 + if (video_c != null) {
  756 + avcodec_flush_buffers(video_c);
  757 + }
  758 + if (audio_c != null) {
  759 + avcodec_flush_buffers(audio_c);
  760 + }
  761 + if (pkt2.size() > 0) {
  762 + pkt2.size(0);
  763 + av_packet_unref(pkt);
  764 + }
  765 + /*
  766 + * After the call of ffmpeg's avformat_seek_file(...) with the flag set to
  767 + * AVSEEK_FLAG_BACKWARD the decoding position should be located before the
  768 + * requested timestamp in a closest position from which all the active streams
  769 + * can be decoded successfully. The following seeking consists of two stages: 1.
  770 + * Grab frames till the frame corresponding to that "closest" position (the
  771 + * first frame containing decoded data).
  772 + *
  773 + * 2. Grab frames till the desired timestamp is reached. The number of steps is
  774 + * restricted by doubled estimation of frames between that "closest" position
  775 + * and the desired position.
  776 + *
  777 + * frameTypesToSeek parameter sets the preferred type of frames to seek. It can
  778 + * be chosen from three possible types: VIDEO, AUDIO or any of them. The setting
  779 + * means only a preference in the type. That is, if VIDEO or AUDIO is specified
  780 + * but the file does not have video or audio stream - any type will be used
  781 + * instead.
  782 + *
  783 + *
   784 + * TODO Sometimes ffmpeg's avformat_seek_file(...) brings us not to a position
   785 + * before the desired one but a few frames after it. What could be the solution
   786 + * in this case if we really need frame-precision seeking? We could probably
   787 + * try requesting an even earlier timestamp and check whether that brings us
   788 + * before the desired position.
  789 + *
  790 + */
  791 +
  792 + if (frameTypesToSeek != null) { // new code providing check of frame content while seeking to the timestamp
  793 + boolean has_video = hasVideo();
  794 + boolean has_audio = hasAudio();
  795 +
  796 + if (has_video || has_audio) {
  797 + if ((frameTypesToSeek.contains(Frame.Type.VIDEO) && !has_video)
  798 + || (frameTypesToSeek.contains(Frame.Type.AUDIO) && !has_audio))
  799 + frameTypesToSeek = EnumSet.of(Frame.Type.VIDEO, Frame.Type.AUDIO);
  800 +
  801 + long initialSeekPosition = Long.MIN_VALUE;
  802 + long maxSeekSteps = 0;
  803 + long count = 0;
  804 + Frame seekFrame = null;
  805 +
  806 + while (count++ < 1000) { // seek to a first frame containing video or audio after
  807 + // avformat_seek_file(...)
  808 + seekFrame = grabFrame(true, true, false, false);
  809 + if (seekFrame == null)
  810 + return; // is it better to throw NullPointerException?
  811 + EnumSet<Frame.Type> frameTypes = seekFrame.getTypes();
  812 + frameTypes.retainAll(frameTypesToSeek);
  813 + if (!frameTypes.isEmpty()) {
  814 + initialSeekPosition = seekFrame.timestamp;
  815 + // the position closest to the requested timestamp from which it can be reached
  816 + // by sequential grabFrame calls
  817 + break;
  818 + }
  819 + }
  820 + if (has_video && this.getFrameRate() > 0) {
  821 + // estimation of video frame duration
  822 + double deltaTimeStamp = 1000000.0 / this.getFrameRate();
  823 + if (initialSeekPosition < timestamp - deltaTimeStamp / 2)
  824 + maxSeekSteps = (long) (10 * (timestamp - initialSeekPosition) / deltaTimeStamp);
  825 + } else if (has_audio && this.getAudioFrameRate() > 0) {
  826 + // estimation of audio frame duration
  827 + double deltaTimeStamp = 1000000.0 / this.getAudioFrameRate();
  828 + if (initialSeekPosition < timestamp - deltaTimeStamp / 2)
  829 + maxSeekSteps = (long) (10 * (timestamp - initialSeekPosition) / deltaTimeStamp);
  830 + } else
  831 + // zero frameRate
  832 + if (initialSeekPosition < timestamp - 1L)
  833 + maxSeekSteps = 1000;
  834 +
  835 + count = 0;
  836 + while (count < maxSeekSteps) {
  837 + seekFrame = grabFrame(true, true, false, false);
  838 + if (seekFrame == null)
  839 + return; // is it better to throw NullPointerException?
  840 + EnumSet<Frame.Type> frameTypes = seekFrame.getTypes();
  841 + frameTypes.retainAll(frameTypesToSeek);
  842 + if (!frameTypes.isEmpty()) {
  843 + count++;
  844 + if (this.timestamp >= timestamp - 1)
  845 + break;
  846 + }
  847 + }
  848 +
  849 + frameGrabbed = true;
  850 + }
  851 + } else { // old quick seeking code used in JavaCV versions prior to 1.4.1
  852 + /*
   853 + * Comparing to timestamp +/- 1 avoids rounding issues for frame rates that
   854 + * are not proper divisors of 1000000, e.g. where av_frame_get_best_effort_timestamp
   855 + * in grabFrame sets this.timestamp to ...666 while the given timestamp has been
   856 + * rounded to ...667 (or vice versa).
  857 + */
  858 + int count = 0; // prevent infinite loops with corrupted files
  859 + while (this.timestamp > timestamp + 1 && grabFrame(true, true, false, false) != null
  860 + && count++ < 1000) {
  861 + // flush frames if seeking backwards
  862 + }
  863 + count = 0;
  864 + while (this.timestamp < timestamp - 1 && grabFrame(true, true, false, false) != null
  865 + && count++ < 1000) {
  866 + // decode up to the desired frame
  867 + }
  868 + frameGrabbed = true;
  869 + }
  870 + }
  871 + }
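The +/- 1 microsecond tolerance used in the seek loops exists because, for frame rates that do not divide 1000000 evenly, the same frame time can surface either truncated or rounded. A small demonstration of the off-by-one (pure arithmetic, no FFmpeg involved):

```java
public class SeekToleranceSketch {
    public static void main(String[] args) {
        // For 30000/1001 fps the exact time of frame 1 is 33366.66... us:
        // a truncating conversion yields ...366 while rounding yields ...367,
        // so an exact equality comparison on timestamps would never match.
        double fps = 30000.0 / 1001.0;
        long truncated = (long) (1000000L * 1 / fps);
        long rounded = Math.round(1000000L * 1 / fps);
        System.out.println(truncated + " vs " + rounded); // 33366 vs 33367
    }
}
```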
  872 +
  873 + /** Returns {@link #getLengthInVideoFrames()} */
  874 + @Override
  875 + public int getLengthInFrames() {
  876 + // best guess...
  877 + return getLengthInVideoFrames();
  878 + }
  879 +
  880 + @Override
  881 + public long getLengthInTime() {
  882 + return oc.duration() * 1000000L / AV_TIME_BASE;
  883 + }
  884 +
  885 + /**
  886 + * Returns
  887 + * {@code (int) Math.round(getLengthInTime() * getFrameRate() / 1000000L)},
  888 + * which is an approximation in general.
  889 + */
  890 + public int getLengthInVideoFrames() {
  891 + // best guess...
  892 + return (int) Math.round(getLengthInTime() * getFrameRate() / 1000000L);
  893 + }
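getLengthInVideoFrames() derives an approximate frame count from the container duration and the frame rate. A standalone sketch of that computation (illustrative names; durations are in microseconds as elsewhere in this class):

```java
public class LengthInFramesSketch {
    // Approximate frame count = duration in seconds times frames per second,
    // mirroring Math.round(getLengthInTime() * getFrameRate() / 1000000L).
    static int lengthInVideoFrames(long lengthInTimeMicros, double frameRate) {
        return (int) Math.round(lengthInTimeMicros * frameRate / 1000000L);
    }

    public static void main(String[] args) {
        // a 10-second clip at 25 fps has about 250 frames
        System.out.println(lengthInVideoFrames(10_000_000L, 25.0)); // 250
    }
}
```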
  894 +
  895 + public int getLengthInAudioFrames() {
  896 + // best guess...
  897 + double afr = getAudioFrameRate();
  898 + if (afr > 0)
  899 + return (int) (getLengthInTime() * afr / 1000000L);
  900 + else
  901 + return 0;
  902 + }
  903 +
  904 + public AVFormatContext getFormatContext() {
  905 + return oc;
  906 + }
  907 +
  908 + public void start(String streamCode) throws Exception {
  909 + synchronized (org.bytedeco.ffmpeg.global.avcodec.class) {
  910 + startUnsafe(streamCode);
  911 + }
  912 + }
  913 +
  914 + public void startUnsafe(String streamCode) throws Exception {
  915 + if (oc != null && !oc.isNull()) {
  916 + throw new Exception("start() has already been called: Call stop() before calling start() again.");
  917 + }
  918 +
  919 + int ret;
  920 + img_convert_ctx = null;
  921 + oc = new AVFormatContext(null);
  922 + video_c = null;
  923 + audio_c = null;
  924 + pkt = new AVPacket();
  925 + pkt2 = new AVPacket();
  926 + sizeof_pkt = pkt.sizeof();
  927 + got_frame = new int[1];
  928 + frameGrabbed = false;
  929 + frame = new Frame();
  930 + timestamp = 0;
  931 + frameNumber = 0;
  932 +
  933 + pkt2.size(0);
  934 +
  935 + // Open video file
  936 + AVInputFormat f = null;
  937 + if (format != null && format.length() > 0) {
  938 + if ((f = av_find_input_format(format)) == null) {
  939 + throw new Exception("av_find_input_format() error: Could not find input format \"" + format + "\".");
  940 + }
  941 + }
  942 + AVDictionary options = new AVDictionary(null);
  943 + if (frameRate > 0) {
  944 + AVRational r = av_d2q(frameRate, 1001000);
  945 + av_dict_set(options, "framerate", r.num() + "/" + r.den(), 0);
  946 + }
  947 + if (pixelFormat >= 0) {
  948 + av_dict_set(options, "pixel_format", av_get_pix_fmt_name(pixelFormat).getString(), 0);
  949 + } else if (imageMode != ImageMode.RAW) {
  950 + av_dict_set(options, "pixel_format", imageMode == ImageMode.COLOR ? "bgr24" : "gray8", 0);
  951 + }
  952 + if (imageWidth > 0 && imageHeight > 0) {
  953 + av_dict_set(options, "video_size", imageWidth + "x" + imageHeight, 0);
  954 + }
  955 + if (sampleRate > 0) {
  956 + av_dict_set(options, "sample_rate", "" + sampleRate, 0);
  957 + }
  958 + if (audioChannels > 0) {
  959 + av_dict_set(options, "channels", "" + audioChannels, 0);
  960 + }
  961 + for (Entry<String, String> e : this.options.entrySet()) {
  962 + av_dict_set(options, e.getKey(), e.getValue(), 0);
  963 + }
  964 + if (inputStream != null) {
  965 + if (!inputStream.markSupported()) {
  966 + inputStream = new BufferedInputStream(inputStream);
  967 + }
  968 + inputStream.mark(maximumSize);
  969 + oc = avformat_alloc_context();
  970 + avio = avio_alloc_context(new BytePointer(av_malloc(4096)), 4096, 0, oc, readCallback, null, seekCallback);
  971 + oc.pb(avio);
  972 +
  973 + filename = inputStream.toString();
  974 + inputStreams.put(oc, inputStream);
  975 + }
  976 + if ((ret = avformat_open_input(oc, filename, f, options)) < 0) {
  977 + av_dict_set(options, "pixel_format", null, 0);
  978 + if ((ret = avformat_open_input(oc, filename, f, options)) < 0) {
  979 + throw new Exception("avformat_open_input() error " + ret + ": Could not open input \"" + filename
  980 + + "\". (Has setFormat() been called?)");
  981 + }
  982 + }
  983 + av_dict_free(options);
  984 +
  985 + oc.max_delay(maxDelay);
  986 + // Retrieve stream information
   987 + // Limit the maximum amount of data that avformat_find_stream_info reads internally
  988 + oc.probesize(Integer.parseInt(streamCode));
   989 + // Cap how long avformat_find_stream_info may run; it stops once this duration is exceeded even if analysis has not finished
  990 + oc.max_analyze_duration(5 * AV_TIME_BASE);
   991 + // Do not keep the packets read internally by avformat_find_stream_info in AVFormatContext's packet_buffer
  992 +// oc.flags(AVFormatContext.AVFMT_FLAG_NOBUFFER);
  993 +
  994 + AVDictionary optionOut = new AVDictionary(null);
  995 + if ((ret = avformat_find_stream_info(oc, (PointerPointer) null)) < 0) {
  996 + throw new Exception("avformat_find_stream_info() error " + ret + ": Could not find stream information.");
  997 + }
  998 + if (av_log_get_level() >= AV_LOG_INFO) {
  999 + // Dump information about file onto standard error
  1000 + av_dump_format(oc, 0, filename, 0);
  1001 + }
  1002 +
  1003 + // Find the first video and audio stream, unless the user specified otherwise
  1004 + video_st = audio_st = null;
  1005 + AVCodecParameters video_par = null, audio_par = null;
  1006 + int nb_streams = oc.nb_streams();
  1007 + for (int i = 0; i < nb_streams; i++) {
  1008 + AVStream st = oc.streams(i);
  1009 + // Get a pointer to the codec context for the video or audio stream
  1010 + AVCodecParameters par = st.codecpar();
  1011 + if (video_st == null && par.codec_type() == AVMEDIA_TYPE_VIDEO && (videoStream < 0 || videoStream == i)) {
  1012 + video_st = st;
  1013 + video_par = par;
  1014 + videoStream = i;
  1015 + } else if (audio_st == null && par.codec_type() == AVMEDIA_TYPE_AUDIO
  1016 + && (audioStream < 0 || audioStream == i)) {
  1017 + audio_st = st;
  1018 + audio_par = par;
  1019 + audioStream = i;
  1020 + }
  1021 + }
  1022 + if (video_st == null && audio_st == null) {
  1023 + throw new Exception("Did not find a video or audio stream inside \"" + filename + "\" for videoStream == "
  1024 + + videoStream + " and audioStream == " + audioStream + ".");
  1025 + }
  1026 +
  1027 + if (video_st != null) {
  1028 + // Find the decoder for the video stream
  1029 + AVCodec codec = avcodec_find_decoder_by_name(videoCodecName);
  1030 + if (codec == null) {
  1031 + codec = avcodec_find_decoder(video_par.codec_id());
  1032 + }
  1033 + if (codec == null) {
  1034 + throw new Exception("avcodec_find_decoder() error: Unsupported video format or codec not found: "
  1035 + + video_par.codec_id() + ".");
  1036 + }
  1037 +
  1038 + /* Allocate a codec context for the decoder */
  1039 + if ((video_c = avcodec_alloc_context3(codec)) == null) {
  1040 + throw new Exception("avcodec_alloc_context3() error: Could not allocate video decoding context.");
  1041 + }
  1042 +
  1043 + /* copy the stream parameters from the muxer */
  1044 + if ((ret = avcodec_parameters_to_context(video_c, video_st.codecpar())) < 0) {
  1045 + releaseUnsafe();
  1046 + throw new Exception(
  1047 + "avcodec_parameters_to_context() error: Could not copy the video stream parameters.");
  1048 + }
  1049 +
  1050 + options = new AVDictionary(null);
  1051 + for (Entry<String, String> e : videoOptions.entrySet()) {
  1052 + av_dict_set(options, e.getKey(), e.getValue(), 0);
  1053 + }
  1054 +
  1055 + // Enable multithreading when available
  1056 + video_c.thread_count(0);
  1057 +
  1058 + // Open video codec
  1059 + if ((ret = avcodec_open2(video_c, codec, options)) < 0) {
  1060 + throw new Exception("avcodec_open2() error " + ret + ": Could not open video codec.");
  1061 + }
  1062 + av_dict_free(options);
  1063 +
  1064 + // Hack to correct wrong frame rates that seem to be generated by some codecs
  1065 + if (video_c.time_base().num() > 1000 && video_c.time_base().den() == 1) {
  1066 + video_c.time_base().den(1000);
  1067 + }
  1068 +
  1069 + // Allocate video frame and an AVFrame structure for the RGB image
  1070 + if ((picture = av_frame_alloc()) == null) {
  1071 + throw new Exception("av_frame_alloc() error: Could not allocate raw picture frame.");
  1072 + }
  1073 + if ((picture_rgb = av_frame_alloc()) == null) {
  1074 + throw new Exception("av_frame_alloc() error: Could not allocate RGB picture frame.");
  1075 + }
  1076 +
  1077 + initPictureRGB();
  1078 + }
  1079 +
  1080 + if (audio_st != null) {
  1081 + // Find the decoder for the audio stream
  1082 + AVCodec codec = avcodec_find_decoder_by_name(audioCodecName);
  1083 + if (codec == null) {
  1084 + codec = avcodec_find_decoder(audio_par.codec_id());
  1085 + }
  1086 + if (codec == null) {
  1087 +// throw new Exception("avcodec_find_decoder() error: Unsupported audio format or codec not found: "
  1088 +// + audio_par.codec_id() + ".");
  1089 + } else {
  1090 + /* Allocate a codec context for the decoder */
  1091 + if ((audio_c = avcodec_alloc_context3(codec)) == null) {
  1092 + throw new Exception("avcodec_alloc_context3() error: Could not allocate audio decoding context.");
  1093 + }
  1094 +
  1095 + /* copy the stream parameters from the muxer */
  1096 + if ((ret = avcodec_parameters_to_context(audio_c, audio_st.codecpar())) < 0) {
  1097 + releaseUnsafe();
  1098 + throw new Exception(
  1099 + "avcodec_parameters_to_context() error: Could not copy the audio stream parameters.");
  1100 + }
  1101 +
  1102 + options = new AVDictionary(null);
  1103 + for (Entry<String, String> e : audioOptions.entrySet()) {
  1104 + av_dict_set(options, e.getKey(), e.getValue(), 0);
  1105 + }
  1106 +
  1107 + // Enable multithreading when available
  1108 + audio_c.thread_count(0);
  1109 +
  1110 + // Open audio codec
  1111 + if ((ret = avcodec_open2(audio_c, codec, options)) < 0) {
  1112 + throw new Exception("avcodec_open2() error " + ret + ": Could not open audio codec.");
  1113 + }
  1114 + av_dict_free(options);
  1115 +
  1116 + // Allocate audio samples frame
  1117 + if ((samples_frame = av_frame_alloc()) == null) {
  1118 + throw new Exception("av_frame_alloc() error: Could not allocate audio frame.");
  1119 + }
  1120 + samples_ptr = new BytePointer[] { null };
  1121 + samples_buf = new Buffer[] { null };
  1122 +
  1123 + }
  1124 + }
  1125 + }
  1126 +
  1127 + private void initPictureRGB() {
  1128 + int width = imageWidth > 0 ? imageWidth : video_c.width();
  1129 + int height = imageHeight > 0 ? imageHeight : video_c.height();
  1130 +
  1131 + switch (imageMode) {
  1132 + case COLOR:
  1133 + case GRAY:
   1134 + // If the size changes, a new allocation is needed -> free the old one.
  1135 + if (image_ptr != null) {
  1136 + // First kill all references, then free it.
  1137 + image_buf = null;
  1138 + BytePointer[] temp = image_ptr;
  1139 + image_ptr = null;
  1140 + av_free(temp[0]);
  1141 + }
  1142 + int fmt = getPixelFormat();
  1143 +
  1144 + // work around bug in swscale: https://trac.ffmpeg.org/ticket/1031
  1145 + int align = 32;
  1146 + int stride = width;
  1147 + for (int i = 1; i <= align; i += i) {
  1148 + stride = (width + (i - 1)) & ~(i - 1);
  1149 + av_image_fill_linesizes(picture_rgb.linesize(), fmt, stride);
  1150 + if ((picture_rgb.linesize(0) & (align - 1)) == 0) {
  1151 + break;
  1152 + }
  1153 + }
  1154 +
  1155 + // Determine required buffer size and allocate buffer
  1156 + int size = av_image_get_buffer_size(fmt, stride, height, 1);
  1157 + image_ptr = new BytePointer[] { new BytePointer(av_malloc(size)).capacity(size) };
  1158 + image_buf = new Buffer[] { image_ptr[0].asBuffer() };
  1159 +
  1160 + // Assign appropriate parts of buffer to image planes in picture_rgb
  1161 + // Note that picture_rgb is an AVFrame, but AVFrame is a superset of AVPicture
  1162 + av_image_fill_arrays(new PointerPointer(picture_rgb), picture_rgb.linesize(), image_ptr[0], fmt, stride,
  1163 + height, 1);
  1164 + picture_rgb.format(fmt);
  1165 + picture_rgb.width(width);
  1166 + picture_rgb.height(height);
  1167 + break;
  1168 +
  1169 + case RAW:
  1170 + image_ptr = new BytePointer[] { null };
  1171 + image_buf = new Buffer[] { null };
  1172 + break;
  1173 +
  1174 + default:
  1175 + assert false;
  1176 + }
  1177 + }
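The swscale workaround in initPictureRGB rounds the stride up to a power-of-two boundary using the classic bit trick `(width + (i - 1)) & ~(i - 1)`. A minimal sketch of that alignment step in isolation (hypothetical helper, not part of this class):

```java
public class StrideAlignSketch {
    // Rounds width up to the next multiple of pow2 (which must be a power of
    // two): adding pow2-1 then clearing the low bits, as in the loop above.
    static int alignUp(int width, int pow2) {
        return (width + (pow2 - 1)) & ~(pow2 - 1);
    }

    public static void main(String[] args) {
        System.out.println(alignUp(1918, 32)); // 1920
        System.out.println(alignUp(1920, 32)); // 1920, already aligned
    }
}
```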
  1178 +
  1179 + public void stop() throws Exception {
  1180 + release();
  1181 + }
  1182 +
  1183 + public void trigger() throws Exception {
  1184 + if (oc == null || oc.isNull()) {
  1185 + throw new Exception("Could not trigger: No AVFormatContext. (Has start() been called?)");
  1186 + }
  1187 + if (pkt2.size() > 0) {
  1188 + pkt2.size(0);
  1189 + av_packet_unref(pkt);
  1190 + }
  1191 + for (int i = 0; i < numBuffers + 1; i++) {
  1192 + if (av_read_frame(oc, pkt) < 0) {
  1193 + return;
  1194 + }
  1195 + av_packet_unref(pkt);
  1196 + }
  1197 + }
  1198 +
  1199 + private void processImage() throws Exception {
  1200 + frame.imageWidth = imageWidth > 0 ? imageWidth : video_c.width();
  1201 + frame.imageHeight = imageHeight > 0 ? imageHeight : video_c.height();
  1202 + frame.imageDepth = Frame.DEPTH_UBYTE;
  1203 + switch (imageMode) {
  1204 + case COLOR:
  1205 + case GRAY:
  1206 + // Deinterlace Picture
  1207 + if (deinterlace) {
  1208 + throw new Exception("Cannot deinterlace: Functionality moved to FFmpegFrameFilter.");
  1209 + }
  1210 +
  1211 + // Has the size changed?
  1212 + if (frame.imageWidth != picture_rgb.width() || frame.imageHeight != picture_rgb.height()) {
  1213 + initPictureRGB();
  1214 + }
  1215 +
  1216 + // Convert the image into BGR or GRAY format that OpenCV uses
  1217 + img_convert_ctx = sws_getCachedContext(img_convert_ctx, video_c.width(), video_c.height(),
  1218 + video_c.pix_fmt(), frame.imageWidth, frame.imageHeight, getPixelFormat(),
  1219 + imageScalingFlags != 0 ? imageScalingFlags : SWS_BILINEAR, null, null, (DoublePointer) null);
  1220 + if (img_convert_ctx == null) {
  1221 + throw new Exception("sws_getCachedContext() error: Cannot initialize the conversion context.");
  1222 + }
  1223 +
  1224 + // Convert the image from its native format to RGB or GRAY
  1225 + sws_scale(img_convert_ctx, new PointerPointer(picture), picture.linesize(), 0, video_c.height(),
  1226 + new PointerPointer(picture_rgb), picture_rgb.linesize());
  1227 + frame.imageStride = picture_rgb.linesize(0);
  1228 + frame.image = image_buf;
  1229 + frame.opaque = picture_rgb;
  1230 + break;
  1231 +
  1232 + case RAW:
  1233 + frame.imageStride = picture.linesize(0);
  1234 + BytePointer ptr = picture.data(0);
  1235 + if (ptr != null && !ptr.equals(image_ptr[0])) {
  1236 + image_ptr[0] = ptr.capacity(frame.imageHeight * frame.imageStride);
  1237 + image_buf[0] = ptr.asBuffer();
  1238 + }
  1239 + frame.image = image_buf;
  1240 + frame.opaque = picture;
  1241 + break;
  1242 +
  1243 + default:
  1244 + assert false;
  1245 + }
  1246 + frame.image[0].limit(frame.imageHeight * frame.imageStride);
  1247 + frame.imageChannels = frame.imageStride / frame.imageWidth;
  1248 + }
  1249 +
  1250 + private void processSamples() throws Exception {
  1251 + int ret;
  1252 +
  1253 + int sample_format = samples_frame.format();
  1254 + int planes = av_sample_fmt_is_planar(sample_format) != 0 ? (int) samples_frame.channels() : 1;
  1255 + int data_size = av_samples_get_buffer_size((IntPointer) null, audio_c.channels(), samples_frame.nb_samples(),
  1256 + audio_c.sample_fmt(), 1) / planes;
  1257 + if (samples_buf == null || samples_buf.length != planes) {
  1258 + samples_ptr = new BytePointer[planes];
  1259 + samples_buf = new Buffer[planes];
  1260 + }
  1261 + frame.sampleRate = audio_c.sample_rate();
  1262 + frame.audioChannels = audio_c.channels();
  1263 + frame.samples = samples_buf;
  1264 + frame.opaque = samples_frame;
  1265 + int sample_size = data_size / av_get_bytes_per_sample(sample_format);
  1266 + for (int i = 0; i < planes; i++) {
  1267 + BytePointer p = samples_frame.data(i);
  1268 + if (!p.equals(samples_ptr[i]) || samples_ptr[i].capacity() < data_size) {
  1269 + samples_ptr[i] = p.capacity(data_size);
  1270 + ByteBuffer b = p.asBuffer();
  1271 + switch (sample_format) {
  1272 + case AV_SAMPLE_FMT_U8:
  1273 + case AV_SAMPLE_FMT_U8P:
  1274 + samples_buf[i] = b;
  1275 + break;
  1276 + case AV_SAMPLE_FMT_S16:
  1277 + case AV_SAMPLE_FMT_S16P:
  1278 + samples_buf[i] = b.asShortBuffer();
  1279 + break;
  1280 + case AV_SAMPLE_FMT_S32:
  1281 + case AV_SAMPLE_FMT_S32P:
  1282 + samples_buf[i] = b.asIntBuffer();
  1283 + break;
  1284 + case AV_SAMPLE_FMT_FLT:
  1285 + case AV_SAMPLE_FMT_FLTP:
  1286 + samples_buf[i] = b.asFloatBuffer();
  1287 + break;
  1288 + case AV_SAMPLE_FMT_DBL:
  1289 + case AV_SAMPLE_FMT_DBLP:
  1290 + samples_buf[i] = b.asDoubleBuffer();
  1291 + break;
  1292 + default:
  1293 + assert false;
  1294 + }
  1295 + }
  1296 + samples_buf[i].position(0).limit(sample_size);
  1297 + }
  1298 +
  1299 + if (audio_c.channels() != getAudioChannels() || audio_c.sample_fmt() != getSampleFormat()
  1300 + || audio_c.sample_rate() != getSampleRate()) {
  1301 + if (samples_convert_ctx == null || samples_channels != getAudioChannels()
  1302 + || samples_format != getSampleFormat() || samples_rate != getSampleRate()) {
  1303 + samples_convert_ctx = swr_alloc_set_opts(samples_convert_ctx,
  1304 + av_get_default_channel_layout(getAudioChannels()), getSampleFormat(), getSampleRate(),
  1305 + av_get_default_channel_layout(audio_c.channels()), audio_c.sample_fmt(), audio_c.sample_rate(),
  1306 + 0, null);
  1307 + if (samples_convert_ctx == null) {
  1308 + throw new Exception("swr_alloc_set_opts() error: Cannot allocate the conversion context.");
  1309 + } else if ((ret = swr_init(samples_convert_ctx)) < 0) {
  1310 + throw new Exception("swr_init() error " + ret + ": Cannot initialize the conversion context.");
  1311 + }
  1312 + samples_channels = getAudioChannels();
  1313 + samples_format = getSampleFormat();
  1314 + samples_rate = getSampleRate();
  1315 + }
  1316 +
  1317 + int sample_size_in = samples_frame.nb_samples();
  1318 + int planes_out = av_sample_fmt_is_planar(samples_format) != 0 ? samples_channels : 1;
  1319 + int sample_size_out = swr_get_out_samples(samples_convert_ctx, sample_size_in);
  1320 + int sample_bytes_out = av_get_bytes_per_sample(samples_format);
  1321 + int buffer_size_out = sample_size_out * sample_bytes_out * (planes_out > 1 ? 1 : samples_channels);
  1322 + if (samples_buf_out == null || samples_buf_out.length != planes_out
  1323 + || samples_ptr_out[0].capacity() < buffer_size_out) {
  1324 + for (int i = 0; samples_ptr_out != null && i < samples_ptr_out.length; i++) {
  1325 + av_free(samples_ptr_out[i].position(0));
  1326 + }
  1327 + samples_ptr_out = new BytePointer[planes_out];
  1328 + samples_buf_out = new Buffer[planes_out];
  1329 +
  1330 + for (int i = 0; i < planes_out; i++) {
  1331 + samples_ptr_out[i] = new BytePointer(av_malloc(buffer_size_out)).capacity(buffer_size_out);
  1332 + ByteBuffer b = samples_ptr_out[i].asBuffer();
  1333 + switch (samples_format) {
  1334 + case AV_SAMPLE_FMT_U8:
  1335 + case AV_SAMPLE_FMT_U8P:
  1336 + samples_buf_out[i] = b;
  1337 + break;
  1338 + case AV_SAMPLE_FMT_S16:
  1339 + case AV_SAMPLE_FMT_S16P:
  1340 + samples_buf_out[i] = b.asShortBuffer();
  1341 + break;
  1342 + case AV_SAMPLE_FMT_S32:
  1343 + case AV_SAMPLE_FMT_S32P:
  1344 + samples_buf_out[i] = b.asIntBuffer();
  1345 + break;
  1346 + case AV_SAMPLE_FMT_FLT:
  1347 + case AV_SAMPLE_FMT_FLTP:
  1348 + samples_buf_out[i] = b.asFloatBuffer();
  1349 + break;
  1350 + case AV_SAMPLE_FMT_DBL:
  1351 + case AV_SAMPLE_FMT_DBLP:
  1352 + samples_buf_out[i] = b.asDoubleBuffer();
  1353 + break;
  1354 + default:
  1355 + assert false;
  1356 + }
  1357 + }
  1358 + }
  1359 + frame.sampleRate = samples_rate;
  1360 + frame.audioChannels = samples_channels;
  1361 + frame.samples = samples_buf_out;
  1362 +
  1363 + if ((ret = swr_convert(samples_convert_ctx, new PointerPointer(samples_ptr_out), sample_size_out,
  1364 + new PointerPointer(samples_ptr), sample_size_in)) < 0) {
  1365 + throw new Exception("swr_convert() error " + ret + ": Cannot convert audio samples.");
  1366 + }
  1367 + for (int i = 0; i < planes_out; i++) {
  1368 + samples_ptr_out[i].position(0).limit(ret * (planes_out > 1 ? 1 : samples_channels));
  1369 + samples_buf_out[i].position(0).limit(ret * (planes_out > 1 ? 1 : samples_channels));
  1370 + }
  1371 + }
  1372 + }
  1373 +
  1374 + public Frame grab() throws Exception {
  1375 + return grabFrame(true, true, true, false);
  1376 + }
  1377 +
  1378 + public Frame grabImage() throws Exception {
  1379 + return grabFrame(false, true, true, false);
  1380 + }
  1381 +
  1382 + public Frame grabSamples() throws Exception {
  1383 + return grabFrame(true, false, true, false);
  1384 + }
  1385 +
  1386 + public Frame grabKeyFrame() throws Exception {
  1387 + return grabFrame(false, true, true, true);
  1388 + }
  1389 +
  1390 + public Frame grabFrame(boolean doAudio, boolean doVideo, boolean doProcessing, boolean keyFrames) throws Exception {
  1391 + if (oc == null || oc.isNull()) {
  1392 + throw new Exception("Could not grab: No AVFormatContext. (Has start() been called?)");
  1393 + } else if ((!doVideo || video_st == null) && (!doAudio || audio_st == null)) {
  1394 + return null;
  1395 + }
  1396 + boolean videoFrameGrabbed = frameGrabbed && frame.image != null;
  1397 + boolean audioFrameGrabbed = frameGrabbed && frame.samples != null;
  1398 + frameGrabbed = false;
  1399 + frame.keyFrame = false;
  1400 + frame.imageWidth = 0;
  1401 + frame.imageHeight = 0;
  1402 + frame.imageDepth = 0;
  1403 + frame.imageChannels = 0;
  1404 + frame.imageStride = 0;
  1405 + frame.image = null;
  1406 + frame.sampleRate = 0;
  1407 + frame.audioChannels = 0;
  1408 + frame.samples = null;
  1409 + frame.opaque = null;
  1410 + if (doVideo && videoFrameGrabbed) {
  1411 + if (doProcessing) {
  1412 + processImage();
  1413 + }
  1414 + frame.keyFrame = picture.key_frame() != 0;
  1415 + return frame;
  1416 + } else if (doAudio && audioFrameGrabbed) {
  1417 + if (doProcessing) {
  1418 + processSamples();
  1419 + }
  1420 + frame.keyFrame = samples_frame.key_frame() != 0;
  1421 + return frame;
  1422 + }
  1423 + boolean done = false;
  1424 + while (!done) {
  1425 + if (pkt2.size() <= 0) {
  1426 + if (av_read_frame(oc, pkt) < 0) {
  1427 + if (doVideo && video_st != null) {
  1428 + // The video codec may have buffered some frames
  1429 + pkt.stream_index(video_st.index());
  1430 + pkt.flags(AV_PKT_FLAG_KEY);
  1431 + pkt.data(null);
  1432 + pkt.size(0);
  1433 + } else {
  1434 + return null;
  1435 + }
  1436 + }
  1437 + }
  1438 +
  1439 + // Is this a packet from the video stream?
  1440 + if (doVideo && video_st != null && pkt.stream_index() == video_st.index()
  1441 + && (!keyFrames || (pkt.flags() & AV_PKT_FLAG_KEY) != 0)) {
  1442 + // Decode video frame
  1443 + int len = avcodec_decode_video2(video_c, picture, got_frame, pkt);
  1444 +
  1445 + // Did we get a video frame?
  1446 + if (len >= 0 && got_frame[0] != 0 && (!keyFrames || picture.pict_type() == AV_PICTURE_TYPE_I)) {
  1447 + long pts = av_frame_get_best_effort_timestamp(picture);
  1448 + AVRational time_base = video_st.time_base();
  1449 + timestamp = 1000000L * pts * time_base.num() / time_base.den();
  1450 + // best guess, AVCodecContext.frame_number = number of decoded frames...
  1451 + frameNumber = (int) Math.round(timestamp * getFrameRate() / 1000000L);
  1452 + frame.image = image_buf;
  1453 + if (doProcessing) {
  1454 + processImage();
  1455 + }
  1456 + done = true;
  1457 + frame.timestamp = timestamp;
  1458 + frame.keyFrame = picture.key_frame() != 0;
  1459 + } else if (pkt.data() == null && pkt.size() == 0) {
  1460 + return null;
  1461 + }
  1462 + } else if (doAudio && audio_st != null && pkt.stream_index() == audio_st.index()) {
  1463 + if (pkt2.size() <= 0) {
  1464 + // HashMap is unacceptably slow on Android
  1465 + // pkt2.put(pkt);
  1466 + BytePointer.memcpy(pkt2, pkt, sizeof_pkt);
  1467 + }
  1468 + av_frame_unref(samples_frame);
  1469 + // Decode audio frame
  1470 + int len = avcodec_decode_audio4(audio_c, samples_frame, got_frame, pkt2);
  1471 + if (len <= 0) {
  1472 + // On error, trash the whole packet
  1473 + pkt2.size(0);
  1474 + } else {
  1475 + pkt2.data(pkt2.data().position(len));
  1476 + pkt2.size(pkt2.size() - len);
  1477 + if (got_frame[0] != 0) {
  1478 + long pts = av_frame_get_best_effort_timestamp(samples_frame);
  1479 + AVRational time_base = audio_st.time_base();
  1480 + timestamp = 1000000L * pts * time_base.num() / time_base.den();
  1481 + frame.samples = samples_buf;
  1482 + /* if a frame has been decoded, output it */
  1483 + if (doProcessing) {
  1484 + processSamples();
  1485 + }
  1486 + done = true;
  1487 + frame.timestamp = timestamp;
  1488 + frame.keyFrame = samples_frame.key_frame() != 0;
  1489 + }
  1490 + }
  1491 + }
  1492 +
  1493 + if (pkt2.size() <= 0) {
  1494 + // Free the packet that was allocated by av_read_frame
  1495 + av_packet_unref(pkt);
  1496 + }
  1497 + }
  1498 + return frame;
  1499 + }
  1500 +
  1501 + public AVPacket grabPacket() throws Exception {
  1502 + if (oc == null || oc.isNull()) {
  1503 + throw new Exception("Could not grab: No AVFormatContext. (Has start() been called?)");
  1504 + }
  1505 +
  1506 + // Return the next frame of a stream.
  1507 + if (av_read_frame(oc, pkt) < 0) {
  1508 + return null;
  1509 + }
  1510 +
  1511 + return pkt;
  1512 + }
  1513 +
  1514 + @Override
  1515 + public void start() throws Exception {
  1516 +
  1517 + }
  1518 +}
  1 +server:
  2 + port: ${CAMERASERVER_SERVER_PORT:8083}
  3 + servlet:
  4 + context-path: /camera
  5 +config:
  6 +#live stream keep-alive time (minutes)
  7 + keepalive: ${CAMERASERVER_KEEPALIVE:1}
  8 +#nginx push host
  9 + push_host: ${CAMERASERVER_PUSH_HOST:127.0.0.1}
  10 +#extra push host
  11 + host_extra: ${CAMERASERVER_HOST_EXTRA:127.0.0.1}
  12 +#nginx push port
  13 + push_port: ${CAMERASERVER_PUSH_PORT:1935}
  14 +#main stream maximum bitrate
  15 + main_code: ${CAMERASERVER_MAIN_CODE:5120}
  16 +#sub stream maximum bitrate
  17 + sub_code: ${CAMERASERVER_SUB_CODE:1024}
  18 +#build version info
  19 + version: '@COMMIT_REV@.@BUILD_DATE@'
  20 +
  21 +#logback
  22 +logging:
  23 + level:
  24 + com.junction: debug
  25 +#write logs to file
  26 + config: classpath:camera-log.xml
  27 +
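Every value in the YAML above uses the Spring-style `${NAME:default}` form: the environment variable wins when set, otherwise the literal after the colon applies. A minimal sketch of that resolution rule, assuming the simple non-nested form used here (the class and method names are hypothetical):

```java
public class PlaceholderDemo {
    // Resolve "${NAME:default}": return the environment variable NAME if it
    // is set, otherwise the default after the colon. Anything that is not a
    // placeholder is returned unchanged. No nesting is supported.
    static String resolve(String value) {
        if (!value.startsWith("${") || !value.endsWith("}")) {
            return value;
        }
        String body = value.substring(2, value.length() - 1);
        int colon = body.indexOf(':');
        String name = colon < 0 ? body : body.substring(0, colon);
        String def = colon < 0 ? null : body.substring(colon + 1);
        String env = System.getenv(name);
        return env != null ? env : def;
    }

    public static void main(String[] args) {
        // With CAMERASERVER_PUSH_PORT unset, the default applies.
        System.out.println(resolve("${CAMERASERVER_PUSH_PORT:1935}"));
    }
}
```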
  1 +<?xml version="1.0" encoding="UTF-8"?>
  2 +<configuration>
  3 +
  4 + <property name="logDir" value="./logs" />
  5 +
  6 + <!-- console appender -->
  7 + <appender name="STDOUT"
  8 + class="ch.qos.logback.core.ConsoleAppender">
  9 + <encoder>
  10 + <pattern>[%d{yyyy-MM-dd HH:mm:ss.SSS}] [%thread] [%-5level] [%logger{50}] : %msg%n</pattern>
  11 + </encoder>
  12 + </appender>
  13 +
  14 + <!-- roll over to a new log file each day -->
  15 + <appender name="info-file"
  16 + class="ch.qos.logback.core.rolling.RollingFileAppender">
  17 + <rollingPolicy
  18 + class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
  19 + <!-- name pattern of the output log file -->
  20 + <FileNamePattern>${logDir}/camera-info-%d{yyyy-MM-dd}.log</FileNamePattern>
  21 + <!-- days of log history to keep -->
  22 + <MaxHistory>30</MaxHistory>
  23 + </rollingPolicy>
  24 + <encoder
  25 + class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
  26 + <!-- output format: %d date, %thread thread name, %-5level level left-padded to 5 chars, %msg log message, %n newline -->
  27 + <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{50} - %msg%n</pattern>
  28 + </encoder>
  29 + <filter class="ch.qos.logback.classic.filter.LevelFilter"><!-- only pass INFO logs -->
  30 + <level>INFO</level>
  31 + <onMatch>ACCEPT</onMatch>
  32 + <onMismatch>DENY</onMismatch>
  33 + </filter>
  34 + <!-- maximum size of a single log file -->
  35 + <triggeringPolicy
  36 + class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
  37 + <MaxFileSize>10MB</MaxFileSize>
  38 + </triggeringPolicy>
  39 + </appender>
  40 +
  41 + <appender name="debug-file"
  42 + class="ch.qos.logback.core.rolling.RollingFileAppender">
  43 + <rollingPolicy
  44 + class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
  45 + <!-- name pattern of the output log file -->
  46 + <FileNamePattern>${logDir}/camera-debug-%d{yyyy-MM-dd}.log</FileNamePattern>
  47 + <!-- days of log history to keep -->
  48 + <MaxHistory>30</MaxHistory>
  49 + </rollingPolicy>
  50 + <encoder
  51 + class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
  52 + <!-- output format: %d date, %thread thread name, %-5level level left-padded to 5 chars, %msg log message, %n newline -->
  53 + <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{50} - %msg%n</pattern>
  54 + </encoder>
  55 + <filter class="ch.qos.logback.classic.filter.LevelFilter"><!-- only pass DEBUG logs -->
  56 + <level>DEBUG</level>
  57 + <onMatch>ACCEPT</onMatch>
  58 + <onMismatch>DENY</onMismatch>
  59 + </filter>
  60 + <!-- maximum size of a single log file -->
  61 + <triggeringPolicy
  62 + class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
  63 + <MaxFileSize>10MB</MaxFileSize>
  64 + </triggeringPolicy>
  65 + </appender>
  66 +
  67 + <appender name="error-file"
  68 + class="ch.qos.logback.core.rolling.RollingFileAppender">
  69 + <rollingPolicy
  70 + class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
  71 + <!-- name pattern of the output log file -->
  72 + <FileNamePattern>${logDir}/camera-error-%d{yyyy-MM-dd}.log</FileNamePattern>
  73 + <!-- days of log history to keep -->
  74 + <MaxHistory>30</MaxHistory>
  75 + </rollingPolicy>
  76 + <encoder
  77 + class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
  78 + <!-- output format: %d date, %thread thread name, %-5level level left-padded to 5 chars, %msg log message, %n newline -->
  79 + <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{50} - %msg%n</pattern>
  80 + </encoder>
  81 + <filter class="ch.qos.logback.classic.filter.LevelFilter"><!-- only pass ERROR logs -->
  82 + <level>ERROR</level>
  83 + <onMatch>ACCEPT</onMatch>
  84 + <onMismatch>DENY</onMismatch>
  85 + </filter>
  86 + <!-- maximum size of a single log file -->
  87 + <triggeringPolicy
  88 + class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
  89 + <MaxFileSize>10MB</MaxFileSize>
  90 + </triggeringPolicy>
  91 + </appender>
  92 +
  93 + <logger name="com.junction" level="debug" />
  94 +
  95 + <root level="info">
  96 + <appender-ref ref="STDOUT" />
  97 + <appender-ref ref="info-file" />
  98 + <appender-ref ref="debug-file" />
  99 + <appender-ref ref="error-file" />
  100 + </root>
  101 +</configuration>
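The config above attaches a LevelFilter to each rolling appender with onMatch=ACCEPT and onMismatch=DENY, so each file receives exactly one level: camera-info only INFO, camera-debug only DEBUG, camera-error only ERROR. A tiny sketch of that exact-match semantics (class and enum names are illustrative, not logback's own types):

```java
public class LevelFilterDemo {
    enum Level { TRACE, DEBUG, INFO, WARN, ERROR }

    // LevelFilter semantics: ACCEPT only an exact level match (onMatch=ACCEPT),
    // DENY everything else (onMismatch=DENY). Contrast with ThresholdFilter,
    // which passes the configured level and above.
    static boolean accept(Level configured, Level event) {
        return configured == event;
    }

    public static void main(String[] args) {
        System.out.println(accept(Level.INFO, Level.INFO)); // true
        System.out.println(accept(Level.INFO, Level.WARN)); // false
    }
}
```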
  1 +<!DOCTYPE html>
  2 +<html lang="en">
  3 +<head>
  4 +<title>Video.js | HTML5 Video Player</title>
  5 +<link href="http://vjs.zencdn.net/5.20.1/video-js.css" rel="stylesheet">
  6 +<script src="http://vjs.zencdn.net/5.20.1/videojs-ie8.min.js"></script>
  7 +</head>
  8 +<body style="width: 640px; height: 360px">
  9 +
  10 + <div style="margin-top: 100px; margin-left: 70px">
  11 + <h1>Live Preview</h1>
  12 + <video id="example_video_1" class="video-js vjs-default-skin" controls
  13 + preload="auto" width="352px" height="198px" data-setup="{}"
  14 + style="float: left">
  15 + <source src="rtmp://mower.kalman-navigation.com:1935/live/1a25e0e7-ca49-4d15-af57-8a858fc8a88a_1" type="rtmp/flv">
  16 + <p class="vjs-no-js">
  17 + To view this video please enable JavaScript, and consider upgrading
  18 + to a web browser that <a
  19 + href="http://videojs.com/html5-video-support/" target="_blank">supports
  20 + HTML5 video</a>
  21 + </p>
  22 + </video>
  23 +
  145 + </div>
  146 + <script src="http://vjs.zencdn.net/5.20.1/video.js"></script>
  147 +</body>
  148 +</html>
  1 +package com.junction;
  2 +
  3 +import org.junit.jupiter.api.Test;
  4 +import org.springframework.boot.test.context.SpringBootTest;
  5 +
  6 +@SpringBootTest
  7 +class CameraServerApplicationTests {
  8 +
  9 + @Test
  10 + void contextLoads() {
  11 + }
  12 +
  13 +}