[Ffmpeg-devel-irc] ffmpeg.log.20170921
burek
burek021 at gmail.com
Fri Sep 22 03:05:01 EEST 2017
[00:05:50 CEST] <rewman> ffplay over sftp works fine with an MKV file
[00:05:59 CEST] <rewman> but the same file over ftp gives me error
[00:06:06 CEST] <JEEB> you are a pervert, congratulations
[00:06:09 CEST] <rewman> Format matroska,webm detected only with low score of 1, misdetection possible!
[00:06:21 CEST] <rewman> EBML header parsing failed
[00:06:28 CEST] <rewman> why is there a difference?
[00:32:42 CEST] <sikilikis> Hey all. I have the following in my ffmpeg command: "-loop 1 -framerate $FPS -i $IMGS" where $IMG will be like "/path/to/image%03d.tiff"
[00:33:05 CEST] <sikilikis> this works just fine but I'm wondering if theres a way I can change that input on the fly, without having to start up another instance of ffmpeg
[00:33:21 CEST] <sikilikis> I am using ffmpeg to stream 24/7 to youtube
[00:41:34 CEST] <JEEB> sikilikis: not with ffmpeg.c, as it is rather static regarding inputs and outputs
[00:41:46 CEST] <JEEB> the libraries let you create something like that though
[00:43:34 CEST] <sikilikis> I'm not really feeling up to creating a "new" ffmpeg =/ but thanks for the info
[00:45:31 CEST] <sikilikis> well heres an unrelated question. That command is being used to render a "slideshow" I suppose. Is there a way to set the "visual" fps, separate from the actual fps? if that makes sense?
[00:46:03 CEST] <sikilikis> for example, if I only have two individual frames and I play that at 30 FPS, it will alternate between those frames at 30 fps so it will look like a seizure
[00:46:31 CEST] <sikilikis> instead maybe I want to do something like "alternate frames every 10 seconds" while still having the actual video render at 30 fps
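One way to get that effect with ffmpeg.c alone is to slow the *input* rate and force the *output* rate: the image2 demuxer reads a new picture every 10 seconds, while `-r 30` duplicates frames to keep the encoded stream at 30 fps. A sketch, with the paths and the RTMP target as placeholders:

```shell
# Each image is held for 10 s (input at 1/10 fps); -r 30 duplicates
# frames so the delivered stream still runs at a steady 30 fps.
ffmpeg -loop 1 -framerate 1/10 -i /path/to/image%03d.tiff \
       -r 30 -c:v libx264 -pix_fmt yuv420p \
       -f flv rtmp://example.invalid/live/streamkey
```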
[00:56:02 CEST] <dingwat> Is there a way to halt an encoding process in a way that leaves the video file playable? As in, I'm halfway through encoding a series of images into a video, but is there a way to stop it there but still have a "finished" video file?
[00:57:07 CEST] <c_14> sigint or q
[00:57:14 CEST] <c_14> should make ffmpeg quit gracefully
[00:57:42 CEST] <dingwat> Thanks, I'll try that sometime!
[00:59:44 CEST] <dingwat> Also, interestingly, this encoding process has been super stable at 10.2GB of RAM. CPU and network bandwidth are jumping all over the place, but the memory consumption is very consistent
[01:00:56 CEST] <dingwat> And, good news, I'm almost halfway done. It's been less than 2hrs, so I guess that's relatively quick
[01:07:55 CEST] <JEEB> dingwat: yes these things try to keep the number of buffers as static as possible and they try to allocate the anticipated ones in the beginning
[01:13:46 CEST] <Johnjay> >10.2GB of RAM
[01:13:56 CEST] <Johnjay> what in the world are you encoding that takes that much space
[01:14:09 CEST] <JEEB> eight thousand and something of width
[01:14:23 CEST] <JEEB> so yes the buffers will take space
[01:15:18 CEST] <dingwat> JEEB: ah gotcha that makes sense
[01:21:29 CEST] <Johnjay> i should probably know more about TVs than I do
[01:21:38 CEST] <Johnjay> But I thought 4k res was the big thing now, not 8k.
[01:22:21 CEST] <redrabbit> tv still uses mp2 / mpeg2 on most sat. feeds
[01:22:25 CEST] <redrabbit> gross
[01:22:44 CEST] <redrabbit> also, avc/mp2
[01:22:58 CEST] <redrabbit> avc/aac if lucky
[01:24:40 CEST] <dingwat> Johnjay: Japan has stated that the 2020 Olympics will be in 8K. I'm just playing around though, I have some 8K source material and I'd like to see how Youtube handles it
[01:25:14 CEST] <Johnjay> i've seen a 4k option on youtube i think
[01:25:27 CEST] <Johnjay> but idk if it would really take an 8k upload. but then i don't use youtube much
[01:25:30 CEST] <redrabbit> yep
[01:26:00 CEST] <Johnjay> youtube uses ffmpeg to encode and decode, correct?
[01:26:12 CEST] <dingwat> There is 8K content on Youtube. Which is actually from a 6K camera. I have rendered content though, not from a camera
[01:26:14 CEST] <redrabbit> it uses gross stuff
[01:26:49 CEST] <Johnjay> i remember being surprised to hear that. i would have thought google would buy up and take over anything it depended on
[01:27:13 CEST] <Johnjay> Like fork ffmpeg and rebrand it as Googpeg or something, lol
[01:27:57 CEST] <redrabbit> sounds like a brand of vacuum sealed vomit
[01:28:06 CEST] <dingwat> Welp. I now have an 8K video, supposedly, but neither of my computers can play it.
[01:28:06 CEST] <blap> http://img.pr0gramm.com/2017/09/21/b91ade6e19139a4f.png My latest unicode IRC troll pic :P
[01:28:24 CEST] <Johnjay> i'm sure google would call it something trendy
[01:28:35 CEST] <Johnjay> But yeah the microsoft strategy. Embrace, extend, extinguish
[01:32:02 CEST] <JEEB> dingwat: yea I'm doing my 2160p60 content the same way. game capture
[01:32:25 CEST] <JEEB> thankfully some engines let you render at some random speed
[01:32:31 CEST] <JEEB> and not miss a frame
[01:32:45 CEST] <JEEB> (as in, your actual game doesn't have to run realtime)
[01:48:07 CEST] <Johnjay> alright I have a 4:04:50 length file
[01:48:20 CEST] <Johnjay> I told -segment_times to do 3600,3600,3600,4000
[01:48:24 CEST] <Johnjay> time to see if that works
[02:56:32 CEST] <Johnjay> alright it didn't work
[02:56:39 CEST] <Johnjay> falling back on the command on the super user page
[02:56:46 CEST] <Johnjay> https://superuser.com/questions/820747/slicing-video-file-into-several-segments
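A likely reason the first attempt failed: `-segment_times` takes a list of cumulative split points, not per-segment durations, so hour-long pieces of a 4:04:50 file are cut at 3600, 7200 and 10800 (the remainder becomes the last segment). A sketch, with an illustrative input name:

```shell
# Split points are absolute timestamps; -c copy avoids re-encoding.
ffmpeg -i input.mp3 -f segment -segment_times 3600,7200,10800 -c copy out%02d.mp3
```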
[03:30:45 CEST] <Djfe> Hi! :) I'm downloading an hls stream atm.
[03:31:09 CEST] <Djfe> and I'm getting red warnings/errors from the tls part of ffmpeg:
[03:31:28 CEST] <Djfe> Unable to read from socket
[03:31:37 CEST] <Djfe> Failed to send close message
[03:32:12 CEST] <Djfe> are these actual errors or can they be ignored like warnings since tls ensures stream integrity?
[03:59:41 CEST] <Cracki_> I know that guy
[04:02:43 CEST] <Djfe> https://pastebin.com/GsAz204m
[04:08:18 CEST] <Djfe> I'm using ffmpeg over a vpn connection which isn't completely stable. it seems to cause the tls connection to fail every now and then.
[04:09:07 CEST] <Djfe> are those messages supposed to be forwarded from the tls error output to the ffmpeg console
[04:46:20 CEST] <blap> i get those too so I use a client that is able to restart
[04:54:41 CEST] <echelon> hi, the mkv file i recorded from my webcam doesn't have any playback option
[04:55:02 CEST] <echelon> i guess there's no index?
[04:55:08 CEST] <echelon> is there a way to fix it?
[05:08:37 CEST] <Djfe> what do you mean by playback option?
[05:09:17 CEST] <echelon> i can't skip ahead or backward
[05:09:22 CEST] <echelon> anyway, i figured it out
[05:09:46 CEST] <echelon> just passed it through copy
[05:10:19 CEST] <echelon> anyway, i used -live 1 flag when i was recording and it didn't let me skip forward/back then either
[05:16:50 CEST] <Djfe> +1
[05:32:20 CEST] <echelon> oh
[05:39:25 CEST] <Djfe> "oh"?
[05:40:16 CEST] <Djfe> I meant by that good that you got your problem solved ;)
[05:40:33 CEST] <Djfe> (or do you another issue?)
[05:48:13 CEST] <Djfe> *have
[05:51:49 CEST] <Djfe> bye
[05:51:51 CEST] <kepstin> echelon: yes, using the '-live 1' option will cause ffmpeg to not write an index
[05:52:07 CEST] <kepstin> echelon: if you're saving to disk, then don't use that option.
[05:52:26 CEST] <kepstin> (without that, ffmpeg will go back and add the index when it's done)
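echelon's earlier "passed it through copy" workaround succeeds for the same reason: a stream-copy remux lets the matroska muxer finalize the file and write its seeking information. Filenames here are placeholders:

```shell
# No re-encode; the muxer adds the seek index when it closes the output.
ffmpeg -i recorded.mkv -c copy seekable.mkv
```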
[06:09:45 CEST] <echelon> kepstin: but i'm not able to go forward/backward even when i'm trying to read from the file while it's recording with the option
[06:09:52 CEST] <echelon> am i supposed to stream it?
[09:57:18 CEST] <Gaulois94> Hello
[10:04:44 CEST] <Gaulois94> For avio_alloc_context, what is the "whence" parameter ?
[10:08:08 CEST] <klaxa> Gaulois94: from what i can tell it's the same as for fseek (see man fseek)
[10:08:26 CEST] <klaxa> it specifies from where the seek should be done (beginning, current position, end)
[10:08:47 CEST] <Gaulois94> Ok, just to be sure
[10:09:44 CEST] <Gaulois94> And for a std::istream it should return the value of tellg, right ?
[10:10:49 CEST] <klaxa> no idea what istream is, but ftell() on FILE* structs should return the current position of an opened file
[10:21:51 CEST] <chuckleplant> Hi (topic from yesterday), I'm using Live555 to receive an H264 stream from a camera. I am using libavcodec to decode the stream, and later OpenGL to render. All of that is working. However, as soon as I get a frame, I render it. This causes some image stuttering, as I'm not using timestamp info... Now, I'm trying to retrieve the presentation timestamp (PTS) from the decoded AVFrame, but this value is not set.
[10:21:56 CEST] <chuckleplant> From Live555 h264 parsing, I only get NAL unit PTS, which I feed to AVPacket. I also do not have a proper AVCodecCtx->time_base, neither the camera provides the time_base (time_scale) info in the SPS/PPS (it is optional according to the standard)
[10:22:02 CEST] <chuckleplant> How could I, in this scenario, account for presentation timestamps? I'm sure there's something else I'm not using or misusing. But can't really spot it.
[13:24:55 CEST] <Gaulois94> Hi, do you guys have a good tutorial to how streaming screenshots using FFMPEG and RTP protocol ?
[13:45:19 CEST] <dingbat> Welp, my 8K video upload to youtube succeeded. Eventually. Plays like shit, but meh
[13:47:35 CEST] <furq> 65mbit huh
[13:47:40 CEST] <furq> i'm not surprised it plays like shit
[13:48:27 CEST] <furq> wait what
[13:48:43 CEST] <furq> 8k avc is 65mbit but 8k vp9 is 15mbit
[13:48:49 CEST] <furq> i don't even want to consider how they've worked that out
[13:55:55 CEST] <ritsuka> It would be nice to know how long it took to convert the 8k video to vp9 :|
[14:10:37 CEST] <bidiko> Hi, How can i use FFmpeg in my IOS project for RTSP connection ? I have one sample project but I'm getting memory leak error
[14:14:11 CEST] <durandal_1707> cant guess
[14:18:56 CEST] <bidiko> actually I'm using FFmpeg library in my project and I can connect RTSP camera. But I'm getting memory leak error and I need help how can i upgrade my FFmpeg library
[14:26:10 CEST] <SavinaRoja> does anyone have experience with FFMPeg and mpeg-dash production?
[14:28:19 CEST] <JEEB> SavinaRoja: the DASH muxer/mpd writer seems to have worked last I tried
[14:29:05 CEST] <SavinaRoja> I've been working on this project for days and never heard of that... documentation link?
[14:30:08 CEST] <stevenliu> ffmpeg -h muxer=dash
[14:30:22 CEST] <SavinaRoja> JEEB: I'm getting apparently valid MPD files but validation fails on parsing of the mp4 files
[14:30:55 CEST] <JEEB> use l-smash's boxdumper to check the structure of the fragments
[14:31:04 CEST] <JEEB> `boxdumper --box file.m4s | less`
[14:31:05 CEST] <JEEB> for example
[14:31:28 CEST] <JEEB> (or you can dump the output to a text file)
[14:31:52 CEST] <SavinaRoja> thanks, I'll give this a go
[15:21:18 CEST] <SavinaRoja> so I tried out the dash muxer on some test input https://pastebin.com/u8rA0Ri7
[15:22:42 CEST] <SavinaRoja> and I uploaded it to a vps for testing and playback is still an issue
[15:28:31 CEST] <SavinaRoja> http://dashif.org/conformance.html appears to choke on a plain IP URL
[15:38:01 CEST] <fahadash> I am trying to cut audio of a portion of a video, is this command correct? .\ffmpeg.exe -i ".\video.mp4" -ss 15 -t 7 -vn -c copy audio.mp3
[15:38:56 CEST] <fahadash> I cannot see the output, I am on windows 10, the ffmpeg.exe crashes upon run when run under cmd.exe, I have to run it under powershell which launches a separate console window and the output scrolls fast and the window closes.
[15:41:43 CEST] <SavinaRoja> fahadash: the command options look fine, are you sure that .\ffmpeg.exe is the correct path?
[15:42:45 CEST] <SavinaRoja> "where.exe ffmpeg" should tell you if it can be found by just "ffmpeg"
[15:57:27 CEST] <fahadash> I got it. Thanks
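One caveat the thread glossed over: `-c copy` keeps the source codec, and MP4 audio is usually AAC, which the mp3 muxer will reject. If that error appears, re-encode the audio instead (same filenames as the question):

```shell
# Cut 7 s starting at 15 s; -vn drops the video, libmp3lame makes real MP3.
ffmpeg -ss 15 -t 7 -i video.mp4 -vn -c:a libmp3lame -q:a 2 audio.mp3
```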
[16:17:10 CEST] <SavinaRoja> does the ffmpeg dash muxer provide support for multiple representations?
[18:08:59 CEST] <Johnjay> is there a way to divide a stream into N equal chunks?
[18:09:52 CEST] <tdr> playable chunks or simply smaller ones (for transfering etc) ?
[18:10:09 CEST] <Johnjay> er playable. it's nbd if not
[18:10:54 CEST] <Johnjay> if I just say an hour or 45 min or 30 min I end up with an extra chunk of 5-10 min and can't remember the concat syntax
[18:11:41 CEST] <Johnjay> actually this problem would be solved if I could insert audio markers at the start of each chunk
[18:11:56 CEST] <Johnjay> like 2 beeps for part 2, 3 beeps for part 3, etc
[18:12:04 CEST] <Johnjay> maybe i could write a script for that
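Such a script mostly needs the total duration. A sketch that splits a file into N equal, playable chunks with the segment muxer; the 4:04:50 (= 14690 s) duration is hard-coded for illustration, but `ffprobe -show_entries format=duration` can supply it:

```shell
DUR=14690   # total length in seconds (4:04:50)
N=4
# -segment_times wants cumulative split points, hence i*DUR/N for i = 1..N-1:
TIMES=$(awk -v d="$DUR" -v n="$N" 'BEGIN{for(i=1;i<n;i++)printf "%s%d",(i>1?",":""),i*d/n}')
echo "$TIMES"   # 3672,7345,11017
# ffmpeg -i input.mp3 -f segment -segment_times "$TIMES" -c copy part%d.mp3
```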
[18:15:33 CEST] <furq> Johnjay: segment muxer?
[18:17:22 CEST] <Johnjay> I don't see anything in the doc about it
[18:17:49 CEST] <Johnjay> although now that I look there is an option called segment_start_number I could use to start numbering at 1. lol
[18:18:23 CEST] <Johnjay> My use case is a sports mp3 player with no interface except next, prev, and play/pause
[18:18:59 CEST] <furq> !muxer segment
[18:18:59 CEST] <nfobot> furq: http://ffmpeg.org/ffmpeg-formats.html#segment_002c-stream_005fsegment_002c-ssegment
[18:20:35 CEST] <Johnjay> yeah that
[18:23:01 CEST] <nohop> hey guys. I'm already using ffmpeg in my project to encode and write video streams to disk. Now, I need to create a web interface with video, so I need raw video buffers before they're written to file
[18:23:26 CEST] <nohop> what would be the best way to have ffmpeg give me a callback and give me the buffer, instead of writing it to file ?
[18:30:49 CEST] <Mavrik> pipe it as a second output I guess>
[18:30:51 CEST] <Mavrik> ?
[18:31:00 CEST] <Mavrik> Named pipes or what are they called on unix
[18:32:11 CEST] <BtbN> raw video frames to show them on a web interface sounds like a bad idea
[18:32:33 CEST] <nohop> Mavrik: yeah... but that seems hacky as fuck :)
[18:32:43 CEST] <nohop> BtbN: is it ? hmm :)
[18:33:19 CEST] <BtbN> They are huge and I don't know any sane format to display raw frames in a browser
[18:33:41 CEST] <BtbN> If you want a screenshot, png/jpeg. Otherwise just send video there.
[18:33:59 CEST] <nohop> 'just send video there' is what i want
[18:34:18 CEST] <nohop> problem is, my video is a ton of uncompressed pixels at a high framerate
[18:34:35 CEST] <Mavrik> hmm
[18:34:42 CEST] <Mavrik> how about you setup a streaming server and tell it to record
[18:34:48 CEST] <Mavrik> and then just stream encoded video to it?
[18:35:38 CEST] <nohop> my application will be the server. The encoding and sending out encoded video is exactly what i want to do :)
[19:01:36 CEST] <saml> how do I set up render farm?
[19:01:43 CEST] <saml> i want something yolo fast
[19:03:58 CEST] <saml> like distributed ffmpeg cluster to encode videos real fast
[19:26:41 CEST] <nohop> BtbN: So, what would you recommend using to compress raw pixel data to send over http and show on a webpage ?
[19:37:10 CEST] <kepstin> nohop: ffmpeg with the hls or dash muxer and nginx (or ffmpeg with the rtmp muxer sending to the nginx-rtmp module, which can generate hls)
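kepstin's first option might look like this, assuming H.264/AAC output and a directory the web server already exposes:

```shell
# The hls muxer cuts ~4 s segments, maintains the playlist, and deletes
# old segments; any static file server can then serve the directory.
ffmpeg -i input -c:v libx264 -c:a aac -f hls -hls_time 4 \
       -hls_list_size 6 -hls_flags delete_segments /var/www/stream/live.m3u8
```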
[19:37:39 CEST] <nohop> hmmm
[19:38:01 CEST] <nohop> so i can't just get a buffer with its output ? We're using our own webserver. No nginx
[19:46:18 CEST] <kepstin> if you really want that, you should be able to make a custom AVIOContext and have nginx write to that
[19:46:53 CEST] <nohop> hmmm
[19:46:58 CEST] <nohop> i think there's a misunderstanding
[19:47:04 CEST] <kepstin> have ffmpeg write to that i mean
[19:47:07 CEST] <kepstin> sorry, typo
[19:47:07 CEST] <nohop> oh
[19:47:17 CEST] <nohop> i see, sorry :)
[19:47:23 CEST] <nohop> okay, i'll look into that then
[19:47:47 CEST] <kepstin> but that's gonna be tricky to deal with, since then you're responsible for getting the video to the browser... somehow, and browsers are very picky about how they accept video
[19:48:04 CEST] <kepstin> so you're gonna have to do all the segmentation yourself then, which is of course hard.
[19:48:31 CEST] <kepstin> (I don't *think* there's a way to use the ffmpeg segment muxer with a custom AVIOContext? could be wrong)
[19:48:48 CEST] <nohop> Yeah, I see what you mean
[19:49:29 CEST] <nohop> i might be better off using a horrible mjpeg stream :)
[19:54:06 CEST] <furq> well if you want to use rawvideo (or mjpeg) then you'll need to have some custom handling for it anyway
[19:54:17 CEST] <furq> in the browser, i mean
[19:54:18 CEST] <furq> so it doesn't matter that much how you deliver it
[19:54:39 CEST] <furq> rawvideo isn't really going to work over the web though unless it's very low resolution
[19:55:51 CEST] <furq> if you actually want to use the browser's builtin video player, then you're pretty much stuck with hls or dash
[19:56:42 CEST] <furq> and those are just mp4/mpegts/webm fragments served over http, so your webserver doesn't have to do anything other than serve files
[19:57:13 CEST] <nohop> except i don't have files :)
[19:59:52 CEST] <nohop> is this something ffmpeg could help me with, or am i better off getting some library aimed at doing this ? (as in, raw frames in -> servable bytestream out)
[20:11:58 CEST] <JEEB> nohop: FFmpeg's libraries are just for that in my opinion
[20:12:47 CEST] <JEEB> you make an AVFrame out of the raw samples, then feed that to a colorspace conversion filter if you need to switch between YCbCr and RGB one way or the other, then feed that result to an encoder
[20:13:20 CEST] <JEEB> then you will get stuff out of the encoder (AVPackets) which you can then feed to the multiplexer that writes into some protocol
[20:26:51 CEST] <kepstin> nohop: well, you could have files; just have ffmpeg's dash or hls muxer write files to a temp directory, then serve them up :)
[20:27:22 CEST] <kepstin> nohop: the main problem is that for use in a web browser, there's no single servable bytestream you can use
[20:28:39 CEST] <kepstin> I suppose you could maybe do something clever with sending stuff in a websocket connection then using mediasource extensions in javascript in the browser
[20:29:30 CEST] <Mavrik> This is usually done by a streaming server instead of ffmpeg itself.
[20:29:36 CEST] <Mavrik> But apparently he's doing something funny.
[20:30:20 CEST] <kepstin> yeah, this is very much in "I want to write my own streaming server from scratch" territory, and that is of course very hard work :)
[20:30:51 CEST] <Mavrik> I'd just deploy nginx-rtmp and let that handle it :P
[20:39:28 CEST] <nohop> i know, i'm always doing things the hard and stupid way :)
[20:39:54 CEST] <nohop> But i'll try to convince my colleagues that we need nginx :)
[20:40:19 CEST] <furq> nohop: ffmpeg itself will write hls/dash fragments and the associated playlist
[20:40:38 CEST] <furq> the libs will too but you don't really need it, you can just pipe rawvideo into ffmpeg
[20:41:22 CEST] <nohop> yeah, but we're using the libs already. For now it just only outputs to files
[20:41:31 CEST] <nohop> but they want a web interface as well
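furq's rawvideo pipe, sketched; the pixel format, geometry and frame rate here are assumptions about nohop's frames:

```shell
# Raw RGB frames arrive on stdin; ffmpeg encodes and segments for HLS.
frame-source | ffmpeg -f rawvideo -pixel_format rgb24 -video_size 1280x720 \
    -framerate 30 -i - -c:v libx264 -preset veryfast -g 60 \
    -f hls -hls_time 4 /var/www/stream/out.m3u8
```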
[20:42:31 CEST] <Johnjay> nohop: I looked everywhere for a rtmp compatible version of nginx
[20:42:43 CEST] <Johnjay> turned out you have to download version 7.11.3 or some thing
[20:42:54 CEST] <Johnjay> because obviously 7.11.2 just wasn't good enough for that
[20:42:57 CEST] <Mavrik> There's a free plugin with a slightly misleading name " nginx-rtmp"
[20:43:09 CEST] <Mavrik> Which also supports HLS and DASH (hence the misleading name)
[20:43:19 CEST] <Mavrik> You can feed it directly from ffmpeg and it'll do segmentation and playlist stuff
[20:43:35 CEST] <Mavrik> So basically [source] --> [ffmpeg] --> [nginx] ====> clients
[20:43:51 CEST] <Mavrik> It's not the best streaming server I've seen, but it's plenty decent for a free one :)
[20:46:50 CEST] <furq> and it's also very easy to set up
[20:46:59 CEST] <furq> which isn't something i can say for any other free streaming server i've ever tried
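The [source] --> [ffmpeg] --> [nginx] leg above is a single command once nginx-rtmp is listening; the "live" application and stream name depend on the nginx.conf in use:

```shell
# -re paces a file like a live source; drop it for a real capture input.
ffmpeg -re -i input -c:v libx264 -c:a aac -f flv rtmp://localhost/live/stream
```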
[20:47:57 CEST] <RonaldsMazitis> hello
[20:48:01 CEST] <RonaldsMazitis> I am trying javascript version of ffmpeg
[20:48:03 CEST] <RonaldsMazitis> and it does not find files in current directory
[20:48:08 CEST] <RonaldsMazitis> how to find which directory my ffmpeg is in
[20:49:29 CEST] <RonaldsMazitis> http://skatetube.sytes.net/VIDEOEDITOR/demo/
[20:49:39 CEST] <RonaldsMazitis> this is my javascript ffmpeg
[20:50:01 CEST] <RonaldsMazitis> I need to know why it does not read files from current directory
[20:50:51 CEST] <RonaldsMazitis> anyone?
[21:02:11 CEST] <ChocolateArmpits> RonaldsMazitis, are you sure ffmpeg should be handling file finding?
[21:04:00 CEST] <RonaldsMazitis> it should work on current directory
[21:05:07 CEST] <RonaldsMazitis> http://bgrins.github.io/videoconverter.js/demo/
[21:05:12 CEST] <RonaldsMazitis> this is original
[21:05:29 CEST] <RonaldsMazitis> "Also imagine that you have files called input.webm and input.jpeg in the current directory. "
[21:07:10 CEST] <rjp421> i recently updated my rpi2 packages+kernel. when i tried to run ffmpeg, i got a missing .so (from libwebp.so.2 to .3 update)..... so i 'git pull'ed, and 'make uninstall ; make distclean ; make clean'... 'make -j5 && make install'.. i also rebuilt the /opt/vc/hello_pi with raspivid... but piping raspivid to ffmpeg suddenly gives Illegal Instruction https://pastebin.com/raw/hWRK9dVC
[21:08:07 CEST] <rjp421> * piped to ffmpeg/ffprobe
[21:10:28 CEST] <rjp421> this worked fine before the updates... how do i get further detail on the crash? even with loglevel debug, its not clear
[21:12:12 CEST] <rjp421> i would use the h264_omx, but i need to v/hflip the video
[21:12:25 CEST] <JEEB> make sure you build with debug symbols (aka, --disable-strip I think?)
[21:12:34 CEST] <rjp421> afaik i couldnt do both at once?
[21:12:48 CEST] <rjp421> JEEB, ok ty
[21:12:49 CEST] <JEEB> and then build your own thing with -g3 -ggdb
[21:12:57 CEST] <JEEB> (the hello_pi thing)
[21:13:28 CEST] <JEEB> and then install gdb and follow rms's gdb tutorial http://www.unknownroad.com/rtfm/gdbtut/gdbuse.html#RUN
[21:13:32 CEST] <JEEB> welcome to debugging on *nix .)
[21:13:33 CEST] <JEEB> :)
[21:14:05 CEST] <rjp421> ty, i will try. i just use their included "rebuild.sh" for the hello_pi (with raspivid)
[21:14:25 CEST] <JEEB> make sure it has debug symbols enabled (-g3 -ggdb is what I usually use for compiler flags)
[21:14:33 CEST] <JEEB> otherwise you will lose all visibility
[21:14:49 CEST] <JEEB> basically, after `gdb program_name` it (gdb) should tell you that it was able to load symbols
[21:14:58 CEST] <JEEB> if it can't find symbols, you're in trouble
[21:14:59 CEST] <JEEB> :P
[21:15:26 CEST] <JEEB> without symbols backtraces are rather... useless and uninformative
[21:15:36 CEST] <JEEB> at most you find out the rough function something crashed in
[21:15:42 CEST] <JEEB> if even that
[21:16:55 CEST] <RonaldsMazitis> so why my javascript ffmpeg doesn't know current directory
[21:19:35 CEST] <BtbN> Because it's running in a Browser and is isolated from everything, as it should be.
[21:19:37 CEST] <rjp421> JEEB, "Reading symbols from /opt/vc/bin/raspivid...done." look good? should i still rebuild it?
[21:19:48 CEST] <JEEB> that sounds positive
[21:19:59 CEST] <JEEB> now if it finds the symbols from FFmpeg libraries as well
[21:20:06 CEST] <JEEB> since by default I think FFmpeg strips
[21:20:12 CEST] <JEEB> when it installs
[21:24:12 CEST] <rjp421> ok, rebuilding with --disable-stripping
[21:25:21 CEST] <JEEB> or it was --disable-strip
[21:25:28 CEST] <JEEB> check if you get a warning when you try to configure
[21:25:38 CEST] <JEEB> (--help |grep "strip" should show it)
[21:29:37 CEST] <Mavrik> yeah, disable-strip and --enable-debug=3 helps
[22:01:48 CEST] <rjp421> still compiling
[22:13:50 CEST] <rjp421> now installing
[22:15:09 CEST] <JEEB> rjp421: that's why I generally cross-compile and just rsync/copy/whatever to the ARM thing
[22:15:12 CEST] <JEEB> because ARM things are *slow*
[22:17:11 CEST] <rjp421> JEEB, how (should) to i get a coredump/backtrace? still just exiting with Illegal Instruction
[22:17:27 CEST] <JEEB> when it crashes under gdb you should be able to get a backtrace
[22:17:38 CEST] <rjp421> ah
[22:17:48 CEST] <JEEB> see the tutorial I noted
[22:21:22 CEST] <rjp421> JEEB, do i run gdb on raspivid or ffmpeg? im piping the raspivid to ffmpeg, which is when ffmpeg crashes. looking at that tutorial, im not sure which
[22:22:01 CEST] <rjp421> gdb ffmpeg, then run <cmd>?
[22:22:06 CEST] <JEEB> yes
[22:22:09 CEST] <rjp421> ty
[22:30:52 CEST] <rjp421> JEEB, doesnt look right https://pastebin.com/raw/2zycD73c ill try googling but not sure how to word the search
[22:31:40 CEST] <JEEB> yea, you can only run a single thing in gdb
[22:32:52 CEST] <rjp421> maybe raspivid into a fifo, and use as input to ffmpeg?
[22:33:27 CEST] <JEEB> or https://stackoverflow.com/questions/455544/how-to-load-program-reading-stdin-and-taking-parameters-in-gdb?noredirect=1&lq=1#comment60937664_7473496
[22:33:34 CEST] <rjp421> ty
[22:34:01 CEST] <JEEB> the fact that you can start running the program with -ex 'r parameters for thing'
[22:34:14 CEST] <JEEB> and then the program at the end
[22:47:50 CEST] <rjp421> -ex 'set args -loglevel debug -i - < raspivid ..... -o -' ? that the same as piping?
[22:54:33 CEST] <rjp421> its not working :\
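The fifo idea from a few lines up does sidestep the stdin clash, since gdb keeps the terminal while ffmpeg reads the camera data from the named pipe; paths and raspivid flags are illustrative:

```shell
mkfifo /tmp/cam.h264
raspivid -t 0 -o /tmp/cam.h264 &
# gdb owns the terminal's stdin; 'run' starts ffmpeg with the args after --args.
gdb -ex run --args ffmpeg -loglevel debug -i /tmp/cam.h264 -f null -
```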
[23:17:10 CEST] <rjp421> JEEB, i wasnt able to figure out a working syntax, but i did get a coredump
[23:20:48 CEST] <rjp421> Program terminated with signal SIGILL, Illegal instruction.
[23:20:48 CEST] <rjp421> #0 ff_h264_idct_dc_add_neon () at libavcodec/arm/h264idct_neon.S:77
[23:20:48 CEST] <rjp421> 77 vld1.16 {d2[],d3[]}, [r1,:16]
[23:21:29 CEST] <rjp421> JEEB, ^ sorry for flood
[23:26:38 CEST] <rjp421> https://pastebin.com/raw/APRbBSgw lmk if i should paste 'bt full'
[23:29:22 CEST] <rjp421> looks like something to do with decoding the h264 from the board?
[23:33:31 CEST] <furq> rjp421: did you run rpi-update recently
[23:33:41 CEST] <furq> it'll update the vc libs and headers
[23:34:02 CEST] <furq> if you're running ffmpeg from git then you probably want the latest vc stuff
[23:34:42 CEST] <furq> i've had similar problems in the past where i've had segfaults from running old vc libs even though it still built against them
[23:35:26 CEST] <rjp421> furq, im using kali which doesnt have rpi-update... but i think still an old /opt/vc/.. i will get the latest src and rebuild ty
[23:35:47 CEST] <furq> also yeah you probably want to cross compile ffmpeg if you have another *nix box
[23:35:56 CEST] <furq> it's very easy on a recent-ish debian or ubuntu
[23:52:56 CEST] <rjp421> git cloning the rpi-firmware is taking a long time... i just need the opt/vc folder
[23:53:19 CEST] <Johnjay> rjp421: I have that folder. it's got neat stuff in it
[23:53:26 CEST] <Johnjay> that's where the temp monitoring command is iirc
[23:56:06 CEST] <rjp421> Johnjay, if you happen to have a pi-cam, and updated packages+kernel+ffmpeg-git, would you mind please piping raspivid to ffmpeg/ffprobe and see if you get the same crash as me?
[23:57:21 CEST] <rjp421> im using kali which is not standard debian, maybe my setup
[23:57:24 CEST] <Johnjay> ah cam is the one thing i didn't play around with on there
[23:58:12 CEST] <Johnjay> i didn't even know raspivid was the command for camera until you said it
[23:58:12 CEST] <rjp421> np :) if anyone else can try though it would be appreciated
[23:58:15 CEST] <Johnjay> a problem i often run into on linux
[23:58:49 CEST] <Johnjay> may I ask did you get mpv to work on it? or just ffmpeg?
[23:59:22 CEST] <rjp421> Johnjay, its headless atm, no gui, so no mpv
[23:59:33 CEST] <Johnjay> oh
[00:00:00 CEST] --- Fri Sep 22 2017
More information about the Ffmpeg-devel-irc
mailing list