personal: https://www.mediawiki.org/wiki/User:Brooke_Vibber
official work: https://www.mediawiki.org/wiki/User:Brooke_Vibber_(WMF)
Can we resolve this now?
@ABran-WMF It really ought to have finished by now. ;) Go ahead and kill it, it's idempotent and I can clean up its state later, but I'll want to save the output log to check what went wrong.
I'll see if I can dash out a patch for this; it'll be nice to have during testing.
In T374746#10323980, @CCiufo-WMF wrote: In T374746#10323572, @bvibber wrote: Config change is deployed and we have working cache invalidation on test+test-commons. Ready for sign-off and closing?
To test this, do I just make a change to a Data ns page and wait for it to propagate?
Updated it a bit. May need to include some permission changes for NS_DATA on Commons? Under discussion.
Added a config patch enabling globaljsonlinks. This should be expanded to also enable Charts on t2/commons.
Config change is deployed and we have working cache invalidation on test+test-commons. Ready for sign-off and closing?
@brennen ah yes my old nemesis, getid3's MPEG parser
Agreed, that makes sense. Prelim config patch attached on T379199 -- @Ladsgroup let me know if this looks right to you!
So if I understand correctly, we need two databases on x1, each holding globaljsonlinks, globaljsonlinks_target, and globaljsonlinks_wiki tables: one for the testcommons deployment and one for the actual commons-based deployment.
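To make that layout concrete, here's a rough sketch of how the three tables might relate, using sqlite3 purely for illustration. Every column name here is an assumption on my part, not the actual schema.

```python
# Hypothetical sketch of the three-table layout; all column names are
# assumptions for illustration, not the real GlobalJsonLinks schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Normalized list of client wikis that have at least one JSON link.
    CREATE TABLE globaljsonlinks_wiki (
        gjlw_id   INTEGER PRIMARY KEY,
        gjlw_wiki TEXT UNIQUE NOT NULL        -- e.g. 'enwiki'
    );

    -- Normalized list of link targets (Data: pages on the central repo).
    CREATE TABLE globaljsonlinks_target (
        gjlt_id        INTEGER PRIMARY KEY,
        gjlt_namespace INTEGER NOT NULL,      -- NS_DATA
        gjlt_title     TEXT NOT NULL,
        UNIQUE (gjlt_namespace, gjlt_title)
    );

    -- One row per (client wiki, client page, target) usage.
    CREATE TABLE globaljsonlinks (
        gjl_wiki   INTEGER NOT NULL REFERENCES globaljsonlinks_wiki (gjlw_id),
        gjl_page   INTEGER NOT NULL,          -- page id on the client wiki
        gjl_target INTEGER NOT NULL REFERENCES globaljsonlinks_target (gjlt_id),
        PRIMARY KEY (gjl_wiki, gjl_page, gjl_target)
    );
""")
```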
Let's close it out -- the Chart-side patch is officially part of next task T374746 :D
Rough plan for adding this on JsonConfig:
Due to issues running a large series of short scripts through k8s, I've restarted these on mwmaint2002 using the old-school method. :D
Flat mov MJPEG output is active and the last of the HLS tracks are disabled for generation (fix via T363966).
Will be pushing the new config live today, which should start generating MJPEG+MP3 back-compat tracks in standard QuickTime flavor (replacing the HLS stuff). Support code for MPEG-4 Visual, to provide higher resolution at the same bandwidth, is ready to go once we receive the legal OK.
These went into chart-renderer (T373381) in https://gitlab.wikimedia.org/repos/mediawiki/services/chart-renderer/-/merge_requests/22
If we only track JSON usage centrally and not locally, we should also make sure third-party (non-WMF and standalone) wiki users don't get this (unused) globaljsonlinks table, or any other table created after they installed JsonConfig with the default settings and ran update.php.
I'm leaning towards option number 3. This would have a jsonlinks table like:
Ok, confirmed GlobalUsage is enabled on all non-closed, non-private wikis:
Chart is not the only .tab consumer. See also T153966: Track Commons Dataset usage across wikis (what links here).
Some notes:
Good to know! Still doesn't hurt to cut the load in half. ;)
I'm gonna do some cleanup and retire the .m3u8 soon until I retool some stuff, so that may help with this. :D Assigning to myself for some cleanup during my tech debt time.
Going to tweak the JS side to correctly return the parameter position data so we can replace it in the right location...
Poking at this to simplify the config setup, based on the experience testing the commons/remote support. :D
If no objection I'll take this this sprint :D
Now unable to view a video in the web view of the app, like this one, on iOS 15.8.3 (recently released) on an iPhone 6s Plus. It either keeps loading or fails to show the video. I'm able to view it in a third-party player app, like VLC.
In T368433#10036719, @Yann wrote: Are 1080p transcodes also disabled? https://commons.wikimedia.org/wiki/File:White_Tiger_%281923%29_by_Tod_Browning.webm
hah whoops. easy fix at least :D
Provisional logic https://github.com/wikimedia/wikipedia-ios/pull/4903
Updated task description with results of planning & discussion from last week, resolving as complete for now.
Some local wikis may want to store project-internal data, such as the number of open block appeals per day. They may want to use a page on the local wiki (not Commons), convert it to the Tabular Data data model, and then edit and use it like a Commons Data page (cf. T252711).
I'm starting a batch run on any Commons audio files that never got started before, then will run a (slower) batch run on anything that got attempted but didn't complete. These may take a few days to complete across the full dataset (it's not well optimized to look over only certain files yet), but should hopefully help.
We don't actively need that lint yet for mobile apps work so it's completely safe to disable it. Hopefully that'll clear things up on the database!
I think the minimum we would need is the ability to select only some columns from a larger data set. For example, for a table with ballot measure election results, you'd want to be able to pull out just "County" and "For %" to make a simple bar chart.
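To sketch what that column selection could look like in practice: the dict shape below loosely follows the Commons Data: .tab JSON layout (a schema with named fields, plus data rows), but treat it as illustrative rather than exact.

```python
# Minimal sketch of column selection over a tabular dataset.
# The structure is an approximation of the .tab JSON layout.

dataset = {
    "schema": {"fields": [
        {"name": "County"},
        {"name": "Against %"},
        {"name": "For %"},
    ]},
    "data": [
        ["Alameda", 38.2, 61.8],
        ["Kern", 55.4, 44.6],
    ],
}

def select_columns(tab: dict, wanted: list[str]) -> list[list]:
    """Return rows containing only the requested columns, in order."""
    names = [f["name"] for f in tab["schema"]["fields"]]
    idx = [names.index(w) for w in wanted]
    return [[row[i] for i in idx] for row in tab["data"]]

print(select_columns(dataset, ["County", "For %"]))
# [['Alameda', 61.8], ['Kern', 44.6]]
```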
Secondarily, if we need to be able to query, do we need to be able to subset/filter? The filter options in Vega are JavaScript and thus dangerous, so we want to be very careful and explicit about any filter language we define. It may be simplest to avoid this entirely and leave it to the existing Wikidata query modules.
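Just to illustrate the distinction: a deliberately tiny, declarative filter might look like the sketch below, as opposed to Vega's arbitrary-JavaScript expressions. The operator set and spec shape are invented for the example, not a proposed spec.

```python
# Sketch of a whitelist-based declarative filter: only operators we
# explicitly allow can run, so there's no arbitrary code execution.
import operator

SAFE_OPS = {"eq": operator.eq, "lt": operator.lt, "gt": operator.gt}

def apply_filter(rows: list[list], names: list[str], spec: dict) -> list[list]:
    """spec is e.g. {"field": "For %", "op": "gt", "value": 50}."""
    i = names.index(spec["field"])
    op = SAFE_OPS[spec["op"]]        # KeyError for anything unlisted
    return [row for row in rows if op(row[i], spec["value"])]
```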
Some quick notes catching up:
links obtained
Going to get some links from Chris on past Graphs usage that'll help me in this research :D
Seems to have stabilized:
Hmm, it's down under 4k entries but still high.
Open questions:
(notes for alternate method using mostly client-side logic)
Yeah, I should clean up the labeling so it's clearer. :D
Ok, 1440p and 2160p transcodes are temporarily disabled for now until better fixes, and we did a kill of the old stuck processes. Might still take a bit to shake everything out; I'm trying to flush through all the missing audio.
I'm seriously considering bringing back my "chunked" scheme, which would at least produce smaller, standalone jobs that each encode, say, 10 seconds' worth of video, then reassemble the chunks into a single video at the end. :P The main reason I haven't is that the logic needs to handle missing chunks when individual ones time out or fail, and that sounds like a pain, but it'd be a lot friendlier to the job queue infrastructure. A very rough sketch of what that flow could look like follows.
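For the record, here's a rough sketch of the chunked flow using ffmpeg's concat demuxer for reassembly. The chunk length, paths, and requeue policy are all made up here; this is a shape, not the implementation.

```python
# Encode fixed-length pieces as independent jobs, then stitch them
# back together. Missing chunks block assembly and must be re-queued.
import os
import subprocess

CHUNK_SECONDS = 10

def encode_chunk(src: str, index: int, out: str) -> bool:
    """Encode one chunk; returns False so the caller can requeue failures."""
    cmd = [
        "ffmpeg", "-y",
        "-ss", str(index * CHUNK_SECONDS),   # seek to chunk start
        "-t", str(CHUNK_SECONDS),            # encode one chunk's duration
        "-i", src,
        out,
    ]
    return subprocess.run(cmd).returncode == 0

def assemble(chunks: list[str], out: str) -> None:
    """Concatenate finished chunks into the final video."""
    missing = [c for c in chunks if not os.path.exists(c)]
    if missing:
        raise RuntimeError(f"re-queue before assembly: {missing}")
    with open("list.txt", "w") as f:
        for c in chunks:
            f.write(f"file '{c}'\n")
    subprocess.run(["ffmpeg", "-y", "-f", "concat", "-safe", "0",
                    "-i", "list.txt", "-c", "copy", out], check=True)
```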
We found that timeouts didn't seem to be handled correctly:
Looks like we've got a couple problems with high-res videos:
As designed: these are bare video tracks, to be paired with the mp3 or Opus audio tracks in HLS or MPEG-DASH playback. They are not meant to be played standalone.
The list of "active" (may or may not actually be active) includes a number of 2160p high-res videos hitting since June 21. We've also gotten reports before about certain kinds of AV1 videos slowing down the input handling, which I haven't checked for.
I'm bulk-adding the missing audio transcodes, which should force them to run through as fast as possible between other jobs, and hopefully will handle the prioritized queue split better.
The live system thinks it has 9,223 items queued on Commons, and requeue is throttling there for now... occasionally it goes down an item and moves on.
Batch requeueTranscodes failed on June 22 with this error:
Could be a backfill run, but that shouldn't be interfering with anything... I'll check on it.
Ah, even better. Figured out how to get a .mov with an MP3 audio track working, which means I should be able to ship a corrected, more compatible 144p MJPEG/MP3 QuickTime fallback very soon, replacing the previous version that used 144p MJPEG/MP3 within HLS streaming.
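For reference, one plausible ffmpeg invocation for that kind of fallback track; the exact flags used in production are an assumption on my part.

```python
# Sketch: transcode a source video to a 144p MJPEG/MP3 QuickTime .mov.
import subprocess

def make_fallback_mov(src: str, out: str) -> None:
    subprocess.run([
        "ffmpeg", "-y", "-i", src,
        "-vf", "scale=-2:144",      # scale to 144p, keep aspect ratio
        "-c:v", "mjpeg",            # Motion JPEG video track
        "-c:a", "libmp3lame",       # MP3 audio track
        "-f", "mov",                # QuickTime container
        out,
    ], check=True)
```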
Possible workaround:
Taking this for feasibility spike. :)