User Details
- User Since
- Dec 9 2015, 5:27 AM (466 w, 5 d)
- Availability
- Available
- LDAP User
- Unknown
- MediaWiki User
- Vort [ Global Accounts ]
Nov 17 2023
The run that executed last night was successful.
So if nothing goes wrong in the following days, this task will be closed.
Nov 16 2023
I started the conversion process, but it will take several days to set everything up correctly and check whether the tools work stably enough.
My tools are launched once a day, so for now I will wait.
Thank you.
I will proceed with learning how to migrate my tools next.
@taavi how do I apply this fix to my Toolforge account?
I probably have some old data stuck somewhere.
This is what I'm getting right now:
This is a reminder
Oct 7 2022
is resolved; if there is still some issue please create a new task. Thanks a lot!
My tools have been broken for a long time because no one from WMF wants to solve the problems with their hosting: T295220, T292289.
If you can check and confirm that my programs will work with the new system, I may try to migrate.
However, I expect that the new system will not bring a better level of support (because it depends on humans), so I have doubts about migration even if the programs will work.
Nov 6 2021
The language list has been moved here:
Oct 29 2021
@aborrero I made a simple test program, which should print the MediaWiki version.
Please try to run it from some other tool account: mono WikiTLSTest.exe.
Until it succeeds, I think it is too early to set the Resolved status.
Source code:
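The attached source is not reproduced here; a minimal sketch of such a TLS test (my assumption of its shape, not the original attachment), compiled with mcs WikiTLSTest.cs -r:System.Net.Http, could look like this:

using System;
using System.Net.Http;

// Sketch only: fetch the MediaWiki version over HTTPS and print it, so any
// TLS/certificate problem shows up as an exception instead of a version string.
class WikiTLSTest
{
    static void Main()
    {
        using (var client = new HttpClient())
        {
            string url = "https://ru.wikipedia.org/w/api.php" +
                "?action=query&meta=siteinfo&siprop=general&format=json";
            string json = client.GetStringAsync(url).Result;
            int i = json.IndexOf("\"generator\"");
            Console.WriteLine(i >= 0
                ? json.Substring(i, Math.Min(60, json.Length - i))
                : "generator field not found");
        }
    }
}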
@aborrero look at the lines in your quoted logs:
at /build/mono-5.12.0.226/external/boringssl/ssl/handshake_client.c:1132
5.12.0.226 is the system version, right?
A fresh version of Mono was needed to debug another issue.
I thought it was not used in the latest runs.
I will recheck and try to disable it.
Oct 22 2021
@aborrero my bots are located in the wikitasks tool.
You can test with ./run.sh wp_cyrlat and look at the output with cat wp_cyrlat.out | tail -n 30.
Oct 18 2021
@aborrero I think you are wrong about the Resolved status.
I still see Ssl error:1000007d:SSL routines:OPENSSL_internal:CERTIFICATE_VERIFY_FAILED on my bots.
Oct 6 2021
Try the --use-angle=d3d9 parameter if you are on Windows. It helps in my case.
Mar 4 2021
@AntiCompositeNumber, my bot is also having problems again: [2021.03.04 02:00:19] run froze. Do I need to reopen the issue?
Feb 25 2021
My bot run finished successfully a minute ago.
Feb 24 2021
Added debug information:
nonce: xoaxytctpjfyapbpbpqueyefjeqagimc
X-Request-Id: YDZLzE8c0wGk0qURc3HLRAAAAI0
-> fine
nonce: mxtcxoqgqagclbuhujwoxwazjunucbng
X-Request-Id: YDZLzYlPs8Iw@d1YyqpJaQAAAMQ
-> bug
My bot was working for months without such a problem.
It is set to pause on errors.
Today I noticed that it had not been running for a while,
and found that it froze on 2021.01.15.
I restarted it on Toolforge; the same problem happened today (2021.02.24).
I restarted it from my own PC - same error.
The bot is custom, but I think it is not reusing nonces:
https://github.com/Vort/WikiTasks/blob/dcef5143858b3a4f1fd0dfb0914c536f39dddd2f/MwApi.cs#L110
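For reference, the linked code is expected to generate a fresh nonce per request; a hypothetical sketch of such a generator (matching the 32 lowercase characters seen in the logged nonces above, not the actual MwApi.cs code) would be:

using System;

static class NonceSketch
{
    static readonly Random rng = new Random();

    // Hypothetical: build a new 32-character lowercase nonce for every request,
    // so no two requests share a value.
    public static string NewNonce()
    {
        var chars = new char[32];
        for (int i = 0; i < chars.Length; i++)
            chars[i] = (char)('a' + rng.Next(26));
        return new string(chars);
    }
}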
Dec 20 2020
I decided to visualize the thread count from 43 runs between 2020.11.08 and 2020.12.20.
To make it possible to see how often a specific thread count occurs, I added slight random noise.
Results: the thread count varies from 10 to 22, and most of the time it equals 14.
This means the 2g limit is good enough for this specific program.
(if each thread uses 64 MiB of virtual memory, then 2g allows 32 threads to run)
Nov 10 2020
Today's result was very strange.
It looks like Mono did not fix the problem, but hack-fixed it instead:
Scanning page titles[11][11][11][11][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][15][15][15][15][15][15][15][15][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][17][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][20][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][22][11][10][12][12][12][12][12][12][12][12][12][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][10][10][10][10][10][10][10][10][10][10][11
][11][11][12][12][12][12][12][12][12][12][12][12][12][12][12][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][15][15][15][15][16][16][16][16][16][16][16][16][16][16][10][10] Done
Nov 8 2020
I'm not sure if it is a better result or not.
Most likely, yes, but more data is needed:
Scanning page titles[11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][13][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][15][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16
][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16][16] Done
Nov 7 2020
So it looks like I need to build the latest Mono on Toolforge.
Nov 6 2020
So it looks like I need to build the latest Mono on Toolforge.
Again. Some time ago I reverted to the "default" Mono.
Mono 6.8.0.105 inside my VM:
vort@ubuntu:~/toolforge$ mono ToolforgeDebug.exe [11][11][11][11][11][11][11][11][11][11][11][12][12][12][12][12][12][12][12][12][12][12][12][12][12][12][12][12][12][12][12][12][12][12][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][11][10][10][10][10][11]^C
This is definitely not what happens on Toolforge.
I can't reproduce this locally, for some reason.
Nov 3 2020
It looks like the Mono ThreadPool has 3 thread counters:
gint16 num_active = counter._.starting + counter._.working + counter._.parked;
counter._.working should be limited by the counter._.working >= counter._.max_working condition.
Also, I was not able to find where the maximum value for completion_port_threads is used. Maybe Mono just ignores it.
So the thread pool should have no more than 4 working threads, plus some number of starting and parked threads.
If the working thread count exceeds 4, it is most likely a bug.
A closer recreation of what happens in my bot:
mcs ToolforgeDebug.cs -r:System.Net.Http
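The ToolforgeDebug.cs source itself is attached to the task rather than quoted here; an assumed shape of such a reproduction (repeated HttpClient requests, printing the OS thread count after each one; not the attached file) might be:

using System;
using System.Diagnostics;
using System.Net.Http;

// Assumed shape only: issue HTTP requests in a loop and print the process
// thread count after each, so runaway ThreadPool growth becomes visible.
class ToolforgeDebug
{
    static void Main()
    {
        using (var client = new HttpClient())
        {
            for (int i = 0; i < 200; i++)
            {
                client.GetStringAsync("https://ru.wikipedia.org/w/api.php").Wait();
                Console.Write($"[{Process.GetCurrentProcess().Threads.Count}]");
            }
        }
        Console.WriteLine();
    }
}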
Let's see how the thread count grows with the wp_cyrlat code.
I have changed the usual dots to the thread count in the program output:
//Console.Write('.');
Console.Write($"[{Process.GetCurrentProcess().Threads.Count}]");
Here is the result:
Scanning page titles[13][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][14][15][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][19][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21
][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21][21] Done
First of all, I want to point out one interesting thing.
Look at the thread IDs: 23881, 23882, ..., 23891: 11 of them were created sequentially.
The next 3 were created slightly later: 23894, 23898, 23899, and the next 2 after more delay: 23972, 23973.
And the last id, 24176, differs from the previous one by 203.
I suspect that ThreadPool.SetMaxThreads does not do what I think it does.
Either I do not understand its function correctly,
or it has bugs.
Now I am changing the max pool size to be equal to the min pool size.
I can't go lower than that with Mono (Windows .NET allows setting 1 / 1).
And I am adding an exception handler around all the program code:
ThreadPool.GetMinThreads(out workerThreads, out completionPortThreads);
ThreadPool.SetMaxThreads(workerThreads, completionPortThreads);
static void Main(string[] args)
{
    LowerThreadPoolLimits();
    try
    {
        new Program();
    }
    catch
    {
        Console.Write($"[ExceptionCaught,ThreadCount:{Process.GetCurrentProcess().Threads.Count},Pausing]");
        Thread.Sleep(Timeout.Infinite);
    }
}
The bug does not want to go away.
With pool size = 2 * minimum, another exception happened:
Unhandled Exception: Nested exception detected.
Original Exception:
  at System.Threading.Tasks.Task.ThrowIfExceptional (bool) [0x00011] in <71d8ad678db34313b7f718a414dfcb25>:0
  at System.Threading.Tasks.Task`1<string>.GetResultCore (bool) [0x0002b] in <71d8ad678db34313b7f718a414dfcb25>:0
  at System.Threading.Tasks.Task`1<string>.get_Result () [0x0000f] in <71d8ad678db34313b7f718a414dfcb25>:0
  at WikiTasks.MwApi.PostRequest (object[]) [0x00008] in <6ae7d1ae85b14f5b8c51c0c443295fee>:0
  at WikiTasks.Program.GetCyrLat (int,System.Collections.Generic.List`1<string>) [0x0000b] in <6ae7d1ae85b14f5b8c51c0c443295fee>:0
  at WikiTasks.Program..ctor () [0x0008f] in <6ae7d1ae85b14f5b8c51c0c443295fee>:0
  at WikiTasks.Program.Main (string[]) [0x00001] in <6ae7d1ae85b14f5b8c51c0c443295fee>:0
Nov 2 2020
Am I understanding correctly that this doubles the limit?
I have removed the OOM debugging code and added a call to this function:
static void LowerThreadPoolLimits()
{
    int workerThreads;
    int completionPortThreads;
    ThreadPool.GetMinThreads(out workerThreads, out completionPortThreads);
    ThreadPool.SetMaxThreads(workerThreads * 2, completionPortThreads * 2);
}
So it does not become a problem on other systems because they do not enforce limits on virtual memory?
What does ThreadPool.GetMaxThreads return?
The number of operations that can be queued to the thread pool is limited only by available memory. However, the thread pool limits the number of threads that can be active in the process simultaneously. If all thread pool threads are busy, additional work items are queued until threads to execute them become available. Beginning with the .NET Framework 4, the default size of the thread pool for a process depends on several factors, such as the size of the virtual address space. A process can call the ThreadPool.GetMaxThreads method to determine the number of threads.
You can control the maximum number of threads by using the ThreadPool.GetMaxThreads and ThreadPool.SetMaxThreads methods.
How many threads are running? There should not be many of them: 5 or so (1 main + 4 workers), maybe plus some threads used by the runtime library.
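A quick way to answer both questions at runtime (a diagnostic sketch, not code taken from the bot):

using System;
using System.Diagnostics;
using System.Threading;

// Diagnostic sketch: print the configured ThreadPool limits and the current
// OS-level thread count of the process.
class ThreadPoolInfo
{
    static void Main()
    {
        int minWorkers, minIo, maxWorkers, maxIo;
        ThreadPool.GetMinThreads(out minWorkers, out minIo);
        ThreadPool.GetMaxThreads(out maxWorkers, out maxIo);
        Console.WriteLine($"min worker/io threads: {minWorkers}/{minIo}");
        Console.WriteLine($"max worker/io threads: {maxWorkers}/{maxIo}");
        Console.WriteLine($"threads in process: {Process.GetCurrentProcess().Threads.Count}");
    }
}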
@zhuyifei1999 the OOM condition is triggered.
Please look at grid task #1861734.
Oct 28 2020
Is it running?
In this case VMS has nothing to do with actual memory use.
I have added catch (OutOfMemoryException e) to GZipUnpack with Syscall.kill(currentPID, Signum.SIGSTOP); as you said.
And I lowered -mem to 1536m.
If my changes are correct (and the OOM happens inside GZipUnpack as usual), then at some point the process will stop after the OOM is triggered.
If you have other ideas, you can test them too (as I said, this bot can be executed as many times as needed for debugging).
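A minimal sketch of the change described above (assuming Mono.Posix is referenced with -r:Mono.Posix for Syscall/Signum, and that GZipUnpack is where the OutOfMemoryException surfaces):

using System;
using System.IO;
using System.IO.Compression;
using Mono.Unix.Native;

static class OomDebug
{
    public static byte[] GZipUnpack(byte[] compressed)
    {
        try
        {
            using (var input = new MemoryStream(compressed))
            using (var gzip = new GZipStream(input, CompressionMode.Decompress))
            using (var output = new MemoryStream())
            {
                gzip.CopyTo(output);
                return output.ToArray();
            }
        }
        catch (OutOfMemoryException)
        {
            // Freeze the process so its state can be inspected before the
            // grid engine notices anything.
            int currentPID = Syscall.getpid();
            Syscall.kill(currentPID, Signum.SIGSTOP);
            throw;
        }
    }
}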
So there is no sense in using 512m. I need to try higher values.
My "library" uses async features, which means that C# creates several worker threads, usually equal to the CPU core count.
Does this mean that this feature automatically allocates core_count * 128 MB of memory?
That may be a large number. How many cores do the grid computers have?
While trying to get the exception, I found that exactly the same hang can also happen with 512m.
I will try a few more times.
As I understand it, 1637509 is unexpected incomplete output, correct?
Oct 27 2020
Ok, I can try this. But, of course, not before the exception, but after it.
There is nothing special about the time when it happens.
So there are two possibilities: to pause it at a random time, or after the exception.
The results should not differ.
The activity of this bot has almost no side effects, so it can be launched at any time and also interrupted at any time.
(It updates a table on ruwiki, but if the table is already updated, it will do nothing.)
The only requirement is an auth.txt file next to it with the OAuth keys of the bot account (ConsumerToken, ConsumerSecret, AccessToken, AccessSecret) (lines #22-25 of MwApi.cs).
@zhuyifei1999:
I have tried a lower memory setting, 256m:
qstat -j 1637509:
...
hard resource_list: h_vmem=262144k
...
usage 1: cpu=00:13:37, mem=41.73038 GB s, io=0.00016 GB, vmem=55.043M, maxvmem=55.043M
...
Oct 26 2020
Look at the attached wp_cyrlat.out.
.[11/61730/273265].[14/59280/265817].[20/61519/276691].[10/59246/261464].[13/62278/273618].[19/61048/271265].[26/52771/256674].[10/63377/277635].[16/61853/273123].[23/59548/278272].[8/64609/282540].[10/61280/265858]
means that the program had these states: ..., 11 MB, 14 MB, 20 MB, GC(?), 10, 13, 19, 26, GC(?), 10, 16, 23, GC(?), 8, 10, ...
So temporary objects were cleared every 3-4 iterations.
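For context, a sketch of how such a per-iteration memory figure could be produced (assuming the first field is the managed heap size in MB; the other two fields are bot-specific counters whose meaning is not given here):

using System;

class MemTraceSketch
{
    static void Main()
    {
        // Sketch only, not the bot's code: the managed heap size in MB is the
        // kind of number the first field of the trace above appears to show.
        // The real trace has two additional bot-specific fields.
        long mb = GC.GetTotalMemory(false) / (1024 * 1024);
        Console.Write($"[{mb}]");
    }
}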
Thanks. I will try 4g. But it is, of course, not a solution.
If a 50 MB program requires 4 GB, then a 500 MB program will just fail.
So it is better to find the real cause.
Sep 10 2019
@MusikAnimal what external links are you talking about? The problem is not in external links.
And the Pageinfo-header is showing (look at the "За последние 30 дней" ("in the last 30 days") string).
The bug is only in the display of the charts. This image fails to load:
https://ru.wikipedia.org/api/rest_v1/page/graph/png/Москва/0/b22a95b9d24f05183cbb9bfab90772bb58c13a8c.png
(error: {"type":"https://mediawiki.org/wiki/HyperSwitch/errors/unknown_error","method":"get","uri":"/ru.wikipedia.org/v1/page/graph/png/%D0%9C%D0%BE%D1%81%D0%BA%D0%B2%D0%B0/0/b22a95b9d24f05183cbb9bfab90772bb58c13a8c.png"})
While this one is fine:
https://ru.wikipedia.org/api/rest_v1/page/graph/png/Участник%3AVort%2FЧерновик/0/b22a95b9d24f05183cbb9bfab90772bb58c13a8c.png
Sep 7 2019
@Aklapper here it is: T232254. Please hide it since it contains private data.
When the attack started, I was able to access Wikipedia from time to time.
But now the only way for me to access it is through Tor (my actual location is Ukraine).
If you banned my addresses as a security measure, please unban them.
Aug 15 2019
@Wang_Qiliang I don't see application/x-www-form-urlencoded there.
The only noticeable thing is that the .png is downloaded as image/webp.
Aug 8 2019
T221980 is more like the underlying problem (https://ru.wikipedia.org/w/index.php?title=Юй_Чжидин&diff=101429508) than refactoring.
Jul 6 2019
He answered.
It may be two clicks. If a user has article A open, he may copy the title of article B, which is contained inside A, then go to the Q-item of A, add a property, and paste the title of B into the value field.
I will ask one of the users about the scenario, but there is not much hope of a precise answer.
Jun 28 2019
Here is a screenshot of the laggy connection:
As you can see, the packets are 622 bytes in size, received every second, which means a speed of about 0.5 KiB/s.
And there are no retransmissions. The network connection works fine from my side.
I reproduced the problem just now, on ruwiki.
Jun 23 2019
I did not read all the comments, but I want to say that this problem is way older than several weeks.
It has occurred for years, but was very rare.
Jan 14 2019
I thought that the results were limited, not the requests.
Jan 7 2019
Did anyone actually test that increase in first paint time?
Dec 22 2018
The bug with Q59641456 (part two) no longer reproduces.
But empty values can still be seen with a request like this (part one):
SELECT ?item ?title WHERE {
  SERVICE wikibase:mwapi {
    bd:serviceParam wikibase:endpoint "ru.wikipedia.org" .
    bd:serviceParam wikibase:api "Generator" .
    bd:serviceParam mwapi:generator "search" .
    bd:serviceParam mwapi:gsrsearch "intitle:/Дом[-]музей [БК]о/" .
    bd:serviceParam mwapi:gsrlimit "max" .
    ?item wikibase:apiOutputItem mwapi:item .
    ?title wikibase:apiOutput mwapi:title
  }
}
They become a problem when I want to fetch sitelinks (the result is a query timeout or a browser tab crash):
SELECT ?item ?lang WHERE {
  SERVICE wikibase:mwapi {
    bd:serviceParam wikibase:endpoint "ru.wikipedia.org" .
    bd:serviceParam wikibase:api "Generator" .
    bd:serviceParam mwapi:generator "search" .
    bd:serviceParam mwapi:gsrsearch "intitle:/Дом[-]музей [БК]о/" .
    bd:serviceParam mwapi:gsrlimit "max" .
    ?item wikibase:apiOutputItem mwapi:item .
  }
  ?sitelink schema:about ?item .
  ?sitelink schema:inLanguage ?lang
}
But I'm not sure if such behavior is wrong.
Sep 7 2018
I can't reproduce it anymore.
How can I use this feature until the new version is released?
I can't roll back to the previous one - it is blacklisted for some reason.
Are there any nightly builds of AWB?
Sep 6 2018
The first report was made ~3 hours ago (13:46 UTC) on the ruwiki forum.
Sep 22 2017
Would you please expand on what you mean by pressing "quickly" - in relation to how quickly after the diff appears, how long the keys are held down, or something else?
Mar 22 2016
In ru-wiki, this problem hit the Special:Watchlist page.
So I think this bug needs a critical priority.
Dec 9 2015
Also observed with Opera 12.17.