Hi! Template:Number of main statements by property (and similar templates) were last updated on 27 June. Will they be updated soon? These templates are very useful for historical reference. Thanks!
User talk:DeltaBot
Yes, something broke here. I need to investigate, as the error I am seeing might relate to a temporary problem with the query service.
Earlier today an update of these lists was successful.
As far as I am aware, the code technically works, but I/O issues with the query service make the task somewhat vulnerable. In essence, ~50,000 queries to WDQS (over ~36 hours) have to be run per task execution, and if the query service is unavailable at some point during that phase, the entire task result is lost. Some more caching and handling of connection errors would probably help here, but I need to find some time to get this done ...
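To illustrate the kind of hardening meant here, a minimal sketch of caching plus retry around individual queries, so a transient WDQS outage does not discard ~36 hours of work. `run_query` is a hypothetical stand-in for the actual WDQS HTTP call, and the in-memory `cache` dict could just as well be backed by files on disk:

```python
import time

def query_with_retry(run_query, query, cache, max_retries=5, base_delay=1.0):
    """Run a WDQS query with caching and retry on connection errors.

    run_query: callable taking the query string and returning results
               (hypothetical stand-in for the actual WDQS HTTP call).
    cache:     dict mapping query string -> result, so a restarted task
               does not redo already-answered queries from scratch.
    """
    if query in cache:  # already answered during this task execution
        return cache[query]
    for attempt in range(max_retries):
        try:
            result = run_query(query)
        except ConnectionError:
            # back off exponentially instead of losing the whole task
            time.sleep(base_delay * 2 ** attempt)
            continue
        cache[query] = result
        return result
    raise RuntimeError(f'query failed after {max_retries} attempts')
```

With something like this, a brief query-service hiccup costs a few retries on one query rather than the entire task result.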
Hello,
would it be possible to cross-reference disambiguation items with the corresponding family-name items, for example:
- d:Q108560671 <--> d:Q105537041
Thanks a lot!
Plenty of things are certainly possible, but I do not have time to write more scripts at the moment (and this will not change any time soon). I am just trying to keep the current lineup of DeltaBot jobs alive.
Don't you have a bot account as well?
No, I don't have a bot account.
Oh, now I'm surprised. Don't you have a GitHub account with plenty of automatable Perl scripts? Or was that someone else?
The scripts depend on the results of tools like PetScan, the Wikidata SPARQL query service, HarvestTools, etc. as input, and mainly convert, prepare, and create statements for QuickStatements; they cannot be run as a bot without modification. Therefore, the scripts have to be started manually, and the individual steps and (intermediate) results can be checked by the person running them before the QuickStatements are executed.
Hello, it seems that DeltaBot stopped creating disambiguation items on 4 August 2024:
Currently there are 23 unconnected disambiguation pages in the German-language Wikipedia:
- https://petscan.wmcloud.org/?psid=29050009&al_commands=P31%3AQ4167410
- https://de.wikipedia.org/wiki/Spezial:Nicht_verbundene_Seiten?namespace=0
Maybe DeltaBot needs to be restarted?
Thanks a lot!
Indeed, it had crashed, but it has been restarted to catch up.
https://www.wikidata.org/w/index.php?title=Q9359273&diff=prev&oldid=2193016066
Another wrong merge: Urmensch (Q15852236), description "Der frühere Mensch" ("the early human") = Wikimedia disambiguation page.
The bot reacts to sitelink modifications by other users. If those are wrong, the bot makes incorrect edits. Both situations are resolved now.
The bot should stop performing such obviously wrong merges.
Given how low the error rate is, I am not going to change anything.
What is the error rate?
Low. This is the first complaint in a long time, although the bot is pretty active with this task.
Do you have a number for the rate?
No instantaneous numbers are available; since I am currently away on vacation, I do not have the equipment to calculate one within the next few days. However, all the data necessary to estimate an error rate is publicly available, so feel free to compute it yourself.
Related: https://www.wikidata.org/w/index.php?title=Q60409524&diff=prev&oldid=2214215734
@Kolja21: maybe DeltaBot should take the type into consideration in general; merging a name item with a family item is not so good. https://www.wikidata.org/w/index.php?title=Q56539311&action=history ... And even if the type matches, e.g. two human items, who knows if the rest belongs to the merge target?
+1. Merging a family name (Familienname, Q101352) with a single family (Familie, Q8436) should not happen.
The source code for the involved bot jobs is here: https://github.com/MisterSynergy/deltabot-scripts/blob/master/incomplete_mergers/incomplete_mergers.py and https://github.com/MisterSynergy/deltabot-scripts/blob/master/missing_redirect/missingRedirect.py. The jobs run hourly at :45 and :35, respectively.
Feel free to offer improvements. There are currently a couple of checks on the state of the involved items, but nothing domain-specific. Generally the jobs work pretty well with very few complaints, as they are all follow-up actions on previously incomplete (mostly) human edits.
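For illustration, a minimal sketch of what such a domain-specific guard could look like. It assumes item data in the JSON shape returned by the wbgetentities API; the blacklisted class pairs are just examples of combinations discussed above, not an exhaustive list:

```python
# Illustrative blacklist of instance-of (P31) class pairs that should
# block an automatic merge. The QIDs come from the discussion above.
INCOMPATIBLE = {
    frozenset({'Q101352', 'Q8436'}),   # family name vs. family
    frozenset({'Q4167410', 'Q5'}),     # disambiguation page vs. human
}

def p31_values(item):
    """Extract the instance-of (P31) class QIDs from an item's JSON."""
    claims = item.get('claims', {}).get('P31', [])
    return {
        c['mainsnak']['datavalue']['value']['id']
        for c in claims
        if c['mainsnak'].get('datavalue')
    }

def merge_allowed(item_a, item_b):
    """Return False if any P31 pair of the two items is blacklisted."""
    for a in p31_values(item_a):
        for b in p31_values(item_b):
            if frozenset({a, b}) in INCOMPATIBLE:
                return False
    return True
```

A check like this would not catch every bad merge (two human items can still be distinct people), but it would block the obviously incompatible type combinations cheaply before any edit is made.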
Are there lists of edits for each job?
No.
The source code for all jobs is at GitHub (https://github.com/MisterSynergy/deltabot-scripts), and job status and execution schedules can be found at https://k8s-status.toolforge.org/namespaces/tool-deltabot/.
There are also a couple of log files in case a job crashes, but there is no full list of edits per job. You need to infer from the execution time and the performed action what was done.
Adding "topic's main template = Template:School Districts in Maryland" to school district in the United States (Q15726209)
See https://www.wikidata.org/w/index.php?title=Q15726209&action=history
Q20325506#P1423 is the reason.
Hi.
First of all, many thanks for the patient and useful work.
Could you please avoid creating new items for en:Wikipedia pages whose title ends with "(disambiguation)", or fr:Wikipedia pages ending with "homonymie"? Just an example: Q126023680 (Cayton) was created although Q29597956 already existed. Adding the Wikipedia link to the proper Wikidata item would avoid manual merges.
I hope my request is clear and this is not too much work to update.
One more request: could you please add an accent to the word "Wikimedia" in new_disambiguation_pages.py? Wikimédia is the French translation.
Line 65:
{
    'language': 'fr',
    'site': 'frwiki',
    'project': 'wikipedia',
    'category': 'Homonymie',
    'description': 'page d\'homonymie de Wikimedia',
}
Hi.
Is this too complex, or are my requests unclear?
It was simply forgotten, I am sorry.
The French translation is fixed, thank you for the input.
As for the other request… From experience, it is often better to import the sitelink/item anyway, even if there is a possibility that this results in a duplicate item that needs to be merged, rather than to leave it alone and hope that someone or some other bot picks it up. I would prefer to keep it as is.
I noticed that a lot of the same people end up nominating duplicate items for deletion. I know this bot leaves a comment on the deletion request closing it and directing the nominator to the merge help page, but I think very few people actually read them, because the deletion requests are archived very soon after. Would it be possible to make the task also leave a message on the nominator's talk page, limited to one message per user so as not to inundate them?
Thank you for the input.
I think this should be done by involved users rather than by a bot. Usually such hints trigger follow-up questions that a bot cannot answer, and I have no capacity to do this myself.
Thus, I would recommend approaching users yourself if you deem this necessary. It would probably be much more helpful.
Thanks for taking the time to reply. I'll definitely think about the personal approach.
I notice that DeltaBot has recently made changes to DrugBank ID, such as [https://www.wikidata.org/w/index.php?title=Q126500299&curid=120522144&diff=2176920362&oldid=2176535167 here], that break the external link. In this example, the working link to https://go.drugbank.com/drugs/DB06592 has become a broken link to https://go.drugbank.com/drugs/06592. Can someone please have a look? Thank you.
It seems the format of the identifier has recently been changed. There is a fixClaims job defined on User:DeltaBot/fixClaims/jobs which needs to be adapted as well if this is a persistent change. Identifier format changes are usually bad practice, but I am not sure what the background is in this situation.
I have recently updated all the DrugBank IDs to match the correct format using QuickStatements, but today I noticed that your bot reverted all my edits. The correct format should include the "DB" prefix, and User:DeltaBot/fixClaims/jobs has to be updated to reflect this.
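For reference, a minimal sketch of the normalization described above, assuming the correct format is "DB" followed by five digits (as in DB06592 from the broken-link example); anything else is left untouched for manual review:

```python
import re

def normalize_drugbank_id(value):
    """Ensure a DrugBank ID carries the 'DB' prefix, e.g. '06592' -> 'DB06592'.

    Assumes the expected format is 'DB' plus 5 digits, per the
    discussion above; unexpected values are returned as None so
    they can be reviewed manually instead of being mangled.
    """
    if re.fullmatch(r'DB\d{5}', value):
        return value            # already in the correct format
    if re.fullmatch(r'\d{5}', value):
        return 'DB' + value     # bare number: restore the stripped prefix
    return None                 # unexpected format, do not touch
```

A one-way rule like this (only ever adding the prefix, never stripping it) would have avoided the bot and the manual edits reverting each other.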
I have removed the job completely.
I had carefully corrected Xavier Stouff (Q3570797), as Jules Tannery and Paul Appell are NOT his PhD advisors, and I explained the situation in detail on the discussion page (I also gave serious references, which the Mathematics Genealogy Project is not). DeltaBot has put their names back as PhD advisors. I have now corrected the pages of Tannery and Appell (as far as Stouff is concerned; in fact, the other names are probably also false) and will thus correct Stouff's page again. Best
Hi, DeltaBot starts updating the "Humans with missing claims" pages and then hangs on one page. Now P1340 was the last; last week it was P2190.
While I'm here, I'll ask: could the update be scheduled for Friday? At the end of the week, there is more time to repair the items.
Thanks if you do
Will look into this later this week.
It seems there is some read timeout when the bot interacts with the MediaWiki API (via pywikibot), which apparently ultimately results in an edit conflict and crashes the bot … (?!)
No idea what is going on here to be honest, and I think I might have seen these read timeouts in some of my other bots as well. Needs further investigation for sure.
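One possible mitigation, sketched generically: instead of letting a read timeout or the resulting edit conflict kill the whole job, re-fetch the page and retry the save a few times. `fetch_page` and `save_page` are hypothetical stand-ins for the corresponding pywikibot calls, and the exception types are placeholders for whatever pywikibot actually raises here:

```python
def save_with_retry(fetch_page, save_page, title, new_text, max_retries=3):
    """Retry a page save after a read timeout or edit conflict.

    fetch_page/save_page are hypothetical stand-ins for the pywikibot
    calls; on each attempt the page is re-fetched so the save runs
    against the latest revision instead of crashing the whole job.
    """
    last_error = None
    for _ in range(max_retries):
        page = fetch_page(title)  # pick up the latest revision
        try:
            save_page(page, new_text)
            return True
        except (TimeoutError, RuntimeError) as err:
            last_error = err      # transient failure: try again
    raise last_error
```

This would not explain the root cause of the timeouts, but it would keep a single flaky API response from stopping the list updates at one property.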
Anyways, I have rescheduled it to be run on Fridays at 15:50 UTC instead of Sundays at the same time.
Thanks for rescheduling, we'll see if it runs on Friday.
Hello MisterSynergy! Now stopped at P1006.