- pdurbin: Thanks!
- pdurbin: mih: I hate it when our APIs break. 😠 Can you please open an issue about this at https://github.com/IQSS/dataverse/issues so we can at least document the change properly? 🙏
- mih
- pdurbin: mih: thank you!!
- Heh. 🍻. So you switched to a different API? That's fine. Another fix might be this: https://github.com/gdcc/pyDataverse/pull/145 (adding application/json as a header)
- But that was an old backward-incompatible change. It seems like yours might be newer. You're pretty sure it worked fine with 5.12?
- I should see if Jan Range has time to take a look at this.
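(For anyone landing here with the same breakage: the idea behind that pyDataverse fix, translated to a raw curl call, is simply to send an explicit Content-Type header. A sketch only; the server URL, token, and target endpoint below are placeholders, and the command is printed rather than executed:)

```shell
# Sketch: send an explicit Content-Type: application/json header, the same idea
# as pyDataverse PR #145. SERVER_URL and API_TOKEN are placeholders.
SERVER_URL="https://demo.dataverse.org"
API_TOKEN="xxxxxxxx-xxxx-xxxx"

# Print the curl invocation instead of running it (dry run).
build_cmd() {
  echo "curl -H 'X-Dataverse-key: $API_TOKEN' -H 'Content-Type: application/json'" \
       "-X POST '$SERVER_URL/api/dataverses/root/datasets' --upload-file dataset.json"
}
build_cmd
```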
- Don Sizemore: @pdurbin I have something of a conundrum if you have a minute at some point today
- pdurbin: Don Sizemore: the node version thing?
- Don Sizemore: @pdurbin yeah, I think I know how I'm going to work around it, I just wish things were cleaner.
- pdurbin: what do they call that thing, curl shell
- Don Sizemore: I was discouraged from using that in Dataverse-Ansible, many years ago ;)
- shebang
- pdurbin: yeah, but more specific
- Don Sizemore: RCE?
- pdurbin: curl pipe bash?
- pipe installers?
- anyway, yeah, probably an antipattern
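(For the record, the pattern being named above, and the usual safer alternative. The URL is a placeholder and the commands are echoed rather than run:)

```shell
# "curl pipe bash": fetch a script and execute it sight unseen - convenient,
# but an antipattern, since you run whatever the server happens to return.
url="https://example.com/install.sh"   # placeholder URL
echo "curl -fsSL $url | bash           # the antipattern: executes unreviewed code"
echo "curl -fsSL -o install.sh $url    # safer: download,"
echo "less install.sh                  # review,"
echo "bash install.sh                  # then run"
```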
- sythemeta847 joined the room
- prsridha joined the room
- prsridha: Hello, I seem to be having an issue with delete - when I select less than a page of files and click "delete," the deletion is completed successfully. However, when I try to delete all 426 files in the dataset by selecting "Select all 426 files in this dataset" and clicking "delete," nothing happens. Could you let me know how to go about this?
- pdurbin: prsridha: hmm. I'm asking in Slack. Not sure, to be honest. I suppose you could write a script to delete them in batches.
- pdurbin: What version of Dataverse are you running?
- And can you reproduce it at https://demo.dataverse.org ?
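(A sketch of the batch-deletion script pdurbin mentions. The endpoints, DOI, and file IDs below are assumptions to check against the API guide for your Dataverse version - older releases delete files via the SWORD API instead of the native one. Commands are echoed rather than executed:)

```shell
SERVER_URL="https://demo.dataverse.org"   # placeholder
API_TOKEN="xxxxxxxx"                      # placeholder
# File IDs could be collected first from the dataset's file listing, e.g.:
#   curl -H "X-Dataverse-key:$API_TOKEN" \
#     "$SERVER_URL/api/datasets/:persistentId/versions/:latest/files?persistentId=$PID"

# Print one DELETE call per file ID (dry run).
delete_files() {
  for id in "$@"; do
    echo curl -H "X-Dataverse-key:$API_TOKEN" -X DELETE "$SERVER_URL/api/files/$id"
  done
}
delete_files 101 102 103   # hypothetical file IDs
```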
- therucidlacey joined the room
- Roy Pardee (KaPoW) joined the room
- Roy Pardee (KaPoW): If I have a pretend dataset in demo.dataverse.org, and click the 'contact owner' button, I should get an e-mail at the address I registered as owner of said data, right?
- juliangautier: Heya Roy Pardee (KaPoW): The email address in the metadata field called Point of Contact E-mail will get any emails sent through that Contact Owner form
- Roy Pardee (KaPoW): Thanks juliangautier! So that should be me. I wonder if it's getting caught in a spam-trap or something...
- Mostly I just wanted to (dis)confirm that the demo site does indeed send e-mails.
- juliangautier: Yeah it does. I just tried and got the email in my inbox. There's a known issue with emails sent through Dataverse being flagged as spam. But I'm not sure of the technical details enough to say if this is related
- At the Harvard Dataverse we seem to be reevaluating something related to what's described in https://github.com/IQSS/dataverse/issues/4580
- Roy Pardee (KaPoW): Cool cool, thanks!
- One other question, possibly out of scope here. If my org is super-paranoid about actually putting substantive data out of our local control, would it be acceptable use of Dataverse to register the existence & metadata of various potentially-shareable-under-the-right-circumstances datasets, but not actually upload that data? The idea being that candidate users will use the 'contact owner' function to direct requests to us for processing?
- pdurbin: Roy Pardee (KaPoW): sure, an example is this dataset of Facebook URLs where the data itself is hosted at Facebook. We sometimes call this a "metadata only" dataset (or record): https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/TDOAPG
- poikilotherm (CEST,UTC+2): Roy Pardee (KaPoW): to extend on this a bit: making data FAIR does not require public availability of the data. Metadata publications are pretty common and one of the usual use cases of Dataverse. It is usually called a "registry" in these cases instead of a "repository". In our installation, we try to focus on the metadata and keep the data out - mostly not because of paranoia but because our institutes very often do not get long-term data storage wrong, and we see not much use in duplicating the data. Also, we have a lot of data that is rather large (TB/PB) and that is simply impossible to keep a copy of. In these cases, metadata publications make a pretty good alternative, as the data is at least FAIR and referenceable.
- (We actually did take this one step further and added a new field type to allow real URLs. This is a fork... https://data.fz-juelich.de/guide/juelich/data-linking.html)
- prsridha: Thanks for looking into it! This is how it's looking from my end. I have deleted the files manually for now.
- pdurbin: prsridha: manually? In smaller batches? One at a time? Using the API? Sorry, I'm not sure what you mean. 😅 Thanks for the screenshot! Is that from the demo server? Or your own installation?
- Pallinger Péter: Hello! One of our test servers fails to run the Dataverse installer (I tried to install v5.13) due to "Internal Exception: java.sql.SQLException: Error in allocating a connection. Cause: Access denied to execute this method : setURL". I am a bit stumped as to how to debug this issue. I could just reformat the whole VM, but I would like to know why this problem occurs...
- It would seem that the SQL password is bad, but it is the same as on our other test server, which is working fine...
- First, I would like to know how the POSTGRES_PASSWORD in the default.config is used... How can I check whether it is set correctly in Payara?
- Pallinger Péter: Or could the cause possibly be that setURL is an SQL function that is not defined in the current database?
- pdurbin: I only see setURL once in the codebase and it's not a SQL thing:
  $ ack setURL src
  src/main/java/edu/harvard/iq/dataverse/DatasetVersion.java
  1609: public String getReturnToDatasetURL(String serverName, Dataset dset) {
- Pallinger Péter: I think it is in EclipseLink, org.eclipse.persistence. ... I dug somewhat into the source code after examining the stack trace...
- It would seem that I somehow mis-set the SQL connection parameters, but I am still not sure what parameters the server uses. The postgres server itself does not report any connection attempts in the log, although I dialed up the log level to DEBUG1...
- Which would point to the server address being mis-set, but it seems correct in domain.xml (is that the correct place for SQL connection params like dataverse.db.host?).
- I can manually connect to postgres with psql, giving the apparently same parameters that dataverse is set to use...
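(One way to cross-check what the app server will actually use: read the dataverse.db.* settings straight out of domain.xml, then try those same credentials by hand with psql, which also exercises the network path. DOMAIN_DIR assumes a standard Payara 5 layout and is an assumption to adjust:)

```shell
# Sketch: show the JDBC-related settings Payara will use for Dataverse.
# DOMAIN_DIR assumes a standard Payara 5 layout; adjust for your installation.
DOMAIN_DIR="${DOMAIN_DIR:-/usr/local/payara5/glassfish/domains/domain1}"
grep -E 'dataverse\.db\.' "$DOMAIN_DIR/config/domain.xml" 2>/dev/null \
  || echo "no dataverse.db.* entries found under $DOMAIN_DIR"
# Then try the same credentials manually (host/user/db names are placeholders):
#   PGPASSWORD="$DB_PASS" psql -h "$DB_HOST" -U dvnapp dvndb -c 'select 1;'
```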
- pdurbin: OK. So you're saying if I use the wrong password for postgres I should expect this error?
- Pallinger Péter: Maybe; I will try to set the wrong password on my other test server...
- Pallinger Péter: Unfortunately, the password error is different: FATAL: password authentication failed for user "dvnuser" - so I am again stumped about the cause of the above setURL error...
- pdurbin: Any ideas, poikilotherm (CEST,UTC+2)?
- poikilotherm (CEST,UTC+2): setURL is coming from the underlying JDBC class and refers to some wrong JDBC URL
- Please check your credentials
- AFAIK the installer is battle tested to create the aliases etc
- But of course something might have gone wrong there
- pdurbin: containerization working group meeting in 5 minutes: https://ct.gdcc.io
- Miguel Silva: As soon as I finish prototyping my smart DAQ with swarm capabilities I intend to take a peek at the dataverse source code to propose some improvements in regards to live data acquisition and open science ... Almost there
- Miguel Silva (in reply to pdurbin's meeting announcement): Here's the newest smart DAQ with SWARM capabilities
- Available on my public GitHub profile for anyone to use
- ( Open hardware with CC licensing )
- @archive:matrix.org joined the room
- pdurbin: New logs for this channel! Check 'em out! https://archive.matrix.org/r/dataverse:matrix.org
- pdurbin: I just switched us over: https://github.com/IQSS/chat.dataverse.org/pull/24
- poikilotherm (CEST,UTC+2): Fancy pants!
- pdurbin: I know right
- pdurbin: My daughter is stage crew for her 8th grade musical, Finding Nemo Jr., premiering two hours from now! I'm off! Have a great weekend! 🐟️
- imlostlmao joined the room
- tosigus joined the room
- Don Sizemore: @poikilotherm dataverse.files.directory is only used for temporary files these days, correct? I'm asking because I ran across target/classes/META-INF/microprofile-config.properties:dataverse.files.directory=/tmp/dataverse
- poikilotherm (CEST,UTC+2): No it's not!
- There is a list in the docs where this stuff is used
- (I asked myself the same thing when going for MPCONFIG)
- This is a sane default for most installations and development
- Btw this directory is the default which had been present in the code before as well
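(For reference on that MPCONFIG point: a property like dataverse.files.directory can be overridden without touching the built-in /tmp/dataverse default, e.g. via MicroProfile Config's standard environment-variable mapping, where dots become underscores and the name is upper-cased. The path below is just an example:)

```shell
# MicroProfile Config maps dataverse.files.directory to this env var name;
# the value in microprofile-config.properties is only the fallback default.
export DATAVERSE_FILES_DIRECTORY=/usr/local/dvn/data   # example path, an assumption
echo "dataverse.files.directory -> ${DATAVERSE_FILES_DIRECTORY}"
```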
- pdurbin: a pre-existing condition 🤒
- Don Sizemore: @poikilotherm if u hapi i'm hapi, just double-checking