backup data in JATOS

Hey, so I have a problem on my JATOS server: when I try to access data from a recent, large online study, the server crashes and needs to be restarted, preventing me from seeing or accessing my collected data. I have been told that this crash may be the result of a bug in JATOS and that updating JATOS may resolve the problem. However, it is also strongly recommended that I back up all my data before updating JATOS, as there is a chance it will be lost.

So here is the conundrum - I can't access my data unless I update JATOS, and I can't update JATOS until I back up my data, and I can't back up my data until I access my data...

Any recommendation? Should I just roll the dice and hit update? Any help is appreciated!


  • Hi dockasaurusrex,

    Although I have done the update on my JATOS server a hundred times without any problems, I would not just do it without a backup of my precious data, just in case the 101st time fails.

    Yes, there is this issue with JATOS up to version 3.3.6 where large result data sets can crash the server because it runs out of memory. The problem was that if you tried to export all your result data at once, JATOS internally first loaded it all into memory and only then sent it to the client. This leads to crashes when memory is low. The problem is mostly solved from version 3.4.1 on: the data is streamed directly from the database to the client without loading it all into memory.

    Coming to your conundrum, how to update without losing data, I can see two possibilities: 1) export the result data in batches (instead of all at once), or 2) if you have access to the server and know your way around a terminal, you can back up the files manually.

    1) Instead of clicking Export Results -> All, try to select only a couple of results at a time and export them via Export Results -> Selected. This way you will take the strain off your JATOS' memory, and you can still export everything step by step in batches.

    2) A manual backup isn't too difficult if you know a bit of terminal. But first: you have to stop JATOS before doing any backup - otherwise you can corrupt your database and lose all your data. Then, basically, all data are stored in a couple of folders residing in your JATOS installation folder. Those are: study_assets_root, study_logs and database (if you didn't set up your own MySQL). What I usually do is just make a copy of the whole JATOS installation folder as a backup. Then, if something goes wrong with the update, you can just overwrite the bad JATOS with the copy.
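
    If it helps, the copy step could look something like the sketch below. The mock folders are created only so the demo is safe to run anywhere; on a real server you would set JATOS_DIR to your actual installation path (an assumption here) and stop JATOS first.

```shell
# Sketch of a manual JATOS backup. The mock installation below stands in
# for a real one; on a real server, point JATOS_DIR at your JATOS folder
# and stop JATOS first (e.g. with its loader script) before copying.
JATOS_DIR="$(mktemp -d)/jatos"
mkdir -p "$JATOS_DIR/study_assets_root" "$JATOS_DIR/study_logs" "$JATOS_DIR/database"

# Copy the whole installation folder; the data folders come with it.
BACKUP_DIR="${JATOS_DIR}_backup_$(date +%Y-%m-%d)"
cp -r "$JATOS_DIR" "$BACKUP_DIR"

# The backup now contains study_assets_root, study_logs and database.
ls "$BACKUP_DIR"
```

    If the update then goes wrong, restoring is the same copy in reverse: stop JATOS and replace the broken installation folder with the backup.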

    There is also a page in the docs about it:

    But don't forget to stop JATOS before you copy any folders.

    I hope it works for you :)


  • Hey Kristian,

    Thanks a load! I was able to update JATOS, but it seems that I still cannot access my data. My experiment has 150 real participants, but the data folder has over 400 results, seeming to suggest that about 250 people started the program and dropped out. In any case, every person's data was saved in a file, which probably adds up to a huge dataset.

    I don't think there is any problem with exporting in batches; the problem is that I never even get the opportunity to export. When I select 'Results' under the experiment tab, JATOS works for a moment and then displays the message 'Cannot read results data.'

    Any ideas? Would be a bummer if I lost 150 participants worth of data

  • Hi dockasaurusrex,

    When you say

    every person's data was saved in a file

    do you mean, you used the jatos.submitResultData or jatos.appendResultData functions to send result data back to JATOS? Or did you actually upload files with jatos.uploadResultFile? The latter one is only available since 3.5.1 and since you said you just upgraded, it's unlikely you used jatos.uploadResultFile. I'm just confused.

    But I can completely understand you, losing 150 participants' data is bad.

    Does the same error happen if you try one of the other result pages? I assume you click on the 'Results' button in the study toolbar, but there are others: each component has one, and going to the 'Worker & Batch Manager' there are ones for each batch and for each worker. Do they all show 'Cannot read results data.'?

    And a second question: what does JATOS' log say? You can see the logs (if your JATOS user is an admin user) by going to my-jatos-domain/jatos/log. The logs from when you try to access the results page are especially interesting. Can you please post them here?



  • Hey Kristian,

    Thanks again for getting in touch. I've been able to make some progress by following your advice; I am now able to display my data files. The way I've done this is a bit convoluted, but I'll describe it below, even though it may not be necessary for answering my question, which I will post even further below:

    So I ran my experiment over Prolific, and I passed my participants the MTurk worker login so that they would be registered as MTurk workers rather than general multiple workers (GMW). This allowed me to pass on their Prolific subject IDs so I could track them, something I couldn't figure out how to do using the GMW IDs. I see that, now that I've updated, this method is no longer possible. After updating I was able to look at my data and everything seemed fine, so I ran another batch of participants; after this I again could not access my data (it was then that I wrote my previous message). I see that after the latest update my new participants were registered as GMWs, which is fine but less than ideal. Given that my new participants are a different worker class than my old participants, I was able to access only my new data using the worker and batch manager link. This way my data was effectively chunked, and it seems that JATOS can handle the data.

    I was able to export my new GMW worker data fine, but my MTurk data continues to have problems. Specifically, there are a large number of participants whose data I cannot export. I have tried exporting them one by one, and for certain subjects I get an error stating 'undefined'. I have copied the log output for you here:

    2020-03-18 13:29:14,928 [WARN] - o.h.e.l.i.CollectionLoadContext - HHH000160: On CollectionLoadContext#cleanup, localLoadingCollectionKeys contained [1] entries
    2020-03-18 13:29:14,927 [WARN] - o.h.e.l.i.LoadContexts - HHH000100: Fail-safe cleanup (collections) : org.hibernate.engine.loading.internal.CollectionLoadContext@253cb728

    This is what I see when I try to open the files that return 'undefined'. It happens to be the same log output I see when I try to access the data on the results tab and get the error 'Cannot read results data', which I reported earlier.

    What do you think? Any hope?

    Thanks for your time!

  • Hi, this is not directly related to your question (which Kristian will try to answer later), but for future reference: we recently updated the docs to explain how to pass query parameters (with the Prolific ID) to JATOS. You need to do that from Prolific, explained here:

    Hope this helps for the future



  • Thanks Elisa! V helpful

  • Hey yall, just an update on my end:

    While initially I was able to see my data, after trying to download it I can no longer see it if I navigate away from the results page and come back, seeming to suggest that the act of exporting bugs the data files in such a way that JATOS rejects them outright. Furthermore, I share my JATOS server with another researcher who also has large datasets, and when trying to export his results the same thing happens (initially viewable under the results tab, error when attempting to export, followed by inability to view the data under the results tab). As a result, all large datasets in our JATOS are inaccessible. The error message in the log is the same as above.

    Any thoughts? Best,

  • And just another update:

    It seems I have been able to solve the problem through the combination of two things: restarting the server that my JATOS data is stored on, and exporting in batches. So, firstly, I found that by restarting my JATOS server I was able to view my data again in the results tab, so that was my first step. Secondly, if I tried to export too large a batch of participants, an error would occur and I would not be able to see my data until I restarted the server again.

    In conclusion, when using JATOS you should take care to download your data in small batches (I used 25 subjects at a time as this seemed to be my limit), and if there are issues then you should try a hard reset and see if a reboot sets everything back in order... just like all technologies, I suppose.

  • Hi dockasaurusrex,

    I haven't forgotten you. Yesterday was just a busy day. I still try to adapt to the new corona situation here.

    Although you seem to have figured out a workaround, I am still trying to figure out what went wrong on your JATOS.

    This warn message in your logs

    2020-03-18 13:29:14,928 [WARN] - o.h.e.l.i.CollectionLoadContext - HHH000160: On CollectionLoadContext#cleanup, localLoadingCollectionKeys contained [1] entries
    2020-03-18 13:29:14,927 [WARN] - o.h.e.l.i.LoadContexts - HHH000100: Fail-safe cleanup (collections) : org.hibernate.engine.loading.internal.CollectionLoadContext@253cb728

    seems to indicate that something crashed beforehand and the database now has to do some cleanup. I've never seen this before.

    Let me run some tests on my JATOS server. So, just to be sure, I have a couple of questions:

    • You are using v3.5.1, right?
    • Do you use an external MySQL?
    • What are the specs of your server, especially disk size and memory?
    • What is the average size of a single result data set?



  • Hi dockasaurusrex,

    I did some tests with my JATOS server and in short: I could reproduce your issue. Adding more memory solved it.

    So, in detail, I ran a load test:

    • JATOS running on Digital Ocean on 1 CPU, 1 GB memory, 25 GB disk
    • around 10,000 study runs
    • each stores 2 x 10kB of result data in JATOS -> altogether around 200MB

    The test itself ran flawlessly and not a single request failed. But then, clicking on Results in JATOS' GUI resulted in either an error message or the 'Please wait' box never disappearing. The logs contained tons of 'Couldn't get result' errors:

    2020-03-21 20:41:57,831 [WARN] - s.g.ResultService - Couldn't get result
    2020-03-21 20:41:57,831 [INFO] - o.h.e.i.DefaultLoadEventListener - HHH000327: Error performing load command : org.hibernate.exception.GenericJDBCException: could not extract ResultSet
    2020-03-21 20:41:57,831 [ERROR] - o.h.e.j.s.SqlExceptionHelper - The database has been closed [90098-193]
    2020-03-21 20:41:57,831 [WARN] - o.h.e.j.s.SqlExceptionHelper - SQL Error: 90098, SQLState: 90098

    Then I gave the machine on Digital Ocean more memory. I had 1 GB (the absolute minimum on DO), tried 4 GB first, and it worked again. Then I tried 2 GB and it worked too. So that means I was able to see the Results and could download them all at once (all 10,000, without doing it in batches).

    So one easy solution is to increase the memory of your server. You could even do it only for the duration of the experiment and run JATOS on a minimal setup the rest of the time.
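
    If you are unsure how much memory the machine actually has before deciding to resize, a quick check on the server could look like this (assumes a Linux server with /proc available):

```shell
# Total RAM as reported by the kernel (Linux only).
grep MemTotal /proc/meminfo

# Free disk space, e.g. on the partition holding the JATOS folder.
df -h /
```

    Comparing MemTotal against the size of your result data gives a rough idea of whether a memory-related crash is plausible.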



  • I ran into the same problem. Exporting 500 datasets was fine, but with the final 1200 datasets it got stuck when trying to export the Results from JATOS.

    As a workaround I copied all JATOS directories from the server to a local directory on my PC and was able to open and export them locally.

    Phew, for a minute I thought I lost my data there!

  • That is always a possibility - your laptop has more memory. JATOS' embedded database stores everything in the folder /database. So copying this folder (along with /study_assets_root and maybe /result_uploads if you have file uploads) to a different JATOS installation should do the job. Just remember to always stop JATOS before copying the /database folder, otherwise it can get corrupted.
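
    For reference, moving just the data folders into a second installation could be sketched like this. The mock folders below make the demo safe to run anywhere; in practice the source would be the (stopped) server installation, fetched with whatever transfer tool you prefer, e.g. scp or rsync.

```shell
# Mock source (server) and target (local) installations for the demo;
# real paths and the scp/rsync transfer step depend on your setup.
SRC="$(mktemp -d)/jatos_server"
DST="$(mktemp -d)/jatos_local"
mkdir -p "$SRC/database" "$SRC/study_assets_root" "$SRC/result_uploads" "$DST"

# Stop JATOS on the server first, then copy the data folders across.
cp -r "$SRC/database" "$SRC/study_assets_root" "$SRC/result_uploads" "$DST/"

ls "$DST"
```

    Starting the local JATOS afterwards should show the copied studies and results.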

  • Hey Kristian,

    Thanks for the help! I will definitely use your work around in the future. Your service is invaluable given the current crisis, keep up the good work!
