Topics

Web Folders / WebDAV #export #webdav

messydesk
 

Our wiki is quite large (10s of GB), and I gave up on the export.  I created a test wiki with only a megabyte of content and tried the various export flavors; none were directly importable into a Mediawiki-based wiki, so I gave up on that, too.  Instead, I used the AnyClient application running on Windows to download all the pages via the WebDAV address, then wrote a Python script to find all the internal images referenced in the downloaded pages and download those as well.  It took a few hours total for 16 GB.  The discussion board didn't come along with it, however.  I guess the lesson is that if you have a large wiki, don't wait for an export job.
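A sketch of that approach (not the actual script; the `/file/view/` link pattern, folder names, and regex are my assumptions):

```python
import os
import re
import urllib.request
from urllib.parse import urljoin

# Assumed Wikispaces internal file link, e.g. src="/file/view/photo.png?h=120"
IMG_RE = re.compile(r'/file/view/([^"\'?#\s]+)')

def find_image_refs(page_text):
    """Return the internal file names referenced in one downloaded page."""
    return IMG_RE.findall(page_text)

def fetch_images(pages_dir, out_dir, base_url):
    """Scan every downloaded page and fetch each referenced image once."""
    os.makedirs(out_dir, exist_ok=True)
    seen = set()
    for root, _dirs, files in os.walk(pages_dir):
        for name in files:
            with open(os.path.join(root, name),
                      encoding="utf-8", errors="ignore") as f:
                refs = find_image_refs(f.read())
            for fname in refs:
                if fname in seen:
                    continue
                seen.add(fname)
                url = urljoin(base_url, "file/view/" + fname)
                dest = os.path.join(out_dir, os.path.basename(fname))
                try:
                    urllib.request.urlretrieve(url, dest)
                except OSError as exc:
                    print("failed:", url, exc)

# Usage (hypothetical site URL):
#   fetch_images("pages", "images", "https://yourwiki.wikispaces.com/")
```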

Tom H
 

  1. Awesome! What WS Plan are you on that supports 10s of GB?
  2. I've tried the CyberDuck WebDAV client without success. Curious to see what one gets. I didn't install AnyClient after it said it couldn't find Java.
  3. Did you do the image fetch script because you want to leave many files behind? I thought you could d/l them with the WebDAV client.
  4. You can get Discussions via the API or with the Exports (except the PDF export). Pose your problem to WS Support. Maybe they would make a Discussions only d/l option on the Export page.
  5. If my 56MB Export took 2 minutes to create the ZIP file, linear extrapolation would suggest 16GB could take 10 hours plus download time.
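The arithmetic behind that estimate in point 5, as a quick sanity check:

```python
# Linear extrapolation from the observed export rate: 56 MB took 2 minutes
observed_mb, observed_min = 56, 2
wiki_mb = 16 * 1024  # 16 GB

est_min = wiki_mb / observed_mb * observed_min
est_hours = est_min / 60  # roughly 9.75 hours, before download time
```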

Tom
Looking to move SQLite Tools For RootsMagic from Wikispaces

messydesk
 

1. We're over the limit on the super plan. WS has been nice enough to look the other way, since we've been paying for a long time.
2. Cyberduck didn't work for me, either. Anyclient was the only thing that worked. 
3. I did the image download script because many, if not most, images were referenced from the discussion board, and if there wasn't a good way to get the discussion board exported, the images wouldn't do much good. 
4. I started an export of just the discussion board a few days ago. We'll see if it ever finishes. Then I'll go through the HTML and download the images I need. Paths to the images will have to change in the HTML, of course.
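That path rewrite could be a single regex substitution per page. A sketch, assuming the images land in a local `images/` folder and the exported HTML uses `/file/view/` links (both assumptions):

```python
import re

# Assumed Wikispaces link form: src="/file/view/pic.png?h=240&w=320"
OLD = re.compile(r'src="/file/view/([^"?]+)[^"]*"')

def rewrite_image_paths(html_text, new_prefix="images/"):
    """Repoint img src attributes from the wiki's /file/view/ URLs
    to locally downloaded copies, dropping any query string."""
    return OLD.sub(lambda m: 'src="%s%s"' % (new_prefix, m.group(1)), html_text)
```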

Tom H
 

On Mon, Jul 30, 2018 at 09:21 PM, messydesk wrote:
Cyberduck didn't work for me, either. Anyclient was the only thing that worked. 
I went to WS Support who responded 3 days later with the advice:
We would recommend using CyberDuck version 6.1.0.
I had installed 6.6.2, the current version, so I uninstalled it and installed 6.1.0 from https://cyberduck.io/changelog/ . It's over a year old.

Success! I opened the connection using the option "WebDAV (HTTPS)" for secure login to my Wikispaces site.

 
Now to explore...

Tom
Looking to move SQLite Tools For RootsMagic from Wikispaces

Tom H
 

On Thu, Jul 26, 2018 at 07:14 PM, messydesk wrote:
I used the AnyClient application running on Windows to download all the pages with the Webdav address
Did you download the "pages" folder or the "pages_html" folder? On my site, CyberDuck can download the files in the former but sees the files in the latter as being 0 bytes and fails to d/l. Likewise pages in "history" download but not those in "history_html". 
 

Tom
Looking to move SQLite Tools For RootsMagic from Wikispaces

messydesk
 

I downloaded the pages directory, because I wanted the raw data that I convert to Mediawiki format with more scripts I wrote.  The pages_html folder contains 0-byte files, just like you're seeing.
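The conversion scripts themselves would be site-specific, but the general shape is a table of regex rules applied to each raw page. The two rules below (Wikispaces-style `**bold**` and `//italic//` to MediaWiki quoting) are illustrative assumptions, not the actual scripts:

```python
import re

# Illustrative rules only -- a real Wikispaces-to-MediaWiki conversion
# needs many more (headings, links, tables, code blocks, ...)
RULES = [
    (re.compile(r'\*\*(.+?)\*\*'), r"'''\1'''"),  # **bold**   -> '''bold'''
    (re.compile(r'//(.+?)//'),     r"''\1''"),    # //italic// -> ''italic''
]

def convert_page(text):
    """Apply each rewrite rule in order to one raw page's markup."""
    for pattern, repl in RULES:
        text = pattern.sub(repl, text)
    return text
```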

Tom H
 

On Sun, Aug 5, 2018 at 07:57 PM, Tom H wrote:
We would recommend using CyberDuck version 6.1.0.
I had installed 6.6.2, the current version so I uninstalled it and proceeded with 6.1 from https://cyberduck.io/changelog/ 
CyberDuck 6.3.5 is the last version that worked with Wikispaces WebDAV. I don't know if it has any advantage over 6.1 for this purpose.
 
--

Tom
Looking to move SQLite Tools For RootsMagic from Wikispaces