
Conversation

@spacerainbow000

I made a short, ugly bash script that creates a new HTML file from the contents of export_saved.html. It parses out the links to saved posts, fetches each post's data in JSON format, writes the data to a local directory, then adds an href to the JSON data right after the href to the actual reddit URL. This probably isn't the best way to do it, since it should really happen inside the Python script itself (not to mention that the backed-up data could go inside the main HTML file instead of a separate directory), but it does the job. A trivial change that might make the backup more comprehensive would be to swap the cURL request for the JSON data for a recursive wget with a depth of 1: that ought to pull in any images or other files the post contains, but it might also blow up the backup size. It would be pretty easy to do, though.
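For the record, the approach described above could be sketched roughly like this. This is not the actual script from the PR; the filenames (export_saved.html, saved_json/, export_saved_backup.html), the inline fixture, and the DRY_RUN switch are all assumptions added so the sketch runs without network access:

```shell
#!/usr/bin/env bash
set -euo pipefail

# DRY_RUN=1 writes placeholder JSON instead of fetching; set DRY_RUN=0 to
# actually hit reddit. Both the variable and the fixture below are my additions.
DRY_RUN="${DRY_RUN:-1}"
OUTDIR="saved_json"
OUTPUT="export_saved_backup.html"

# Stand-in for a real export_saved.html.
cat > export_saved.html <<'EOF'
<ul>
<li><a href="https://www.reddit.com/r/bash/comments/abc123/example_post/">example post</a></li>
</ul>
EOF

mkdir -p "$OUTDIR"
cp export_saved.html "$OUTPUT"

# Parse out the saved-post links.
grep -oE 'https://www\.reddit\.com/r/[^"]+' export_saved.html | while read -r url; do
  id=$(printf '%s' "$url" | md5sum | cut -c1-12)   # stable local filename
  json="$OUTDIR/$id.json"
  if [ "$DRY_RUN" = "0" ]; then
    # Reddit serves a JSON view of a post at <url>.json; a recursive
    # "wget -r -l 1" here would also grab images, at the cost of size.
    curl -s "${url%/}.json" -o "$json"
  else
    echo '{}' > "$json"                            # offline placeholder
  fi
  # Add an href to the local JSON copy after the original reddit anchor.
  # (Assumes the URL contains no regex/sed metacharacters beyond dots.)
  sed -i -E "s|(href=\"$url\">[^<]*</a>)|\1 <a href=\"$json\">[json]</a>|" "$OUTPUT"
done
```

After a run, export_saved_backup.html carries a second `[json]` link per post pointing into saved_json/.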

@codecov-io

Codecov Report

Merging #52 into master will not change coverage.
The diff coverage is n/a.

Impacted file tree graph

@@          Coverage Diff           @@
##           master     #52   +/-   ##
======================================
  Coverage    94.9%   94.9%           
======================================
  Files           1       1           
  Lines         157     157           
======================================
  Hits          149     149           
  Misses          8       8

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 25eecdd...461da5c. Read the comment docs.
