I might be dumb, but how many books would 64 GB mean?
I’d say roughly 1,000 to 100,000, depending on format.
Edit: Raw ASCII (7-bit) could give you up to ~half a million.
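For a rough sense of where those ranges come from, here’s a quick back-of-envelope sketch; the per-book sizes are my own loose assumptions, not hard numbers:

```python
# Rough book counts for 64 GB under a few assumed per-book sizes (sizes are my guesses).
capacity = 64e9  # 64 GB in bytes

scenarios = {
    "image-heavy PDF": 30e6,                          # ~30 MB each
    "typical EPUB/MOBI": 1e6,                         # ~1 MB each
    "plain 8-bit text, long novel": 500e3,            # ~500 KB of raw text
    "short novel packed as 7-bit ASCII": 150e3 * 7 / 8,  # ~131 KB on disk
}

for name, size in scenarios.items():
    print(f"{name}: ~{capacity / size:,.0f} books")   # ~2,100 / 64,000 / 128,000 / 488,000
```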
Edit 2: According to Randall Munroe (too lazy to find the source), you could theoretically store one letter per bit. That would give us up to two million books.
I don’t see how that is possible; I think it should be one letter per byte.
A bit only represents one state, 1 or 0, or true or false. That is too little information to store a letter.
Here ya go:
https://what-if.xkcd.com/34/
Based on the rates of correct guesses, and rigorous mathematical analysis, Shannon determined that the information content of typical written English was around 1.0 to 1.2 bits per letter.
That’s based on common entropy limits of written information. It’s also why I always Bachelor domestic extended doubtful as concerns at. Morning prudent removal an letters by. On could my in order never it. Or excited certain sixteen it to parties colonel. Depending conveying direction has led immediate. Law gate her well bed life feet seen rent. On nature or no except it sussex.
Of on affixed civilly moments promise explain fertile in. Assurance advantage belonging happiness departure so of. Now improving and one sincerity intention allowance commanded not. Oh an am frankness be necessary earnestly advantage estimable extensive. Five he wife gone ye. Mrs suffering sportsmen earnestly any. In am do giving to afford parish settle easily garret.
Smart compression!
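For anyone curious how close an everyday compressor gets to that roughly 1 bit per letter figure, here’s a minimal sketch using Python’s built-in lzma as a generic baseline; point it at any plain-text file you have handy:

```python
# Rough bits-per-character measurement for a plain-text file.
# For mostly-ASCII text, 1 byte is ~1 character, so bits/byte ~ bits/char.
import lzma
import sys

path = sys.argv[1]                      # any plain-text file you have lying around
raw = open(path, "rb").read()
packed = lzma.compress(raw, preset=9)   # strongest standard-library preset

bits_per_char = 8 * len(packed) / len(raw)
print(f"{len(raw)} bytes -> {len(packed)} bytes ({bits_per_char:.2f} bits/char)")
```

Generic compressors usually land around 2 bits per character on ordinary English; getting near Shannon’s figure takes a model of the language itself. At ~1 bit per letter, 64 GB is about 5×10^11 bits, which at a few hundred thousand letters per book is where the two-million-books estimate above comes from.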
That’s a bit; a letter or character is a byte (8 bits). This is about right for pure text files that have no overhead; any extra info (like font, size, type, anything except which character…) is extra bytes, of course.
If we’re only talking 26 letters, no caps, we can cut that down to 5 bits per letter. Then use a decent compression algorithm. Someone more bored than I am can do the math.
five bits would only leave us with six punctuation marks (including spaces, and we don’t get any numerals either) though, do you think that’s enough? i certainly don’t; i have not even used a full stop and I have already exceeded it!
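Doing that math: a minimal sketch of the 5-bit packing idea, with my own arbitrary pick of which six punctuation marks (space included) make the cut:

```python
# Toy 5-bit text packer: 26 lowercase letters + space + 5 punctuation marks = 32 symbols.
ALPHABET = "abcdefghijklmnopqrstuvwxyz .,'?-"   # exactly 32 characters, my arbitrary pick

def pack(text: str) -> bytes:
    bits = 0
    nbits = 0
    out = bytearray()
    for ch in text:
        bits = (bits << 5) | ALPHABET.index(ch)   # 5 bits per symbol
        nbits += 5
        while nbits >= 8:
            nbits -= 8
            out.append((bits >> nbits) & 0xFF)    # emit full bytes as they fill up
    if nbits:
        out.append((bits << (8 - nbits)) & 0xFF)  # pad the final byte with zeros
    return bytes(out)

msg = "five bits per letter, no caps and no numerals"
packed = pack(msg)
print(len(msg), "chars ->", len(packed), "bytes")  # 45 chars -> 29 bytes, roughly 5/8 the size
```

That only buys the fixed 8-to-5 saving of about 37 percent; a decent compressor on the raw text would beat it, since it exploits redundancy rather than just a smaller alphabet.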
One letter per bit? You’d need some crazy effective compression algorithm for that, because a bit is 1 or 0. Did you mean byte?
UTF-8 and ASCII are normally already 1 character per byte. With great file compression, you could probably reach 2 characters per byte, or one every 4 bits. One character every bit is probably impossible. Maybe with some sort of AI file compression, using an AI’s knowledge of the English language to predict the message.
Edit: Wow, apparently that already exists, and it can achieve an even higher compression ratio, almost 10:1! (with 1 GB of UTF-8 (8-bit) text from Wikipedia) bellard.org/nncp/
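The trick behind NNCP and similar model-based compressors is that the model assigns a probability to each upcoming character and an entropy coder spends roughly -log2(p) bits on it; better predictions mean fewer bits. Here’s a toy sketch of that idea with a character bigram model standing in for the neural net (sample.txt is a placeholder for any English text file, and since this scores the same text it counts from, treat the number as optimistic for such a simple model):

```python
# Toy illustration of model-based compression (not NNCP itself): a model predicts
# the next character, and an entropy coder would spend about -log2(p) bits on it.
import math
from collections import Counter, defaultdict

text = open("sample.txt", encoding="utf-8").read().lower()  # placeholder: any English text

counts = defaultdict(Counter)
for prev, cur in zip(text, text[1:]):
    counts[prev][cur] += 1                    # count which characters follow which

totals = {prev: sum(c.values()) for prev, c in counts.items()}

total_bits = 0.0
for prev, cur in zip(text, text[1:]):
    p = counts[prev][cur] / totals[prev]      # model's probability for what actually occurred
    total_bits += -math.log2(p)

# Typically around 3 bits/char for a bigram model; neural models push this toward ~1.
print(f"{total_bits / (len(text) - 1):.2f} bits per character")
```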
If an average book has 70k five-character words, this could compress it to around 303 kb, meaning you could fit 1.6 million books in 64 GB.
You can get a 2 TB SSD for around $70. With this compression scheme you could fit 52 million books on it.
I’m not sure if I’ve interpreted the speed data right, but it looks like it would take around a minute to decode each book on a 3090. It would take about a year to encode all of the books on the 2 TB SSD if you used 50 A100s (~$9,000 each). You could also use 100 3090s to achieve around the same speed (~$1,000 each).
52 million books is around the number of books written in the past 20 years, worldwide. All stored for $70 (+$100k of graphics cards).
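For anyone who wants to sanity-check those numbers, here’s the back-of-envelope version; reading “303 kb” as kilobits (roughly 38 KB per compressed book) and the 9.2:1 ratio are my assumptions, not the poster’s:

```python
# Back-of-envelope check of the figures above; the ratio and the kilobits reading are assumptions.
book_chars = 70_000 * 5                   # 350,000 characters of raw text per book
ratio = 9.2                               # "almost 10:1", roughly NNCP's result on Wikipedia text
book_bytes = book_chars / ratio           # ~38,000 bytes (~303 kilobits) per compressed book

print(f"{64e9 / book_bytes / 1e6:.1f} million books in 64 GB")       # ~1.7
print(f"{2e12 / book_bytes / 1e6:.1f} million books on a 2 TB SSD")  # ~52.6

# 52 million books encoded over one year on 50 GPUs works out to about
# half a GPU-minute per book, consistent with the decode-in-a-minute estimate.
minutes_per_year = 365 * 24 * 60
print(f"{50 * minutes_per_year / 52e6:.2f} GPU-minutes per book")    # ~0.51
```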
There’s something comical about the low, low price of $70 (+$100k of graphics cards) still leaving out the year it will take.
Well I guess you could sacrifice a portion for an index system and just decode the one you’re trying to read
Not dumb to ask
I have a couple of old pirated e-textbooks as .pdf files on my PC from uni, several hundred pages with color images, and they are mostly under 50 MB, averaging about 30 MB. 1 GB is a little over a thousand MB (1024), so one would hold maybe 30-odd each. Times 64, that’s a hell of a lot: a couple thousand total, at least, as size varies.
PDF is super overkill for ebooks. Mobi or epub are usually <5 MB per book (usually around 1 MB).
No, actually I agree with you; plus, PDF is a privacy and security nightmare, allowing for arbitrary JavaScript execution by default if you don’t limit its permissions. But unfortunately it was all I could find of that ISBN, so it was either shut up and use them or pay full price for my textbooks, so, y’know, I sort of gritted my teeth and went with it.
More than have been banned, I think.
A shitload. 64,000 if it were simple text-only stuff at 1 MB per book, 640 if it were 100 MB chonkers full of images.
Yeah, I read mostly sci-fi books, so around 300-400 pages, all text, and I’d say the average e-book for them is like 150-200 KB, so if it were books like that you’d be looking at stuffing like 300,000 books on there.