Posts tagged ‘technology’

2011-02-13

Installing LaTeX packages in your home directory

I was recently in a position where I wanted to use a LaTeX package, but it was not installed on the computer system I was working on. The computer system was at my workplace, at Umeå University. I had a conversation with the systems administrator and he told me that updating the entire TeX distribution would require a lot of work, but that I could place the package in the current directory, with my document, and it would hopefully compile.

I hoped so too. But of course, this being the real world, such luck is rare with computers. So I decided to dive into how I might install a LaTeX package locally, in my home folder. I finally managed to find a solution, and I will detail this below.

This “how-to” will not cover how to create a package, or even how to download ready-made packages or whatever. I had the package that I required installed on my computer at home, so I just copied it over to the work computer. This tutorial will only cover how to make LaTeX look for packages somewhere other than the default package path.

LaTeX automatically looks to see if there are any package installations in $HOME/texmf/. That search path is the default value of the environment variable TEXMFHOME. This is very nice and handy; what this means is we can use “$HOME/texmf” as a folder to chuck packages into. However, I prefer to have files that I never really intend to open in hidden folders (folders whose name starts with “.” (dot)). So the next step I took was to create a folder where I can put packages, in my home directory (“$HOME”).

> mkdir $HOME/.texmf
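
As far as I understand it, the TeX tools expect the contents of this folder to follow the standard TeX directory structure, so a LaTeX package normally goes under tex/latex/<package name>/ inside it rather than at the top level. As a rough sketch, assuming the package you copied over is a single .sty file called example.sty (a placeholder name), the layout would be created like this:

> mkdir -p $HOME/.texmf/tex/latex/example
> cp example.sty $HOME/.texmf/tex/latex/example/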

After that, I went and added a command to my .bashrc file that will change the value of the TEXMFHOME variable to be that of the newly created folder. This way, LaTeX will search for packages in that folder instead. The following lines were added to the .bashrc file.

# LaTeX stuff
export TEXMFHOME=$HOME/.texmf

I usually use pdflatex to compile my LaTeX documents, so if you use some other program, the environment variable you need to change might be different. I compile my LaTeX documents on the command line, so if you use another system, you might have to find another way to export your environment variable, or even use the specific application settings in your editor, if your editor supports compiling LaTeX documents.
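
If you want to double-check that the new value is actually being picked up (assuming a TeX Live installation, which ships the kpsewhich tool), you can open a new terminal and ask the path-searching machinery directly; example.sty is again just a placeholder name:

> kpsewhich --var-value=TEXMFHOME    # should print the path to $HOME/.texmf
> kpsewhich example.sty              # should print the full path to the package file if it can be found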

2010-01-05

The Naming of the Bovid

Michael Lustfield is crazy. He also wrote a post on his web log in which he, long story short, pretty much claims the GNU project is being a bunch of whiny babies about the whole GNU/Linux / Linux naming convention discussion that’s been going on for a great while now.

I generally share the same opinion as Michael. GNU, in their own documents (as reviewed by Michael), keep saying

  • “We started this thing. We were first. Give us credit.”
  • “GNU is an operating system, just lacking a kernel.”
  • “Most people use a GNU/Linux system which is essentially a GNU system with Linux tacked on.”
  • “We deserve credit because the operating system you use is GNU. (…) Using the Linux kernel.”

I interpret Michael’s opinion to be that “GNU IS NOT an operating system” because they never finished their own kernel, while GNU feels that an operating system is a complete, general system that enables you to do your work with your computer.

It is my belief that it all boils down to one’s definition of “operating system”. I acknowledge several definitions, but one is much simpler than the others: it is much easier to just define the operating system as “the kernel”, whereas if you define it the way GNU does, you are faced with the difficult problem of where to draw the line. Should a word processor count as essential software in the operating system? A media player? GAMES? APPARENTLY so, according to GNU (again, see Michael’s post and/or GNU’s documents).

This is a problem, so I do actually prefer the OS to be just “the kernel”. This is why I don’t feel bad saying that “I use Linux”, and not that “I use GNU slash Linux”, the latter of which would just sound ridiculous in all honesty.

But even so, if you take “operating system” to mean what GNU feels it means, you most definitely have to include the kernel, and GNU’s software and the Linux kernel are both huge contributions to such an operating system. This is why it would be appropriate to call it GNU/Linux, but by the same reasoning, GNU still really isn’t an operating system without Linux, so no, Michael, GNU indeed IS NOT an operating system.

Regarding the quote “with a kernel tacked on” (which might be paraphrased): it makes it sound as if the kernel is just a regular piece of software running inside the OS. It is much more than that; it is what provides all the system resources to every other piece of software in the “operating system”. All this is pretty confusing, but in conclusion, GNU isn’t an operating system by my definition, which is that the operating system is the kernel:

Schematic View of Operating System

Interestingly, a Wikipedia article defines the kernel to be, in fact, a part of the operating system, but I don’t think the definition in general is set in stone.

Nonetheless, while the GNU folks do come off as whiny babies about this, there is truth in the idea that it is important to recognize GNU for the work they have done and are doing, which is a great, great thing. But fussing over the naming of operating systems is definitely not the way to go about it. It clearly just causes frustration in the community, and it is dividing it. Luckily, I don’t think this fussing is causing a division deep enough to do any real damage, thanks to the shared philosophies, but it is better to be fine with calling it just “Linux”, get on with your life, and find another way to promote Free Software.

2009-08-03

The 64-bit Difference

I was just reading about the limitations of the WAV audio format.

The WAV format is limited to files that are less than 4 GB in size, because of its use of a 32-bit unsigned integer to record the file size header (some programs limit the file size to 2–4 GB). Although this is equivalent to about 6.8 hours of CD-quality audio (44.1 KHz, 16-bit stereo), it is sometimes necessary to exceed this limit, especially when greater sampling rates or bit resolutions are required. […] Its 64-bit header allows for much longer recording times.

I got to thinking about computer memory and the difference in capacity between N-bit systems. A computer uses an address to access different parts of the memory. The address is a number (internally, ones and zeros), which for a 32-bit system (where addresses always have a length of 32 bits) would look something like “af34c97b” when written with a radix of 16. A 32-bit system uses these addresses to look up places in the memory. Each address stands for a certain byte in the memory, so if we only have 32-bit addresses, we cannot look beyond the address consisting of 32 ones in a row, since that is the maximum value 32 bits can hold.

Think of mailing a letter where you can only use two digits for the house or apartment number: you would be able to send it to (0)1-99 Blah Blah St., but not to the guy living at the end of the street at no. 100. Memory addresses work in the same way.

Let’s do some math now. Say your system works with 32-bit memory addresses. That means the largest value we could have (the farthest down the street we would be able to send the letter) would be 1111 1111 1111 1111 1111 1111 1111 1111, or FFFF FFFF in hexadecimal. Let’s write this figure out in a unit we’re more familiar with, such as gibibytes (GiB), or as they are more commonly (if incorrectly) known: gigabytes (GB). 1 GiB = 1024^3 bytes; 1 GB = 1000^3 bytes.

With addresses running from 0 to FFFF FFFF, we can address 2^32 bytes, which in GiB is 2^32 / 1024^3 = 2^32 / (2^10)^3 = 2^32 / 2^30 = 2^(32-30) = 2^2 = 4 GiB.

You might already have heard that 32-bit systems can only handle 4 GiB of memory, and now you hopefully know why, if you didn’t already. Now then, what happens if we double the address size and make it a 64-bit system?

With 64-bit addresses, running from 0 to FFFF FFFF FFFF FFFF, we can address 2^64 bytes, which in GiB is 2^64 / 2^30 = 2^34 = 17179869184 GiB, or 16 exbibytes (EiB). A MASSIVE amount of memory. As you can see, by doubling the address size we do not get a doubling of the memory space, but rather the number of bytes in 4 GiB raised to the power of 2. 4 GiB = 4294967296 bytes, and 16 EiB = 18446744073709551616 bytes. These numbers are obviously incomprehensible, so I thought it would be easier to demonstrate them with an example, based on the Wikipedia passage quoted at the top of this post.
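
If you would rather not trust my arithmetic, these figures are easy to reproduce on the command line with the standard bc calculator (assuming it is installed, which it is on most Unix-like systems):

> echo "2^32 / 2^30" | bc    # 32-bit address space in GiB: 4
> echo "2^64 / 2^30" | bc    # 64-bit address space in GiB: 17179869184
> echo "2^64" | bc           # 16 EiB expressed in bytes: 18446744073709551616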

As the quote says, a 4 GB (actually GiB) WAV file (with a 32-bit file size field in its header) gives us 6.8 hours of music at a sampling rate of 44.1 kHz, a bit depth of 16 bits and 2 channels (stereo).

If we assume the file size is proportional to the playing time as long as the quality settings stay the same, then we can calculate the playing time of a WAV file with a 64-bit file size field:

17179869184 [GiB] * (6.8 [hours] / 4 [GiB]) [hour-to-filesize ratio] = 29205777612.8 hours of music.

This number is still incomprehensible, so let’s walk up the ladder of time units, shall we? Note that when calculating the number of years, we will use a year length of 365.2425 days, which is the average number of days per year in the Gregorian calendar, whose 400-year cycle contains 146 097 days: 146 097 / 400 = 365.2425 days. This takes leap years into account. One could also use a day length of 24 hours and 58.3594503 seconds, but that doesn’t feel as nice, somehow.

29205777612.8 hours
= 1216907400.533333333 days
= 173843914.361904762 weeks
= 39981351.585316605 months (average of 30.436875 days/month in one 365.2425-day year)
= 3331779.298776384 years
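
The playing-time figures can be reproduced the same way with bc, setting a scale to get the decimals:

> echo "scale=1; (2^64 / 2^30) * 6.8 / 4" | bc        # hours of music: 29205777612.8
> echo "scale=6; 29205777612.8 / 24 / 365.2425" | bc  # years of music: roughly 3331779.3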

So we see that just by doubling the address space, we go from 6.8 hours of music — which I guess you could plough through on a really dull and long bus ride — to more than 3.33 million years of music.

That, my friend, is the 64-bit difference.

… hmm? What was that about 128 bits? Shut up. 😦

No, but really: to fill a 128-bit hard drive would take more energy than it would to boil the oceans of the earth. Theoretical breakdown. Enjoy.

2009-07-18

Open-Source Calculus

I’m (re(re))taking the second course on calculus during the summer at the university. It’s going much better this time around, which is a good thing.

I did have problems with one problem (har): I couldn’t make sense of it in my messy notebook, and I didn’t feel like doing the entire problem over again. The only thing wrong with my solution was that the answer in the back of the book said term1 – term2, while I kept getting term2 – term1 in my notebook, so I wasn’t too far off but still couldn’t get it right; I couldn’t find the erring minus sign. All I really felt like doing was some programming, which I enjoy, but what would I code if I had no software needs?

Then I thought: I need a more structured view of this problem. Why not write it up in LaTeX and turn it into a nice, good-enough-to-print PDF solution? That way I would get some practice writing LaTeX documents (it’s been a few months, sadly), and writing LaTeX is pretty much programming in a way, so I would get to practice LaTeX, scratch my coding itch, and maybe find out where that offending minus sign went.

Sure enough, it worked pretty well.

Solution to problem 6.2.9 in Calculus, A Complete Course (Sixth Edition).

I want to point out that this PDF was produced entirely using Free and Open-Source Software (FOSS):

  • TeX Live LaTeX distribution (TeX-to-PDF compiler: pdflatex/pdftex)
  • GNU Emacs as the LaTeX editor
  • Ubuntu Linux to run Emacs and the TeX compiler
  • Totem to play music while writing up the solution
  • Grip + oggenc to rip CDs to Ogg Vorbis
  • the Ogg container format with the Vorbis audio codec
  • libVorbis 1.2.0 used by oggenc (darn you, Ubuntu, for not updating libVorbis since 2007)
  • and so forth.

It works, folks!

2009-05-15

The Future of visual and auditory media

Some terminology

High-Definition Television, or HDTV, is a broadcasting system with higher definition than standard-definition television, or SDTV.

What used to be

As I understand it, North America uses the NTSC standard, with 525 scan lines (lines of pixels) and a frame rate of 29.97 frames per second. Like PAL below, NTSC is an interlaced format; a progressive format, by contrast, is one where the entire raster, or image, is updated in one scan (progressively?).

Here in Europe, however, the PAL standard is more common, with 625 lines and an interlaced signal, which means that half of the raster (every other line, starting from the top) is updated in one pass and the remaining lines in a second pass, at a rate of 50 Hz, or 50 passes per second, resulting in a full raster update 25 times per second and ultimately a frame rate of 25 FPS.

HDTV

A couple of new standard formats are here, the most common of which are called 720p and 1080p, standing for 720 and 1080 lines of vertical resolution respectively. The standard aspect ratio is 16:9, so the digital resolutions are 1280×720 for 720p and 1920×1080 for 1080p. As a side note, 720p suits my current computer screen nicely, since its 1280×1024 resolution matches the pixel width of 720p exactly. Most new computer screens nowadays are 1920 pixels wide, though.
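
These pixel widths follow directly from the 16:9 aspect ratio and the line counts; a quick sanity check with bc on the command line:

> echo "720 * 16 / 9" | bc     # 1280
> echo "1080 * 16 / 9" | bc    # 1920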

Typically, 1080p is referred to as Full HD, and you will most likely see more “Full HD” stickers than “1080p” ones on screens that support it, probably because people don’t know what 720p and 1080p mean. Everybody hates numbers, right?! Screens that support 720p and other, less standard, resolutions are referred to as “HD-Ready” (personally, I think this is ridiculous), which is pretty much just sugar coating to mask the real message, which is “Less than 1080p.” Or worse yet, “Less than Full.”

Always make sure you find out the real resolution of a television or computer screen before you make the purchase, and do weigh in the factor of pixel density. If you go and buy a 40-inch screen, but it only has about 20 by 10 pixels, the picture is obviously going to look like crap, because the pixels will be more like regular light bulbs than little dots. So if you go for a really large television, you would probably prefer a 1080p television, whereas a smaller one would do fine with 720p (and indeed the smaller ones are mostly 720p).

The Future

I stumbled upon a Wikipedia.org article called Super Hi-Vision, and was blown away. Apparently it is an experimental format and proposed standard of High-Definition audiovisual media with a video resolution of not double, not quadruple (as in 2160p, referred to as Quad High Definition), not eight times, but sixteen times the resolution of Full HD: 4320p, with a resolution of 7680×4320 — four times the height and four times the width of 1080p.

Ultra High-Definition Video

A nice comparison, stolen from the Wikipedia article, is shown above (click for true resolution). Notice the light-blue surface representing HDTV/Full HD/1080p, and compare that to Super Hi-Vision in darker blue. Astonishing.

Some Numbers

My screen resolution is 1280×1024 on a 17-inch screen. This translates to about 96 Dots-Per-Inch (DPI).

If you took a Super Hi-Vision resolution of 7680×4320 at a DPI of 96, as on my screen, you would have to make the screen almost 92 inches across the diagonal, or over seven and a half (7.65) feet. Or two and a third (2.33) meters, I should say.

My screen resolution makes for 1280×1024 = 1 310 720 pixels, or a little over 1.3 megapixels. 1080p makes for 1920×1080 = 2 073 600 pixels, or about 2 megapixels. 4320p gives you 33 megapixels (33 177 600).
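
For anyone who wants to reproduce these numbers, the DPI and diagonal figures come from the diagonal pixel count (Pythagoras on the width and height), and bc has a sqrt function for that:

> echo "scale=1; sqrt(1280^2 + 1024^2) / 17" | bc    # my screen: about 96 DPI
> echo "scale=1; sqrt(7680^2 + 4320^2) / 96" | bc    # Super Hi-Vision at 96 DPI: almost 92 inches diagonal
> echo "7680 * 4320" | bc                            # 4320p pixel count: 33177600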

Enough said, I think, about Super Hi-Vision. Let’s talk about the sound system being developed to go with this super-high-resolution video: the 22.2 surround sound system. It has 24 speakers in total, 2 of which are subwoofers, one left and one right. Nine speakers sit on a top layer above the heads of the listeners, ten more on a middle layer at ear level, and three on a bottom layer, along with the left and right subwoofers. I think I need say no more.

I guess the next obvious step in video would be a resolution with the same number of pixels as there are light receptors in the average human retina, and in sound it would be turning the walls, ceiling and floor of the room into one single speaker, with some type of dynamically targeting membrane, including the surface behind the huge screen.

Enjoy the future!

2009-03-23

Bloat Warning

On my system, iTunes and QuickTime take up a combined space of over 180 MiB. That. Is. Crazy.

For a piece of software that I use for nothing other than transferring my songs from one place to another, 180 MiB is an astronomical amount of space. iTunes takes up 106 MiB; the rest is attributed to QuickTime. I don’t even use QuickTime! But iTunes refuses to start without it! I use other free and open-source software than iTunes to play media:

On Ubuntu/Linux:

  • Rhythmbox
  • MPlayer
  • Totem

On Windows XP:

  • Winamp
  • VLC
  • XBMC

That covers all of my media-playing needs. Combined, these players, on each system respectively, play just about every format in existence. And MPlayer, VLC and XBMC are available for multiple platforms; it’s just that, in my experience, they run more smoothly on the respective platforms listed above.

Back to the matter at hand: as I mentioned, I use iTunes for one reason and one reason only, to transfer songs to my iPod. That task does include fetching album cover art, but I count those two as one task. I don’t play any media in iTunes; no music, no podcasts, no videos or movies, nothing! If those are my needs, why shouldn’t I be able to uninstall the unnecessary modules of the software and save some space? I mean, I can understand iTunes: it has quite a lot of features, and I assume that the drivers for every single product iTunes supports come with the installation, but it doesn’t have to be that way!

Alternative #1

Imagine that you just bought a device from Apple. Let’s say it’s a new iPhone.

As soon as you plug in the iPhone for the first time after installing iTunes, iTunes recognizes that you have a new device and offers to download the necessary drivers, or whatever it needs, to be able to work with your iPhone, perhaps installing them as plugins, or integrating them with the software in some other way. BRILLIANT. Any other features I would like to use in iTunes could then be enabled as I go.

Alternative #2

You just bought a new iPod Nano 16 GB (like I have), for example, and you go to the download page for iTunes to download and install the necessary software.

Upon installing iTunes, you are given, oh, I don’t know, OPTIONS REGARDING WHAT YOU ACTUALLY WANT TO INSTALL, as so many other installers do for other software. Among those options there might be something like “I would like to be able to play music and video”, perhaps with music and video separated into two options, and selecting them would install QuickTime as well (although its ~75 MiB is inexplicable to me).

Alternative #3

Release the module that transfers content between hard disk drive and device as a stand-alone application! I understand that you need some type of technology to recognize which types of media can be transferred and played on the actual device, but that doesn’t imply that you also need the technology to play the media before you transfer it to where you actually want to play it.

That was a little messy, so let’s illustrate with a helpful and suitable analogy: you don’t need to take a bite out of an apple to realize it’s an apple! Otherwise, I assume, people would look at you strangely whenever you went to the grocery store, sampling all the apples before picking them up.

At the very least, iTunes should include some install-time options. I’m not really sure, but as far as I can remember, I couldn’t even select where I wanted to install it.

2009-01-28

It feels good to be a winner

Yesterday was this year’s Uniaden, a fair held in and closely around the Universum building at Umeå University. The fair has booth spots for many companies that come and talk to the students attending, who in turn get to find out where they might end up working one day. It’s a mutual opportunity: the companies can make themselves known and attract talented people to work for them, and the students (well, at least some) can make connections with the job market and maybe even get job offers.

Just prior to this year’s fair, Ardendo, which I think is a sub-branch of Vizrt, sent out an e-mail to most of the computing engineers at the university with a challenge and a chance to win a 320 GB hard disk drive.

The entire challenge was contained in a single source code file written in C++. It was a program that would output an image file, if all the bugs in the program were sorted out. I’m not extremely used to C, and I’ve never used C++, but this code was fairly similar to C, so I wasn’t too lost. I managed to at least fix all bugs that I could see, and I got an image file which had an inscribed time stamp and symbol. All participants were to write down this secret time stamp and symbol along with their name and e-mail on a piece of paper and hand it to the folks at the Ardendo booth.

On the day of the fair, I went there, handed in my contribution and had a good chat with the folks there. For the record, Ardendo seems to be a nice place of work. They do some interesting things (create software solutions for professional television broadcasting stations), and they seemed to think that I’d be a good candidate to work there once I’ve finished my education and given my interests and current knowledge.

When the time came to draw the winner of the competition, they took all the pieces of paper, which differed in size, shape and color because everybody had written their entry on their own piece of paper, and indexed them with numbers. The numbers were then copied onto a second set of identical pieces of paper, and the winner was drawn from that set. About 6 or 7 people entered the competition.

It turns out I won!

I couldn’t believe it. And it turns out the drive was 400 GB, and one of those smaller, sleek ones that draw so little power they can run off the USB cable alone, which is brilliant. It was a happy day.

I should mention that I didn’t fix the bugs the way it was probably intended, because if I had, I would apparently have seen yet another hidden “easter egg” message from the creator of the challenge. If I had solved it correctly, I would also have had a chance in an extra lottery where the prize was a beige cardigan with, I think, the Ardendo logo on it.

I’m glad I won the first lottery.

2009-01-08

Zeitgeist – Critical Knowledge for the Masses

I thought I would set up an account on WordPress.com. It feels much nicer here than at blogger.com; less like “hey, let’s keep it extremely simple” and more like “let’s use standards and throw tons of features at our users, but order them neatly so it’s not too cluttered, and the users will have to explore them by themselves.” Kind of. Also, WordPress is open source, I believe (wordpress.org).

Anyway as my first post, I’d like to just copy over my latest post from blogger because I feel like it’s an extremely important post. Here it is:

I recently watched Zeitgeist: Addendum. It is a film that is as astonishing as it is enlightening, frustrating, and appalling – a film that makes you believe there is hope for the future of humanity, but at the same time makes you feel there is just something inherent about the human race and the leaders of the world such that we will never reach world peace.

From the official “statement” page of the Zeitgeist movies’ homepage (link at the bottom):

‘Zeitgeist, The Movie’ and ‘Zeitgeist: Addendum’ were created as Not-for-Profit expressions to communicate what the author felt were highly important social understandings which most humans are generally not aware of. The first film focuses on suppressed historical & modern information about currently dominant social institutions, while also exploring what could be in store for humanity if the power structures at large continue their patterns of self-interest, corruption, and consolidation.

The second film, Zeitgeist: Addendum, attempts to locate the root causes of this pervasive social corruption, while offering a solution. This solution is not based on politics, morality, laws, or any other “establishment” notions of human affairs, but rather on a modern, non-superstitious based understanding of what we are and how we align with nature, to which we are a part. The work advocates a new social system which is updated to present day knowledge, highly influenced by the life long work of Jacque Fresco and The Venus Project.

These two films, or at least the second film, are probably the most significant pieces of expression in history, because they have the potential to introduce radical, and positive(!), changes to the societies of the world and thus the state of the world today.

In short (extremely short), Zeitgeist: Addendum talks about how money is the root of all evil. This may sound like the biggest cliché you’ve ever heard, but you most likely have no idea of how much there is to know about the evil that money brings. You will be blown away.

The knowledge you will gain from these movies is essential, important, common knowledge about the state of the world! And even if you don’t have about 4 hours to watch them both, try to take at least 2 hours to watch Zeitgeist: Addendum. If you don’t have 2 hours, try to take 1 hour, 30 minutes or 15 minutes of your day, and finish the movie(s) in parts. It is important. For all of us.

The Zeitgeist movies can be viewed directly via Google Video or downloaded using BitTorrent technology. You can download the BitTorrent files from the link provided below. The files point to high-quality “DVDRips” (almost DVD quality) so that might be the way you want to go.

Links:

Nice BitTorrent clients: