Monthly Archives: February 2018

The Reason We Clap to the Beat the Way We Do

On February 19th, 2018, our HIST 390 class discussed the strange story behind American music and the different perceptions of music worldwide.  Basically speaking, American music distinguishes itself through its African influences: as Africans became more integrated into American society and culture, their traditions slowly seeped into everyday American life.  How music is structured and how we, as listeners, clap to the beat were the examples we focused on that day.  To put it simply, Europeans have historically clapped to music on the first and third beats (1 and 3), while Africans have a history of clapping on the second and fourth beats (2 and 4).  People simply grow up in cultures that clap on different beats.  This becomes most evident when a musician who has spent their entire life clapping on one set of beats performs in a country that claps on the other.  The discrepancy can be fixed quickly by slipping one extra beat into the song, which shifts the audience's mismatched claps onto the beats the performers expect and preserves the performers' sanity and consistency, but the musicians' irritation at this small, yet significant and annoying, problem will likely still show.  For example, a musician from New Orleans, a city deeply proud of its African culture and heritage, whether that musician is black or white, may perform at a venue in Europe and immediately become bothered when the audience claps on beats he or she is not used to.  This isn't because Europeans are incapable of keeping a beat; it's because they were raised in a culture that claps at different intervals.  Fortunately, as African culture mixed into European and Western culture, its rhythms mixed into Western music as well, and given the widespread influence Western music has on the rest of the world, this disconnect over clapping to a song's beat may have become less common.
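To make the "one extra beat" trick concrete, here is a minimal sketch in Python.  The tick numbers, the clap pattern, and the point where the beat is added are all invented for illustration; the only point is that shifting everything by a single beat flips which beats line up with the audience's claps.

```python
# A toy model of the "one extra beat" fix described above.  The tick
# values and the insertion point are illustrative assumptions; the
# point is only that a one-beat shift flips which beats get clapped.

CLAP_TICKS = range(0, 32, 2)  # the audience claps on every other beat

def beat_number(tick, shift_at=None):
    """Beat number (1-4) heard at an absolute tick.  If the band slips
    in (or drops) one beat at `shift_at`, every later beat moves over
    by one relative to the audience's unchanged claps."""
    if shift_at is not None and tick >= shift_at:
        tick += 1
    return (tick % 4) + 1

print([beat_number(t) for t in CLAP_TICKS][:4])                # [1, 3, 1, 3]: clapping on 1 and 3
print([beat_number(t, shift_at=16) for t in CLAP_TICKS][-4:])  # [2, 4, 2, 4]: now on 2 and 4
```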

This is one of those things you never notice, but once it is pointed out to you, you can't stop noticing it ever again.  It's a very subtle aspect of life that most people will never question, but when you really think about it, it has always been a part of our lives, no matter how much some would probably wish to deny it.  I took piano lessons and a choir class back in middle school, and all that time, I just thought that clapping to the beat was simply how it was supposed to be done, and that when someone clapped to a different rhythm, they were doing it wrong and the more musically learned had to teach them how to do it correctly.  Who could have known that even the "right" beats could easily have been "wrong" once upon a time?  How crazy would it be to take a time machine back in time simply to change the way people clapped?  Unlike changing something like, say, the causes of World War II, the effects would be both widespread and subtle.  I would probably grin ear to ear every time someone in this new timeline clapped to the beat, knowing they were clapping the way I had changed it to be and no one would be the wiser.  What a wonderfully petty, yet power-hungry way to use time travel!  I may actually consider doing this if traveling through time ever becomes a thing.

Internet Challenges: Multiple-Choice vs. Linearity

On February 14, 2018, our HIST 390 class discussed the birth of the Internet and how it changed the world.  Looking back, World War II really changed things.  American industry and research expanded to an enormous scale, and the resulting "Cold War" between the U.S. and the U.S.S.R. made people fearful of the future, as the destruction of the world had suddenly become a very real possibility.  It became so serious that a huge part of U.S. research went into targeting, that is, into figuring out how to properly aim bombs, shells, rockets, and so on.  At first this was done with mechanical analog controls, but over time these were replaced by electronic computers built from vacuum tubes.  This was one of the first steps toward the computing that would eventually make the Internet possible, and it wouldn't be the last.  The research agency now known as DARPA (Defense Advanced Research Projects Agency) went on to fund ARPANET (Advanced Research Projects Agency NETwork), which would eventually become the very basis for the Internet.  As the full name of ARPANET may suggest, the Internet was originally what researchers and intellectuals hoped would be a platform for information sharing, a way for people to share information and discoveries without "sinister" or "objective-driven" governments and organizations getting in the way.  It was their hope that, with the formation of the Internet, information would become more free, democratic, and open.  Unfortunately, the Internet presented that information in a format that would confuse even the smartest and most intellectual among them.  It introduced many people to a system that wasn't linear but multiple-choice in its execution, a branching structure quite unlike the books and punch-card systems they were used to.  It would appear that before the Internet could become the hub of information sharing that intellectuals had hoped for, everyone was going to have to get through a bit of a learning curve first.

To better illustrate the multiple-choice aspect of the early Internet that confused so many people, our professor showed the class a website he had created himself.  This website, he explained, had greatly confused visitors because its directions were not as straightforward as people were used to at the time, and even once you figured out what to do, it would present you with at least four options, each leading to many more options, with no single clear path the user was supposed to take.  For example, on the site there was a chance for the user to meet a conman, but the user had no real way of knowing he was a conman unless they had first visited the police station and learned how to spot one, or had found the wanted poster with the conman's exact face on it.  Again, it was a nonlinear path with multiple possible correct routes.  It didn't matter whether you started at the movie theater or the police station, but the consequences of your actions would still be felt if you had not done certain things beforehand or took certain risks without thinking them through.  Nowadays, such things are not beyond the understanding of the iPhone-, iPad-, and Internet-using generation, but it is still fascinating to think about how much people's lives changed when they were first faced with such a strange mental obstacle.  In all honesty, I feel I would somewhat prefer a linear style over the multiple-choice style we have nowadays.  There are just so many options.  It can get quite overwhelming, especially when we don't know exactly what we are looking for.  Linearity would at least point me in the right direction.
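For anyone who wants to picture the branching structure described above, here is a minimal sketch in Python.  The page names and links are hypothetical stand-ins, not a reproduction of the professor's actual site; the point is only that what you can do at one page depends on where you happened to go before.

```python
# A toy model of a branching, multiple-choice website.  Every page
# links to several others, and there is no single "correct" path.

pages = {
    "start":          ["movie theater", "police station", "saloon", "docks"],
    "movie theater":  ["saloon", "start"],
    "police station": ["wanted poster", "start"],
    "wanted poster":  ["start"],
    "saloon":         ["conman", "start"],
    "docks":          ["conman", "start"],
    "conman":         [],
}

def visit(path):
    """Follow a sequence of link clicks.  Whether the visitor can spot
    the conman depends on what they happened to see along the way."""
    seen, here = set(), path[0]
    for nxt in path[1:]:
        if nxt not in pages[here]:
            return f"no link from {here!r} to {nxt!r}"
        seen.add(here)
        here = nxt
        if here == "conman":
            return "spotted him!" if "wanted poster" in seen else "swindled..."
    return "never met the conman"

print(visit(["start", "police station", "wanted poster",
             "start", "saloon", "conman"]))                  # spotted him!
print(visit(["start", "movie theater", "saloon", "conman"])) # swindled...
```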

The Disaggregation of Information

On February 12, 2018, our HIST 390 class discussed how technology has contributed to the disintegration and disaggregation of information.  To summarize: how do we, as people, distinguish signals from noise?  Typically speaking, the signal is the sound you want to hear, while noise is everything else that isn't the intended signal.  This is made clear every time we use a phone.  The signal is the voice on the other end that we're trying to focus on, while the noise is the electrical hum, static, and crackle, the general deterioration of the signal as it travels from one end to the other.  The further the signal had to travel, the worse these problems got, which raised the question of how we could keep the signal strong without losing any of it.  The telephone's way of dealing with this was to convert the sound it picked up (the signal) into an electrical current, which at the other end was turned back into vibrations that could clearly be heard as sound.  This is arguably one of the first instances of disaggregating information, as the message one person is trying to convey to the other is reduced to mere vibrations and currents.  Another big step in the disaggregation of information came when Claude Shannon, a mathematician, helped develop ideas that would become the groundwork for most, if not all, computer programs today.  He applied Boolean algebra, a type of algebra that reduces everything to a yes-or-no answer, to information itself.  Shannon realized that, when one really thought about it, any message in the world could be broken down into a series of yes-or-no answers.  This became the backbone of many computer programs in the years that followed.  When you want to send something to someone else, instead of actually delivering the thing itself, the message is scanned over, encoded, and filled in again on the other end.  For example, if I were to send a painting electronically to someone, my computer would scan each spot of the picture and record a yes-or-no answer, such as whether that spot is black, and the computer on the other end would fill in the exact same spots on its own.  The process is then repeated for the other colors until the painting on my friend's computer is exactly the same as the painting I intended to send.  Basically, the computer turned the painting into bits of information, which were then sent to his computer.  The computers, in other words, disaggregated the information.
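Here is a minimal sketch of that painting example in Python.  The tiny black-and-white "image" is invented for illustration: one machine reduces it to yes/no answers (bits), and another rebuilds it from nothing but those answers.

```python
# Disaggregation in miniature: a picture becomes a list of yes/no
# answers, and an identical copy is filled back in on the other end.

image = [
    "X.X",
    ".X.",
    "X.X",
]

# Sender: for each spot, answer one yes/no question - "is this spot black?"
bits = [pixel == "X" for row in image for pixel in row]
print(bits)  # [True, False, True, False, True, False, True, False, True]

# Receiver: rebuild the picture from nothing but those answers.
height, width = len(image), len(image[0])
copy = ["".join("X" if bits[r * width + c] else "." for c in range(width))
        for r in range(height)]
assert copy == image  # the reconstruction matches the original exactly
```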

One of the biggest questions presented during this lecture was whether the disaggregation of information cheapens the information, message, or meaning behind it in the first place.  Going back to the earlier examples, hearing a message through the phone may take something away from it, at least compared to hearing it face to face.  Which is more powerful: hearing that your father is dead from your mother, who is standing sadly in front of you, or hearing it from the other end of a phone?  The same goes for sending pictures and paintings through the Internet.  If you can see the same things your friends have seen simply because they sent you the pictures they took when they went out to experience it for themselves, did you truly see what they saw?  Has the experience been lost on you?  To help us understand this argument, our professor used GarageBand to show us a couple of songs he had made beforehand, some using sound clips from singers who no doubt took time out of their busy lives to record them, and some using instruments probably played by professional musicians who had spent years perfecting their craft.  All of these sounds and singers presumably had backstories, and yet here they were, reduced to disconnected clips on a music-making program, to be used for the enjoyment of users who know nothing about them.  On this issue, I will agree that something is certainly being lost, but I will also point out that, in return, these pieces of information are far more accessible than they used to be, creating new experiences that would have been impossible before.  That seems to be the general give-and-take with technology: it makes things more accessible while cheapening the real thing in the process.  On the other hand, if you want the real thing, it will no doubt be less accessible, but you'll be experiencing it as it was meant to be experienced in the first place.  It all comes down to what you care most about, and whether you are willing to go out of your way to truly experience it or not.

The Information Revolution

On February 7, 2018, our HIST 390 class discussed what may be considered one of the greatest and most revolutionary human innovations of our time: the ability to sort, manage, record, and categorize information.  To summarize, during the American Civil War, a man named Montgomery Meigs, the Union's quartermaster general, was tasked with supplying the Union army, which included ordering his soldiers' uniforms for them.  To do this, Meigs asked for all of his soldiers to be measured, worked out a small set of standard sizes that would fit most men, and ordered uniforms in those sizes in bulk.  Basically, he found a statistical solution to what seemed like a tedious and time-consuming problem, and solutions like his would not be the last.  Later, around World War I, IQ tests were created to gauge soldiers' intelligence (which sadly resulted in the army concluding, by the tests' own crude standards, that a large share of its soldiers scored at "moron"-level intellect), once again taking what used to be a long and daunting task and simplifying it into something much, much easier and faster.  To a lesser degree, libraries were also adopting catalog systems that made finding the right book much easier, and file cabinets were built to better manage personal records and files.  In the end, while we may take it all for granted, the ability to manage data truly revolutionized how we live in the world today.  The Information Revolution had made its mark.
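Here is a minimal sketch of Meigs's statistical approach in Python.  The measurements and the size chart are invented for illustration: measure everyone once, bucket the measurements into a few standard sizes, and order each size in bulk.

```python
# Turning thousands of individual fittings into one bulk order.
from collections import Counter

chest_inches = [34, 36, 38, 38, 40, 36, 38, 42, 36, 38, 40, 34]  # made-up survey

def standard_size(measurement, sizes=(34, 36, 38, 40, 42)):
    """Round each man up to the smallest standard size that fits him."""
    return min(s for s in sizes if s >= measurement)

order = Counter(standard_size(m) for m in chest_inches)
print(sorted(order.items()))  # [(34, 2), (36, 3), (38, 4), (40, 2), (42, 1)]
```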

How great the fruits of the Information Revolution have been for us!  I remember, back at my old university (all the way in Northridge, California), going to use a book from the university library for the first time.  Imagine my shock when I was instructed to go up to the information desk, ask for the book, have them look it up in their database for me, write down the book's call number, and find the aisle where it was shelved.  Even after I understood what I was supposed to do, finding the book in a sea of other books, all with indistinguishable covers and some without even a title on the spine, took longer than I was comfortable with.  Besides the call number on the spine, there were no other visual cues to rely on, and whenever I finally found a book in that library I breathed a sigh of relief.  How daunting this task would have been had I been born a few decades earlier, in an age when computers were not commonly used and I most likely would have had to look at each and every book one at a time to find what I was looking for.  And speaking of which, how would I even have known what I was looking for?  Without computers, how was I supposed to find the perfect book for writing my paper?  I would have had to pick up every book that vaguely mentioned my subject and skim through it in the hope that it had something, anything, that could possibly help me!  I'll make sure to count my blessings for being born in this generation rather than one of the many older ones.
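As a small illustration of what that call-number system buys you, here is a sketch in Python.  The catalog entries are made up; the point is that once call numbers are kept sorted, you can jump straight to a book instead of scanning shelf by shelf.

```python
# A sorted catalog lets you binary-search for a book (O(log n))
# instead of walking every aisle one spine at a time (O(n)).
import bisect

catalog = sorted([
    ("ML3477 .K38", "How Music Got Free"),
    ("QA76.9 .C66", "A History of Computing"),
    ("TF23 .W55",   "Railroads and American Time"),
])

def find(call_number):
    """Jump straight to a call number in the sorted catalog."""
    keys = [k for k, _ in catalog]
    i = bisect.bisect_left(keys, call_number)
    if i < len(keys) and keys[i] == call_number:
        return catalog[i][1]
    return None

print(find("QA76.9 .C66"))  # A History of Computing
```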

How Times Have Changed Due to Technology

On February 5, 2018, our HIST 390 class discussed the contrasting attitudes of the past and present, and how advancements in information storage over the years may have inadvertently forced a crutch onto every succeeding generation.  As for the first topic, it is a matter of debate whether everyone throughout time is naturally blessed with the same concept of "self," with differences arising only from the environment and technology they are exposed to during their lifetime, or whether the concept of "self" is completely different between two separate generations.  In other words, is everyone throughout history the same deep down or not?  Would Thomas Edison, George Washington, and Joan of Arc love roller coasters just as much as we do now, or would there be a generation gap that would make them turn away from, or even look down upon, such contraptions?  Surely we are not mad for loving roller coasters, but have times changed so much that people of the past just couldn't see what we enjoy about them, or, even if they could, couldn't share the same sentiment?  And how does this lead into the second topic, the idea that advancements in information storage have inadvertently forced a crutch onto succeeding generations?  Well, to put things in perspective, the legendary thinker Socrates had a student named Plato who recorded his talks for him, as Socrates himself did not write anything down, and, surprisingly enough, he was proud of that fact.  Why would a great mind like Socrates be so proud of going without what is now considered a necessary and useful skill?  The reason may be that writing gave people an out from actually taking the time to memorize information and take it to heart.  Let's be honest: when you know a copy of the information is lying around somewhere, you are less inclined to memorize it, opting instead to just remember that a copy exists and where to find it the next time you need it.  The most common example of this is the internet, more specifically the search engine known as Google.  Nowadays, everyone uses Google in some way, shape, or form.  If you need to find out about something, simply Google it; chances are you'll find what you're looking for in a matter of seconds.  Be honest, though: the ability to search for and find any information you want has made you somewhat reliant on it.  When was the last time you truly tried to remember what you had just read?  When was the last time you were told to memorize something and immediately thought, "Why should I memorize it?  I can just look it up whenever I want"?  Probably more often than you would like to admit.  The sad thing is, in the times before such technological advances, or even before writing and recording for that matter, people were forced to memorize everything.  They had to.  They had no other choice.  It is in this way that our ancestors have the upper hand over us.

Continuing with the idea that modern technological advancements have eroded our ability to memorize anything, I can safely say that I have definitely felt this.  In fact, a part of me honestly believes I may be impaired when it comes to memorizing things nowadays.  My attention span is pretty short, so when someone is trying to explain something to me, I probably only really hear about half of it.  I usually need either hands-on experience or constant, repetitive review for anything to truly stick, which is terrible for someone trying to pursue a Biology major.  Biology is a demanding, memory-focused subject as it is, but taking multiple biology courses at once really tests the limits of my ability to retain information for my exams.  While it may well be my own lazy and self-destructive behavior that keeps putting me in these situations, being born in an age where memorization is nowhere near as necessary as it once was probably doesn't help.  In an age of internet search engines, audio and video tutorials, and global positioning systems, the number of times I have truly been required to know something by heart has been very low.  I shudder to think what would happen to me if the internet or the Wi-Fi were to suddenly and permanently go down.

The Concept of "Self" and How It Has Changed Over the Years

On January 31st, 2018, our HIST 390 class discussed the topic of "selves" and how the concept has changed over the years.  To summarize, everyone has multiple selves within them, whether they realize it or not.  When you are working a job you do not particularly enjoy, there are two selves you can consciously acknowledge: the self that hates the job, and the self that acknowledges you agreed to the job in the first place.  Neither is more legitimate than the other, but they are conflicting parts of yourself that you can acknowledge both exist.  It is the same with reading silently to oneself: there is a self that is reading and a self that is trying to listen to and absorb what is being read.  You'd think you would do one or the other, but both exist simultaneously, regardless of what one might think.  People having multiple selves is nothing new, but what was discussed in class was whether different generations and eras had different selves than those of the current generation.  As evidence of this discrepancy, the professor showed us a clip from an early film in which a squadron of firefighters saves a woman and her children from a burning building, the major takeaway being how it was directed compared to how modern films are directed.  The clip first showed the rescue from the woman's perspective: she panics as smoke fills the room and runs around in terror until a firefighter breaks in and leads her and her children to safety.  The film then shows the same scene again, but from the point of view of the firefighters outside the house.  Compare that to a movie like "Saving Private Ryan," where the camera continually jumps around to show the viewer many different perspectives within a single scene.  At the time the old film was made, it was apparently believed that an audience could not take in all aspects of a story at once, so the different perspectives were shown in full, one at a time.  Modern movies like "Saving Private Ryan," on the other hand, show various points of view, some of them quite impossible if one actually stops to think about it; the movie even takes a moment to show you the scene from the perspective of the enemy forces!  Surely this is evidence that people's understanding of self has changed over the years.

In an attempt to explain this shift in the concept of "self," I would hypothesize that modern movies like "Saving Private Ryan" show so many perspectives within a scene because the birth of the digital age has made our generation accustomed to information overload.  Movie studios take advantage of this, essentially giving the viewer an information overload of their own in any particular scene, and because we are used to absorbing so much at a time, it doesn't bother us as much as it probably should.  If anything, we are at our most comfortable when we are overloaded with information, especially when we know we aren't going to be tested or quizzed on it.  With phones in our pockets and the internet at our fingertips, this generation is constantly bombarded, of its own volition, with a heavy influx of information.  We just can't get enough.  In fact, if you look at movie reviews nowadays, a common complaint is that the movie did not explain enough about the plot, the scene, or a character's motivation.  When we are truly invested, we crave information, whether or not it is really important in the grand scheme of things.  Perhaps for the audiences of that old firefighter clip, all the information we absorb now would have been far too overwhelming to handle, at least all at once.  Whether the current environment of information overload is a blessing or a curse, however, is another topic altogether, and it might not even have a single correct answer in the first place.

How Technology Found a Way to Control Time

On January 29th, 2018, our HIST 390 class discussed how technology has molded, controlled, and changed our perception of time, and how exactly it managed to do so over the years.  To summarize, as technology in the U.S. advanced and improved, many things about everyday life slowly began to change along with it, whether people realized and approved of it or not.  For example, the creation of railroads brought new possibilities for luxury, comfort, and livelihood, as trains made transporting food across the country easier, and therefore made it more readily available to those on the opposite side of the country.  Suddenly, if you lived in a region that couldn't readily grow carrots and carrots happened to be your favorite food, you had nothing to worry about: thanks to the wonders of railroad and storage technology, your local market could keep selling you carrots for as long as the trains kept running and delivering them.  The days of waiting for a delivery of carrots, or of making a long, hard journey to reach them, became a distant memory.  Your concept of time changed, because the things you used to devote so much time to had been made much faster, potentially forever.  To stay on the topic of trains and railroads: it is because trains had to run on tight, coordinated schedules across long distances that standardized time zones had to be created.  Back then, vague statements such as, "We close at night," were no longer enough.  With trains having to move on tight schedules to get the most out of their usefulness and benefits, a much more specific system of time had to be crafted for everything to work, and because the trains had to adhere to such a system, everyone else, for the sake of simplicity and consistency, had to adhere to it as well.  Now, instead of, "We close at night," it had to be, "We close at 9:00 P.M."  Technology, in a sense, had found a way to control, mold, shape, and bend time to its will.
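To see the kind of coordination problem standardized zones solved, here is a small sketch in Python.  The departure time and cities are made up; the point is that one single instant reads as a different wall-clock time in each zone, which is exactly what a printed train schedule has to get right.

```python
# One departure, one instant in time, three different clock readings.
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

departure = datetime(2018, 2, 1, 9, 0, tzinfo=ZoneInfo("America/New_York"))

for zone in ["America/New_York", "America/Chicago", "America/Los_Angeles"]:
    local = departure.astimezone(ZoneInfo(zone))
    print(f"{zone:22} {local:%I:%M %p}")
# America/New_York       09:00 AM
# America/Chicago        08:00 AM
# America/Los_Angeles    06:00 AM
```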

As a senior in college, I can honestly say that the concept of technology controlling our lives hits a bit too close to home.  I already have to compile a school schedule for myself, which dictates how I spend a good chunk of my day; add to that all of the homework and exams these classes demand their students prepare for, and suddenly you have to start worrying about how much time you devote to each of these tasks, and then another, and then another, and so on and so forth.  If you laze about, which I must admit I am very prone to doing, then you are "behind," whatever "behind" really means in the grand scheme of things.  All of a sudden you're trying to "catch up," even when it isn't really necessary.  It doesn't matter if the assignment or the exam is a good number of days away; you've been slipping on your schedule, and now you have to fit everything you've missed into the even smaller window you've left yourself.  The stress, the hair-pulling, the anxiety, it all builds up, and for what?  To get the assignment done by the 11:59 P.M. deadline?  To study for the test that starts at 9:00 A.M. sharp, knowing that if you're late you risk not being able to take it at all should someone leave the room before you enter it?  It is very telling that activities considered fun, relaxing, and leisurely are seemingly never associated with such troubles, with the things you "need" to finish and "have" to be done with by some predetermined time or else face some perceived consequence.  And the irony is, I get it.  I really do.  It would be so troublesome for both the bar owner and you if you arrived at the bar just as they were about to lock up and the two of you began arguing over the technicalities of, "We close at night," and whether the sky was a clear indicator of it being truly night or not.  All I'm saying is that a life with clear timetables, but a bit more flexibility in its deadlines, would probably be nice is all.

Idealism vs. Realism

During one of our classes, the concept of idealism vs. realism was brought up as the professor questioned why the current generation doesn't demand that its music be better, instead of settling for what the Loudness War and compression have done to it, which in his opinion has made it objectively worse.  He then explained the difference between idealism and realism, stating that idealism and religion go hand in hand, as those who adhere to idealism believe there is a higher or perfect standard for everything and that it is our duty as humans to strive toward it.  Realism, on the other hand, is based on the belief that this so-called higher or perfect standard does not exist and that, basically, what you have is as good as it is going to get.  During the class discussion, one example used to distinguish the two was that a realist would not care whether they saw a replica of the Mona Lisa or the original; after all, a painting is just a painting, copy or not.  An idealist, in contrast, would insist that you have to see the actual Mona Lisa to truly experience seeing it.  The idealist stance is admittedly a bit elitist, but as the professor pointed out, idealism is unapologetic in this regard.  If you truly want to experience something, you must put in the work to do it; otherwise, what is the point?  The ideal standard is a struggle to reach, and an idealist would never deny that.  But then there are the realists, who have no standard to strive for at all, and that is where one should seriously consider where they fall on the idealism vs. realism scale.
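Since the Loudness War came up at the start of this discussion, here is a toy sketch in Python of the dynamic-range compression behind it.  The signal, the threshold, and the ratio are all made-up illustrations, not anyone's actual mastering chain; the point is that quiet and loud passages get squeezed toward the same level, and the whole track is then turned back up.

```python
# Squash the dynamic range, then "make it up in volume."
import numpy as np

rng = np.random.default_rng(0)
quiet = 0.1 * rng.standard_normal(1000)   # a quiet passage
loud  = 0.9 * rng.standard_normal(1000)   # a loud passage
track = np.concatenate([quiet, loud])

def compress(x, threshold=0.2, ratio=4.0):
    """Above `threshold`, reduce gain so peaks grow only 1/ratio as fast."""
    over = np.abs(x) > threshold
    out = x.copy()
    out[over] = np.sign(x[over]) * (threshold + (np.abs(x[over]) - threshold) / ratio)
    return out

squashed = compress(track)
squashed *= np.max(np.abs(track)) / np.max(np.abs(squashed))  # makeup gain

print(round(loud.std() / quiet.std(), 1))                       # original contrast (~9x)
print(round(squashed[1000:].std() / squashed[:1000].std(), 1))  # compressed contrast (much smaller)
```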

During the class, another point was brought up about a man who came to America and was appalled at the low-quality bread being served here.  When one student asked whether there might simply be a manufacturing or economic reason behind it, the professor informed the class that there actually wasn't: when the calculations were done, it turned out that the better kind of bread would actually have been cheaper to produce than the bread being sold.  This again raised the question of why the sellers were not pursuing the idealistic standard for bread, and why, when the man publicly shared what he had found, no one made much of a fuss over it.  I would personally theorize that this is due to what might sit in the middle of the idealism-realism scale: nihilism.  To put it simply, nihilism is the belief that nothing truly matters in the long run, and that just might be what the buyers of such bread were adhering to as they ate it.  I would theorize that nihilism is simply idealism crushed by the weight of realism, which leads people to question whether the quality and standards of bread, or of anything in general, really matter in the long run.  Imagine being a person back then, longing to taste this famous "bread," and when it finally came, you couldn't get enough of it.  Finally, you can taste the delicacy that had been denied to you for so long… and then some guy comes out of nowhere and proclaims that the bread you loved so much wasn't the quality product you thought it was.  Do you feel dejected, knowing that the bread you had long waited for is still well out of your grasp, or do you simply shrug your shoulders and accept that, while it isn't the best, the bread you have is as good as it's going to get?  The latter isn't really realism, since you acknowledge that a better alternative, an ideal bread, exists; but given your situation, you realize it's an experience you will never realistically get to have, so you take what you can get.  It doesn't matter anymore.  You've accepted it, despite knowing it could be better.  It could be better, but it most likely won't be.  As for the sellers themselves, it was probably a cynical marketing and business move, if anything: first feed everyone the bad bread, and then, when business booms or slips, reveal a slightly better bread and repeat the process so that public interest in your product never dies.  It is a shrewd and dishonest method, to be sure, but why does it matter in the long run?  With this in mind, has nihilism touched the music industry as well?