Independent Cinemas Face Uphill Battle

For a little more than a decade, The Lyric Cinema has brought the independent film scene to the Fort Collins community. Year after year, Fort Collins residents wanting to see films on the indie circuit, or simply wanting a break from big blockbusters, have sought out The Lyric’s unique offerings and affinity for the classics. But while The Lyric benefits from Fort Collins’ strong academic presence, anchored by Colorado State University students, and from a vibrant artistic community, both eager to explore new realms of cinema, many other theaters are not so lucky.

The past two decades have seen a steady decline in the number of cinema sites, from 7,744 locations in 1995 to 5,800 in 2018, according to the National Association of Theatre Owners. Many towns have experienced what it feels like to see a movie at their local theater on Saturday, only to see it shut down on Monday. While most of this article will focus on indie theaters and smaller chains like the Alamo Drafthouse, many of these problems are frightening for the entire industry.

But to really understand the complex of problems facing the independent theater industry, one must first clear up some of the biggest misconceptions, namely those surrounding Netflix.

Competition (or lack thereof) from streaming services

If you ask anyone what the biggest threat to theaters is, they will likely answer streaming services within a few seconds, thanks to a combination of campaigning from major studios and theater owners and a plethora of opinion pieces from news outlets. When looking at the numbers, however, there doesn’t appear to be a definitive correlation between the decline in theater attendance and the rise of Netflix.

While annual ticket sales have trended downward in recent years, the decline started in 2002, according to The Numbers, a site that analyzes and cross-references film industry statistics. And while ticket sales have gone down steadily, total annual box office continues to rise, going from $5.3 billion in 1995 to $11.9 billion in 2018. According to Google’s trend tracking, Netflix only started to garner serious attention in the U.S. in 2008.

It’s also worth noting that theater attendance has been relatively low for decades. A 2002 study by Michelle Pautz on weekly movie theater attendance shows that the percentage of Americans who went to the movies on a weekly basis has hovered around 10% since the late 1960s. Low attendance and declining sales were an ongoing trend long before most people had heard of the most popular streaming services.

New Technology

Paramount made headlines in 2014 when it announced The Wolf of Wall Street would have very limited film stock distribution, meaning the film would primarily be distributed and shown digitally, according to NPR. Many studios have made an effort over the last few years to switch to a digital-only format. While a small fraction of distributors still produce film prints for analog projectors, the practice continues to dwindle, and more distributors and studios are pressuring smaller venues to switch to digital by simply refusing to distribute certain movies on film.

A professional-grade theater projector can cost anywhere from $40,000 to over $100,000. For smaller theaters, purchasing even one of these projectors is a considerable expense, let alone purchasing multiple.

Many independent theaters, including Fort Collins’ Lyric Cinema, turned to crowdfunding to cover the cost. But while some independent and small theaters managed to reach their goals, many fell short.

Studios and distributors are pushing for digital filmmaking and reproduction for numerous reasons. Digital film is much easier to work with during filming, editing and distribution. Digital footage is also much cheaper to copy, meaning it is in distribution companies’ best interest to push digital video to cut down on costs.

As the years go on, new technical standards will be introduced which will require more new equipment. But the costs do not stop at new equipment.

General Costs

Expenses don’t stop at projectors or sound equipment, either; a movie theater is a series of large expenses from top to bottom. A large professional popcorn maker can cost around $1,000, a soda fountain can cost anywhere from $4,000 to upwards of $11,000, and registers can range from $200 to over $1,000 depending on the level of quality. Taking into account that much more equipment is needed to operate a theater, and often multiples of one item, theaters, even small ones, require large amounts of capital just to open.

And the list of expenses only gets bigger. A theater needs staff to operate, food and drink need to be regularly reordered, and then there are utilities, rent or loan payments, property taxes if you bought the land, repairs for broken equipment and general upkeep.

On top of all of this, theaters need to regularly deal with studios and distributors who have continually asked for bigger shares of revenue, according to Variety.

Dealing with studios and distributors

To legally show films for public consumption, venues are given two licensing options, according to Indywood: distributors will either let you pay a one-time licensing fee or require you to surrender a large percentage of your profits.

When a theater and distributor cut a percentage deal, the distributor takes a large portion of the profits in the first week (the actual percentage varies by theater and movie); after that, the distributor’s cut gets smaller each week until the theater stops playing the film. Over the years, though, studios have been asking for higher margins overall, leaving less room for theaters to make money on post-first-week sales. This compounds the fact that most traffic for a film dies off after the initial week, meaning theaters wouldn’t be making stellar sales in subsequent weeks in the first place.
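The sliding-scale split can be sketched in a few lines. The weekly grosses and percentages below are purely illustrative assumptions for demonstration; real terms vary by theater, film, and distributor.

```python
# Illustrative sketch of a sliding-scale revenue split.
# All figures here are assumptions, not real contract terms.

def theater_take(weekly_gross, distributor_cuts):
    """Return the theater's share of ticket revenue for each week."""
    return [round(gross * (1 - cut), 2)
            for gross, cut in zip(weekly_gross, distributor_cuts)]

# Hypothetical four-week run: the gross falls off sharply after
# opening week, just as the distributor's cut begins to shrink.
gross = [10000, 4000, 2000, 1000]   # weekly ticket gross in dollars (assumed)
cuts = [0.70, 0.60, 0.50, 0.40]     # distributor's declining share (assumed)

print(theater_take(gross, cuts))    # [3000.0, 1600.0, 1000.0, 600.0]
```

Even though the theater keeps a bigger fraction of each dollar in later weeks, most of the money arrives in week one, when its share is smallest; that is the squeeze described above.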

The majority of theater profits are made from concessions. Concession prices are so high because theaters are making up for the profit lost to the distributor’s cut of ticket sales. A former president of the National Association of Theatre Owners said in a 2007 Boston Globe article that concessions can make up as much as “46 percent of profits” for a theater.

Distributors and studios also engage in several practices that prioritize larger chains over smaller theaters, according to Corpwatch. Studios and theater chains routinely sign exclusivity deals that bar smaller theaters within a certain radius from showing certain movies at the same time as the large chains. Theater chains have a long-standing history of undercutting independent theaters.


In the late 80s and early 90s, theater chains began to expand by opening large multiplexes and even megaplexes across the country. These massive venues usually had around 10 screens, while megaplexes had upwards of 16. Small or independent cinemas, by contrast, typically had between one and four screens. While venues with more than four screens existed before the 90s boom, it was during that decade that their prevalence really took over America and first started to become a threat to smaller cinemas.

To reiterate, America saw a decrease in the total number of cinema locations from 7,744 in 1995 to 5,800 in 2018, according to the National Association of Theatre Owners. But while the number of locations has been decreasing, the total number of screens has been growing. In 1990 there were a total of 23,814 screens in the U.S., according to the same association. By 2000 the number of screens had risen to 36,379, and it reached 40,837 by 2018. So while many cinemas were closing, there were more screens to replace them.
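Dividing the article’s screen counts by its location counts makes the shift concrete. Note the years don’t line up exactly (the early screen count is from 1990, the early location count from 1995), so the first ratio is only approximate.

```python
# Rough screens-per-location ratios from the figures cited above.
# The early ratio pairs 1990 screens with 1995 locations, the
# closest years available, so treat it as approximate.

early = 23_814 / 7_744   # ~3.1 screens per location
late = 40_837 / 5_800    # ~7.0 screens per location in 2018

print(f"early: {early:.1f}, late: {late:.1f}")
```

In other words, the average surviving cinema location in 2018 had roughly twice as many screens as a site did in the mid-90s.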

It also doesn’t take much knowledge of economics to understand why these large theaters drove so many smaller theaters and chains out. More screens mean more opportunities for viewers to see the movie they want, and more time that can be dedicated to the most popular release of the moment. Bigger auditoriums, with seating designed to fit as many people as possible, allow more tickets to be sold and more viewing opportunities.

In contrast, independent theaters are often much smaller and have far more limited seating, so even though their cost of operations is lower, they still need to charge more to cover costs. Put simply, large chains have so much more to supply to audiences that they can afford to charge lower prices.

Unlike smaller theaters, chains like AMC provide their own support network for their theaters. Locations in lower-traffic areas don’t have to worry about going under so long as most other theaters in the chain are doing well.

With so much to offer at a cheaper price, more people turned to the chains over their local independent theaters. Soon local theaters either began to sell out to the bigger brands or simply went out of business. Little by little, the biggest chains, Regal, Cinemark and AMC, began consolidating the market and pushing out competition. Those three chains now own 50% of the total number of screens in the U.S., according to Variety. And small theaters are not the only ones vulnerable to consolidation: in 2016 AMC bought out Carmike Cinemas and its 271 locations for $1.2 billion, which made it the largest chain in the U.S., according to the LA Times. Venues with fewer than four screens now make up only 11% of the total market, according to 2018 numbers.

A new approach for indie theaters

Many indie theaters have not stood idly by while large companies continue to consolidate; many have been reworking their strategy to offer a worthwhile alternative to the conventional theater experience. Independent theaters have started to focus more on small-time or independent films that don’t get shown in major theaters as a way of attracting audiences looking for a different experience. Small-time and indie distributors have also been easier for many theaters to work with, and they are eager to get films into as many theaters as possible.

Small theaters like the Lyric have also taken steps to make moviegoing more comfortable by offering seating like couches and recliners. These theaters have also started to provide better food and drink options by adding restaurants and bars. Some are even offering entirely other forms of entertainment, like game rooms, VR, bowling and even live performances.

In short, these theaters are offering not just alternatives to the blockbuster releases but a whole different experience that brings out the community. But in business, if there’s money to be made, companies will discover it eventually, and no good idea goes un-stolen.


Recently, large chains have also been looking to offer a more high-end experience, according to The Wall Street Journal. Both Cinemark and AMC have spent large amounts of money in recent years to renovate theaters, likewise adding more comfortable seating, dine-in options and bars. These companies have realized that what drove a lot of customers away from theaters is the level of discomfort they experienced there, so while they won’t necessarily change their pricing, they will simply offer better options.

This change in the chains’ priorities is not the only threat to smaller theaters; it’s also key to remember that these expansions and renovations are not free. The projects small theaters are taking on in order to compete are investments, and for those investments to pay off, the community around the theater must be receptive to the changes. Some theaters have already spent large amounts of money without much luck, according to Variety.

For as much as large theaters like to give off the image of struggle, the simple truth is they aren’t struggling to nearly the degree smaller ones are. AMC made over $11 million in profit in 2018, according to its financial reports, and because the chain is so big it can easily keep making more money. With so much money, the big chains can continue to adapt to cultural and industry changes with ease, whether that means digital projectors, new services, or simply installing new seats.


I write other things

Some of you might not know that I actually write for my student paper. Here’s my most recent article about student discounts.

On Memories

Memories are strange, for lack of a better term. Sure, there are plenty of terms that describe aspects of our memory, but fallible is one of the more accurate ones; that’s why a witness’s recollection is so easy to dismiss in court. Memory is also incredibly fickle, subject to change at a moment’s notice. In this dynamic respect, memory is likely the least helpful of all our cognitive functions, except the part of your brain that compels you to constantly overthink every small detail of something; that part can fuck right off. I’m sure you’re eternally grateful to your memory when you remember the correct answer on a test, followed by cursing it for not recalling the answer to the next question.

The mind is constantly changing memories, old or new. Your brain starts to change your memories as soon as you’re done experiencing the event; the next time you recall the same memory, you’re remembering the recollection, not the original experience. Like a mental game of telephone, each time you remember something, that memory is altered slightly. We imbue our memories with more meaning and significance than they had in the moment, reframing them from different positions, adding or subtracting details, changing the setting, inferring information we didn’t previously have access to. In short, we dramatize the moments in our lives as though they were scenes from a movie, continuously remaking this one film throughout our entire lives. A childhood memory isn’t reflected upon at 20 the same way it is at 70. Our brains are directors that spend their whole lives reworking and editing their magnum opus, never satisfied with the reality, feeling the need to inject story elements where there are none.

There are some memories that stick with you as if they’re permanently etched onto the inside of your skull, the memory equivalent of the cave paintings of Lascaux. I’m unsure if these memories stick with me because they altered the course of my life in some way, or because they resonated with who I was. Maybe they were just particularly traumatic or loomed large in the course of my life and have no special significance.

I remember a fishing trip I took with my father one time. We caught a fish, but instead of hooking it through the cheek, the fish had swallowed the hook and it got lodged in its throat (or whatever the fish equivalent of a throat is). I don’t remember who caught the fish, but I think it was me. We tried for several minutes to remove the hook, but eventually we realized we had to give up and threw it back. The fish flopped around for a few minutes before finally dying. I was completely mesmerized; I watched it struggle until it lay still, sideways and wide-eyed. It may be cliché, and hypocritical, to imbue this moment with meaning, given what I wrote earlier, but I think it stands out in my memory as the moment I recognized the fragility of life. A single moment of no pomp or grandeur could mean the death of any life. That moment could be a single second or an eternity, with the only guarantee of death being no guarantee in how it happens.

Another moment sticks out as one of my more prominent memories: the moment I found out my brother had been in a car crash. I remember being in class, but I don’t remember what the teacher was talking about, who was there, or what the classroom looked like; I just remember that I was there. I don’t believe people when they say they can recall small details about a scene right before a big moment happened. I already mentioned how the brain creates small details to add life to a scene, but it’s also unlikely people recall these details because they weren’t paying special attention to them, and it’s a strong possibility that whatever happened next overshadows everything else about that day.

I know I was called to the office, which I was probably happy about. Understand that at that moment I knew nothing about my brother’s wellbeing, so any kid hearing they were being checked out, especially for no discernible reason (a doctor’s appointment, say), would probably be ecstatic. I packed up my things, probably beaming at the idea of going home early (I never cared for school), and made my way down the hall. The school office was around the right-hand corner of the hall (funny how I remember that one aesthetic detail), so my mother was waiting for me at the end of it. As I got closer, I was unnerved to see she was crying. When I reached her, she told me what happened.

I remember this scene very differently. What I’ve told you so far is a deduction of what must have happened, based on anecdotal pieces of my memory and my knowledge of my past self, things I can know with some degree of certainty are true. I remember smiling, that smile disappearing into a worried expression as soon as I saw my mother crying, then running toward her in a panic, followed by several smaller events that I’m sure add more drama to the scene. Now I have the luxury of understanding that moment in full context, and thus the ability to imbue it with more significance than it may have warranted at the time. Perhaps that’s the point of reflection: to give understanding and importance to moments that seemed frivolous in the moment. After all, didn’t these moments have profound effects on me, even if I was unable to fully interpret them as such then and there? You can’t say definitively whether reflecting on memories is good or bad; I think it depends on the types of changes you make. What details did you add? Are those details based in any truth? Did something actually happen the way you remember it? Are you adding overtly biased details? But when you start getting this intricate and suspicious about which parts of your memory are true or false, it’s best to close this line of thought before driving yourself crazy with doubts about your experiences.

The History of Halloween

The history of Halloween is a weird mixture of ancient traditions, political gain, propaganda, and cultural mixing.

If you asked people to tell you where Halloween came from, you likely wouldn’t get a solid answer. Most who do answer confidently are likely to tell you it originated from the practices of druids or Satanists before being spun into a benign, commercialized holiday. In actuality, these assumptions couldn’t be further from the truth.

Halloween originated with the Celtic holiday of Samhain, which began on October 31st. The Celts believed the year was broken into four distinct sections and followed a pattern of death and rebirth, with November 1st commencing the new year. Samhain partly celebrated the harvesting of crops for the winter and the recalling of herds back to their stables.

In keeping with the belief that this was a time of death and rebirth, the Celts also believed that on the 31st the barriers separating the world of the living from the world of the dead were at their weakest, allowing spirits and people to interact with each other.

The Celtic people believed that the souls of those who died that year would pass on, while spirits from previous years would come back to interact with the living. They lit bonfires both to guide recently deceased spirits to the next life and to ward off malevolent ones. Sometimes participants would wear masks to hide their appearance from spirits, but wearing costumes would not become standard practice until much later.

It was also believed that magic, notably divination, was especially strong during Samhain. Various practices developed with the hope of predicting the fates of lives, relationships, harvests, and family life. One involved walking backward to the basement while holding a mirror; the face seen in the mirror would be one’s next lover.

Centuries later, a sizable portion of Europe still practiced pagan traditions. To combat this, the Catholic church began supplanting traditional pagan holidays with newly formed Christian ones. The church tried to replace Samhain with All Saints’ Day, a holiday dedicated to all Christian saints who didn’t already have a day of their own.

In addition to replacing their holidays, church missionaries began to label parts of the Celtic religion as evil. Druids became devil and demon worshippers, the Celtic gods became associated with demons, and the Celtic underworld became synonymous with hell.

Despite the church’s best efforts, the traditions and celebrations prevailed, though transformed. Now October 31st hosted ghosts, witches, demons, fairies and a slew of other creatures, all now seen as entirely malevolent. Eventually, people began leaving out food or drink to appease these malicious creatures in the hope that they would take pity on them.

As time went on, people began dressing as these creatures and going door to door, putting on shows or harassing people in exchange for food and drink. These customs were mostly confined to Ireland and Scotland. In England, people would give out “soul cakes,” and those asking (usually from the lower classes) would go “a’ soulin’” for them.

The next few centuries were rough for Halloween. After the Protestant Reformation began, England mostly stopped celebrating the holiday. Since Halloween is the eve of All Saints’ Day, and the new religion did not believe in saints, most saw no purpose in celebrating at all, despite Halloween by this point being largely neither pagan nor Christian.

In the early American colonies, celebration of Halloween was mostly outlawed, partly because the church’s propaganda had become truth for some people and partly because of the large amounts of vandalism that usually occurred that night.

Halloween didn’t truly come to America until the mid-19th century, when a large number of Irish immigrants arrived. The holiday was mostly the same in America as it had been in Ireland at the time.

The commercialization of Halloween didn’t start until the early 1900s, with costumes and decorations appearing in the 1930s. To combat the vandalism that often occurred on Halloween night, civic and government leaders began labeling Halloween an exclusively children’s holiday. By the 1950s the custom of trick-or-treating was nearly universal and the holiday was considered only for children. With the exception of adult Halloween parties, traditions have largely remained the same since.

Daily Journal Challenge: 6/19/17

Prompt: What do you hope to gain from doing this journal?



Significantly better writing composition skills.