Monday, August 29, 2016

Garbage Land: On the Secret Trail of Trash



Book Review: Garbage Land: On the Secret Trail of Trash – by Elizabeth Royte (Little, Brown, and Co., 2005)

This is an entertaining and engaging journalist's narrative account of the fate of trash in the U.S. Her goal is to find out where her trash, recyclables, and sewage go and how it all is handled. It is quite an informative and sincere account. She calls up and visits the often reluctant managers of landfills, transfer stations, and recycling facilities, along with sanitation workers, and tries to extract information from them. Along the way she explains how trash handling works, its history, decomposition, and many other facets of our waste. Since she lives in New York City, we get much perspective from that very large city. Many of the people she visits, some quite colorful, are New Yorkers, and I couldn't help attributing to them strong NYC accents and "up yours" attitudes – OK, I am stereotyping, but some of the dialogue makes it hard not to!

The book begins with her joining some Sierra Club members in paddling around the Gowanus Canal in Brooklyn, a waterway highly polluted with sewage and industrial waste. She also explores the canal with a diver and a dredger. The canal requires pumps to supply oxygen, not unlike a fish aquarium, or else there will be dead zones. This has helped some species of fish, shellfish, and other animals survive in and near the polluted, oxygen-depleted waterway. Royte also analyzes her own trash habits, measuring her trash output by weight and sorting it. She eventually begins composting as well. At the time, recycling was limited in NYC; glass and plastic were temporarily not being recycled. I assume they are now, as in most places.

She notes that in centuries past kitchen trash was typically left out for scavenging animals and farm animals, and burnables were burned. This was before plastic. Other stuff was repurposed or bartered. While many of us still do these things, the waste stream is no doubt much heavier these days. Apparently, waste in the first 4 decades of the 20th century was composed of 60% wood and coal ash! Since those sources of heat are now far less common, so is their ash. The advent of refrigeration actually reduced food waste; however, packaging waste went up concurrently. In the 1800's trash accumulated in thick layers on the streets of populated cities and it included dead animals – in 1880, 15,000 dead horses were reported in Manhattan. While there were sporadic cleanup efforts, it wasn't until 1895 that trash removal service became a regular thing in NYC. There were three designations then: fuel ash, dry rubbish, and "putrescible" waste. Ashes went to an ash dump where they were piled high. Dry trash was picked clean of useful stuff and actually used to build up land and fill waterways and wetlands, creating tens of thousands of acres of waterfront real estate. Airports were built on such sites and still have problems with ground settling. Rat and other vermin problems led to more intolerance of trash. Incineration was in vogue in the 40's, but the toxic, stinky black smoke from these neighborhood incinerators made the air hazy and blocked visibility, and for these reasons it went back out of fashion. The first "sanitary" landfill was built in Fresno, California, in 1937; New York's Fresh Kills landfill on Staten Island followed in 1948 and stayed in operation until 2001.

Her first visit was to a 6 AM roll call at the Department of Sanitation. She then got to meet up with and help out her local sanitation workers (trash men), or "san men" as they (including women) are called – Sullivan and Murphy. She notes their skill, concentration, strength, and quickness in getting trash into the truck, crushing it, and moving on. They noted that trash, and differences in trash, tell a lot about people – what they buy, what they read, and what they eat. People also throw away useful things (as we discovered living in cities), so the scavenging can be good as well. After the truck was filled it was brought to be weighed and dumped at the transfer station. Knowing how much trash can be put in a packer truck and how efficiently it is packed comes with experience. The quality of the compressed mass when dumped is referred to as the "turd factor."

On a tangent, she notes that facilities like landfills and incinerators are more likely to be sited in poor communities than in rich ones, as the environmental justice movement would echo. However, other communities, particularly small rural ones, have welcomed landfills for the monetary benefits to the town, which can be quite substantial, not to mention free trash removal. Of course, those benefits fade through time as the facility fills, is closed, and becomes an environmental liability. For that reason landfills can be opposed by factions of the local population. Excessive truck traffic is also problematic. Even the local transfer stations in NYC draw complaints about the stink and rats. Metal and a few other valuables are picked out. About 450 tractor-trailer loads leave the transfer stations per day.

Two key problems of sanitary landfills soon appeared: release of methane and toxic liquid leachate. Collection systems for both methane and leachate became required under 1991 amendments to the EPA’s Resource Conservation and Recovery Act (RCRA). This shut down many dumps and consolidated others into larger ones. In 1988 there were nearly 8000 U.S. landfills. By 2002 there were only 1767. The new “megafills” could take advantage of construction economies of scale. Royte was still trying to get visits to landfills – the closed Fresh Kills and one in Bethlehem, Pennsylvania. She ended up paddling around the Fresh Kills area with a salt marsh ecologist to try and have a look at what was once the largest landfill in the U.S. She eventually gets to visit.

"Dry tomb" landfills isolate garbage so that leachate flows along plastic barriers and can be collected, while perforated pipes and pumps collect methane and other gases. A different method is the "wet tomb," or bioreactor, method, which enhances decomposition and the subsequent production of methane and leachate. The leachate can be injected back into the garbage to further accelerate decomposition. A downside is that there is more leachate available to leak. An upside is that the increased rate of decomposition frees up more space faster. Leachate is a toxic stew of household products and decomposition products. It is often treated and then released back into the environment. Landfills will leach toxins for vast periods of time – old Roman waste pits still leach toxins – and landfill liners will leak eventually. Since leachate contains nitrogen and phosphorus, its presence in surface water can be beneficial for plants and sea creatures that take up these nutrients. The ecologist did a restoration of the shore around part of the landfill to further filter out the leachate with grasslands fed by a surface water capture system. They end up getting chased off by a "sanitation cop" guarding the landfill just for paddling too close.

Next she visits IESI’s landfill in Bethlehem, PA. The manager kinda gives her the runaround showing her the recycling area but saying he was too busy to show her the landfill. I remember in the 90’s when we used to drive our own trash to the local landfill in West Virginia once a week when they allowed it – tipping it by throwing bags from the back of the truck right into the readied pits. You could see all the other pits in various stages of covering and reclamation. She manages to sneak under a fence into the landfill proper but can’t get anywhere near the activity. The manager was quite uncooperative. 

Next is mention of organized crime in the commercial waste hauling business, particularly in New York City. Twenty-three hauling companies were part of a 1995 indictment. This was part of Giuliani's fairly successful push against organized crime. These corrupt companies were also severely overcharging for commercial trash removal. After the mob was run out, the void was filled through consolidation by the larger companies.

A 1998 study noted that escaping landfill gases contributed to significant increases in bladder cancer and leukemia in those who lived very close to landfills. Other studies noted increases in birth defects. Although these studies weren't definitive in terms of cause and effect, they were suggestive.
Modern landfills have better containment and collection systems but also cost more to construct and maintain. Layers of gravel, geotextile fabrics, sewage sludge to accelerate decomposition, compacted sand, thick high-density polyethylene geomembranes, and compacted impermeable clay make up the cells.

Next she visits an incinerator, a modern waste-to-energy (WTE) facility. Such facilities burned about 13% of U.S. trash at the time of publication. She watches as trash is tipped and the big pieces of metal are removed as much as possible. Then conveyor belts with magnets take away more metal. The facilities are outfitted with state-of-the-art pollution control devices – they have to be, since the smoke tends to be especially toxic. Even so, the air emissions are still considered quite toxic and are not welcomed by many. While the scrubbers take care of the fly ash, there is also the bottom ash, which is full of toxic heavy metals. This is put in standard landfills after treatment but will likely increase overall leachate toxicity.

She visits Nick Themelis at Columbia University's Earth Engineering Center. He touts the benefits of WTE over recycling, composting, and landfilling, at least from an engineering perspective. There is much debate regarding the relative benefits of these four methods of dealing with waste in terms of energy use, potential for environmental harm, cost, and effectiveness. WTE plant costs are high due to siting and pollution control requirements. In the past they seem to have been preferentially located in low-income areas, thus feeding the arguments of the environmental justice movement.
She visits a landfill in New Jersey with an environmental consultant. There was a large building with a tipping floor where municipal trash was compressed into small cubes. Outside she observes the cells being filled with these cubes of trash. 

Next she attends Robin Nagle's urban anthropology class at New York University. Sanitation workers were speaking, including the director of Fresh Kills. Nagle also worked as a sanitation worker in order to ethnographically document sanitation workers. The Fresh Kills director, Diggins, noted that Fresh Kills had finally gotten up to code just before it was closed. The leachate collection and methane collection and flaring systems had managed to vastly improve the previously bad odors. She and Nagle get to visit Fresh Kills with Chief Diggins. She notes that there is only a faint smell of gas and that most areas are reclaimed quite well on the surface. Due to the extensive mounds of well-covered trash reaching up hundreds of feet, she notes the majestic view from the top. She gets to see a leachate seep that is particularly stinky. She also gets a private lecture on leachate from a landfill engineer at DSNY.

Compacted, bagged trash in dry tomb landfills undergoes something more like mummification than decomposition, as 70-90 year old material may not decompose at all, particularly plastics. Wet tomb landfilling takes advantage of anaerobic methanogenic microbes to radically break down the organics, but it also increases methane and other gas emissions. The Fresh Kills landfill isn't lined and sits in swampy ground with widespread fluid movement compared to more modern lined landfills, so its rate of decomposition is much higher. The methane collected from Fresh Kills runs a gas power plant that can power about 14,000 homes. The WTE plant she visited was said to be able to power about 50,000 homes. Landfills without methane collection systems are susceptible to large vented releases of methane and underground fires that can be hard to contain. She makes an error regarding the amount of methane produced from Fresh Kills (15 BCF/year, or 40+ MMCF/day) relative to world methane production, but it is quite a bit. Landfills at the time were the largest source of anthropogenic methane – they are still close – and the top three U.S. sources, landfills, agriculture (enteric fermentation from cows and manure management), and leakage from oil and gas systems, are all nearly the same amount – about a third each. Globally, rice paddy farming contributes a significant amount. Raw landfill gas also contains many other toxic gases as well as CO2, so separation, treating, and flaring are also necessary. Landfills, WTE incinerators, and farmers are eligible for government subsidies but oil and gas systems are not. She visits the leachate treatment plant at Fresh Kills where ammonia is treated and released and suspended solids are precipitated out. The treatment plant also produces a significant amount of sludge, which is dewatered, mixed with lime, and trucked to another landfill for isolated storage.
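For what it's worth, the 15 BCF/year figure converts correctly to the daily rate (my arithmetic, not the book's): 15 BCF/year × 1,000 MMCF per BCF ÷ 365 days/year ≈ 41 MMCF/day.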

Next she recounts her own home composting project and the subject of composting in general. Her own project was not very successful, but she gives some stats of the time (~2005) for composting in the U.S. for organics, food waste, and yard waste. Composting requires significant oxygen for aerobic bacteria to decompose the organic matter into organic acids. When sulfurous compounds form later in the process the trash begins to stink – humans are quite sensitive, down to one part per billion, to the odors of sulfur compounds. Odor researchers have also found that one's cultural history affects whether one views smells as disgusting, dangerous, or acceptable. The EPA noted that 67% of U.S. waste could be composted. I have been doing it for years and it is not too much work. If one has animals around, both domesticated and wild, they will help with getting rid of food waste as well. She meets with an urban composting activist and they discuss the economics of composting and of anaerobic digesters (ADs), which can produce compost and biogas. ADs can decrease overall greenhouse gas emissions and convert carbon into gas and gas into energy – they are big in Europe and currently taking hold in the U.S. to help curb methane emissions, mainly from farms and food waste. She visits the compost farm at the Lower East Side Ecology Center. It makes 750 lbs of compost from 3000 lbs of raw materials per week. The compost is blended with vermiculite, perlite, peat moss, greensand, and black rock phosphate to make New York City Paydirt, which was going for $1 per pound. The leachate from the composting operation goes down the drain, which suggests highly concentrated corrosive leachate is being dumped, possibly directly into the East River. The Ecology Center was operating at a loss but made up the difference with government grants. The director also favored AD, which has a fair shot of being economic even without subsidies or with small ones – the only way composting can be economic, many would say. She mentions that food waste disposers, which grind up food waste and put it down the drain, can end up harming aquatic life with the increased nitrogen loads. However, they can also add nutrients for sewage-consuming microbes. Comparing impacts shows there are trade-offs: more food waste down the drain means fewer diesel-consuming trucks carrying the heavy food waste away. Food waste disposers also use water, and some hardcore environmental groups think water should only be used for drinking and washing, not for transporting sewage or food waste. Food waste collection has been implemented in some places but it is hard not to lose money in such ops. On a larger scale, food waste from restaurants and grocery stores, yard waste, agricultural waste, and even manure can be collected and fed into well-sited commercial ADs.

Recycling finally took off in the late 1980's in the U.S., with newspaper and corrugated cardboard leading the charge. She travels with a DSNY recycling pick-up truck. She visits a paper recycling plant and goes through the process of cleaning, heating, dewatering, vacuuming, etc. Paper-recycling mills actually produce more short-fiber waste than virgin mills. In 1988, 30% of U.S. paper was recycled; by 2002 it had climbed above 50%. Clean white paper can be recycled about four times before its short-fiber content is too high. Paper recycling has slowed deforestation. Going paperless in our increasingly on-line world has likely slowed it as well.

Next is metal recycling. She visits the Hugo Neu Corporation, one of the largest metal recycling companies in the world. Neu started in New York and now exports bulk scrap steel, much of it to China where it is further shredded and processed. They also deal with car metal. She talks with Wendy Neu about their scrapyards and environmental liabilities. Steel, aluminum, copper, brass, and most metal recycling are very useful and conserve mineral resources. Royte visits their Jersey City scrapyard and sees how the metals are sorted and separated.

Next she explores household toxic waste and recycling. Battery recycling handles significant amounts of nickel-cadmium, nickel-metal hydride, lithium-ion, and lead batteries. According to the Rechargeable Battery Recycling Corporation, much of it goes to a facility in Pittsburgh, which Royte visits next. Here the components are extracted and sent to supply other local industries: glass makers and battery and stainless steel manufacturers. Such metal recycling is still polluting but far less so than mining new raw materials. Along with lead batteries, solvents and mercury are banned from landfills, though it is hard to keep them out of household trash. Royte brings her own household hazardous waste to a drop-off site. She notes that managers are quite guarded about where all this highly concentrated toxic waste goes – but it usually goes to industries after sorting and processing. Electronic waste is a huge issue for recyclers. These devices also contain toxic waste, which is not good for landfills or incinerators. Big operations use detailed magnet sorting and weight sorting for the e-waste. They note that glass is more a liability than a commodity – they pay smelters to take it, mainly for the lead. Lead, copper, and zinc smelting plants have been linked to local pollution and lead poisoning. Much of our e-waste is shipped to China, India, and Pakistan. Much of the e-waste extraction there is chemical extraction and can be associated with safety and health risks to workers and environmental damage. The alternative to exporting e-waste is sweatshops, as e-waste recycling is not profitable. Some have advocated and implemented "extended producer responsibility" (EPR), where dead products can be sent back or taken back to producers, but that involves work, time, and inconvenience. In Switzerland the bulk of e-waste is brought back to retailers, but the cost of recycling is added into the purchase price. Recycling is not and won't become profitable, and yet it is rarely subsidized like other feel-good green projects.

Plastic is the next subject. She visits American Ecoboard, a company that re-melts recycled plastic and reinforces it into plastic composite resin lumber. She visits a municipal recycling facility (MRF) run by Allied Waste. She gets a primer in the classification of consumer plastics, which was developed to assess recyclability: HDPE and LDPE (high- and low-density polyethylene), PVC, and others. She discusses bottle bills (deposits refunded when bottles are returned) and their effects on recycling (recyclers often don't like them because they get less weight in glass and plastic). She discusses the Keep America Beautiful anti-litter campaign (created by beverage companies and known as a prime example of corporate greenwash) and its ambivalence. Evidence suggests that bottle bills do increase recycling rates significantly, or at least had for a while. Only a small percentage of glass bottles are refilled. Only in some Scandinavian countries are PET (polyethylene terephthalate) plastic bottles refilled, and only because the PET bottles there are thicker and stronger. Most recycled glass and plastic is re-melted. Other destinations for recycled plastic are sleeping bag fill, carpets, products like the Ecoboard, and fleece products like jackets and blankets. In 2003, 35% of recycled PET plastic was exported, mostly to China for cheaper processing there. She mentions a Greenpeace report showing that Pepsi and other companies were sending low-grade recycled plastic to India, where it was processed in ways that would be considered unsafe here, and much of it ended up in unlined landfills there. One bottom line is that virgin plastic is still far cheaper and easier to deal with than recycled plastic. Plastic production is associated with toxins such as trichloroethane, acetone, methylene chloride, methyl ethyl ketone, styrene, toluene, sulfur oxides, nitrogen oxides, methanol, ethylene oxide, and volatile organic compounds. Benzene and vinyl chloride are inputs. Plastic is also known for being very slow to biodegrade, and more recently the issue of plastic micro-beads entering the environment en masse is seen as a potentially serious problem if not abated. Reduction of packaging is one partial solution to the overuse of plastic. Plastic bag bans have met with only marginal success but are currently popular in some areas. Curbside recycling has been quite successful in some areas and only marginally so in others. I live rural and deliver my recyclables to a local recycling center.

Next the exciting world of poop is explored, beginning with the once-symbolic issue of disposable diapers. One study suggested that disposable diapers made up as much as 2.1% by weight of landfill deposits. Reusable cloth diapers would increase both water use and the waste sent into the sewage system. About 20% of sewage sludge is used by landfills to enhance biodegradation. Next she visits a manager of the DEP Division of Wastewater Management to track the journey of her shit through the pipes. Before the 1980's her "effluent" would have made it into the Gowanus Canal. One problem for sewage treatment is commercial food establishments without adequate grease traps. I once had a job cleaning these grease trap systems, which included scraping disgusting grease muck and high-pressure spraying from roof vents with caustic soda imbued water. Nowadays used oil can be filtered and used as biodiesel, which is another feel-good green thing – fine in itself, but there is not enough of it to make a significant impact. Biodiesel is cleaner than regular diesel but still makes significant pollutants when burned. In times of high water runoff the sewage system gets filled and some effluent (though more dilute) does end up in the harbor.

“As recently as the seventies, New York was still discharging 450 million gallons of raw sewage a day into the waterways surrounding the five boroughs. Until 1986, the entire west side of Manhattan, north of Canal Street, discharged its sewage [untreated] into the Hudson.” 

In a big rain storm as much as 40% of the raw sewage stream is diverted into local waterways, and this is apparently the case in many large cities. Storm water runoff also picks up toxins from the ground, from incomplete combustion, from illegally dumped waste and grease, and from commercial establishments, and adds them to the wastewater system. They followed the gulls, drawn by faint H2S, to the Owl's Head Treatment Plant, where homegrown methane runs engines and 120 million gallons of raw sewage are treated per day. Here solids and liquids are separated and pumped through settling tanks, "scum concentrators," and aeration tanks. The manager described the sewage treatment plant as a digester that concentrates and accelerates decomposition. The solids, as sludge, are filtered and used to assist landfill biodegradation and may be spread on farm fields, preferably after further processing. The sludge contains toxins. The effluent certainly contains varying levels of toxins since many toxic substances go down drains. This treatment plant also had a large digester, and after being digested by anaerobic bacteria the sludge was shipped by barge to a dewatering plant and dried into a raw material product.

"For decades, the DEP dumped 1200 tons of sewage a day from a tanker parked twelve miles off the city's shore." The EPA declared those waters 'dead' in 1985: oxygen was depleted and shellfish were contaminated with bacteria and heavy metals. Boston as well as New York had practiced ocean dumping of sewage. It was outlawed by Congress in 1988 with the Ocean Dumping Reform Act, which went into full effect in 1991. So now the solids had to be further processed and further treated to reduce pathogens. Acceptable levels of lead, arsenic, mercury, and chromium were raised by the EPA, say some, to accommodate sewage so it could be reclassified. It was renamed 'biosolids.' Several new local sewage-based fertilizer products came from various cities; at least one tested very high in cadmium. She gives stats for dried sewage sludge at the time (~2005): 54% relabeled "biosolids," 28% buried in landfills, and 17% incinerated. After ocean dumping was banned, a company (Merco) landed a contract to spread this New York waste on ranch land in western states, but several states banned it until a donation was made to Texas Tech to study sewage and dump it at Sierra Blanca, a small town in southwest Texas, where an 81,000-acre site was used for the sewage farm. The smell was horrendous according to many, and the waste tested very high in fecal coliform bacteria. I think there was some sort of quasi-political scandal as well regarding the site selection. Dry sewage pellets also went to citrus groves in Florida and corn and soybean farms in Ohio. Processing facilities often have bad smells even though the manager there in NYC was an expert in odors and how to neutralize them – one method is a regenerative thermal oxidizer, which raises temperatures to over 1600 deg F. Even so, people still complain about the smells and are affected by them. NYC sewage sludge contains magnesium, cadmium, copper, zinc, iron, mercury, selenium, and lead (leached from pipes). Some of those are toxins, but several are plant nutrients at typical concentrations, as are nitrogen and phosphorus. It is also high in the toxin dioxin – sludge is the 2nd highest source after backyard trash burning. There are allowable levels for Class A biosolids (300 ppm), and various sewage end-products are allowed or not allowed to be applied for food-producing agriculture. EPA's sewage sludge regulations have been criticized. I am unaware of any developments after publication of this book. There is anecdotal evidence associating Class B biosolids with all kinds of medical problems, but such 'evidence' is common with any industrial contact, so it is hard to know which cases, if any, are legitimate. However, there are some definite cases of bacterial poisoning of cattle and staph infections, so sludge exposure poisoning is a real issue. Since sludge is not always a consistent product, it stands to reason that some batches could be considerably more toxic than others. The EPA denied a petition to ban land application of biosolids in 2004. Sewage-treating marshes were in vogue for hip small cities, but such wetlands are impractical in most places as well as a liability.

Next she visits a homemade grey water treatment zone and delves into waterless composting toilets. I looked into the toilets years ago but at 30 times the cost of a toilet plus maintenance requirements and potential odor issues it seemed nonsensical. The do-it-yourselfer guy she visited made his ‘humanure’ into compost aided by worms and used it on ornamental plants. Composting of human excrement is often recommended over new sewage systems in developing countries due to having less overall environmental impact if done correctly. She visits an art exhibit with her young daughter which shows and replicates the human digestive system through Plexiglas. They sit near the anal sphincter!
Next she examines consumerism and the possibility of decreasing our waste stream. Less packaging has been one trend to address waste volume and weight. Curbside recycling is another. She examines obsolescence, planned and not. There is functional obsolescence (like say faxes) and style obsolescence. Decreasing the amount of waste per product life can be achieved through life-cycle analyses, say the authors of a study culminating in the book, Cradle to Cradle (I may read that one). Consumption reduction is obviously the most impactful solution but with increasing population and increasing development of developing countries that is not likely to happen overall. “Corporate citizenship,” or rather perceptions of it among consumers, influences buying decisions – unless the cost variations are too extreme. She goes through the effect of holiday consumerism and even the curbside pickup and drop-off of spent Christmas trees. She goes on a tree collecting run with DSNY. Royte shares her own feelings about being green and how it makes her feel more useful. I think that’s fine as long as it doesn’t devolve into pointing fingers at others too much. 

She visits a 2-day roundtable about recycling held by the Citywide Recycling Advisory Board and others. Industry and environmentalists were present. Producer responsibility was brought up, and other recycling issues were talked about and debated. She meets the director of San Francisco's recycling system and flies out for a visit to see the new $38 million MRF – lured by the possibility of learning more about the 'zero waste' concept. Of course, she is not so naïve as to take zero waste literally; it is more of a guiding principle. The goal in San Francisco was to maximize recycled content and minimize landfilled content. I think maybe zero waste is a 'cart before the horse' thing – that less waste should precede zero waste, just as increased renewable energy should precede 100% renewable energy and lower emissions should precede zero emissions. She explores the system with a manager from the waste management company Norcal. She spends (a few months?) exploring San Francisco's garbage system and compares it to New York's: San Fran is a pay-as-you-throw system separated into black = rubbish, blue = recyclable, and green = organic. Pay-as-you-throw disproportionately burdens lower-income residents. It also increases deliberate dumping. Recycle bins encourage theft by bottle redeemers. Collecting food from restaurants increased San Fran's diversion rate by 15%, and this source stays isolated to be composted at a landfill and used to produce crops that are in turn bought by the same restaurants, thus "closing the loop," as the MRF manager noted. Anti-recyclers have argued that materials are getting cheaper while labor is getting more expensive, so recycling is not worth the bother. Better automation and industrial technology improvements could make recycling more economic, and a desire to recycle among the populace should keep streams of it coming through. Mandatory recycling will also help increase the diversion rate. Recycling costs and mandates on businesses incentivize packaging reductions. The MRF was a vast complex with 87 conveyor belts moving recyclables. She wonders if the line workers (as at the Hugo Neu Corporation) were happy with their jobs and their levels of possible exposure – to what I am not sure, except maybe toxic dust. Recyclable sorting can vary depending on what materials have the most demand in given areas. PET plastic was the most valuable at the MRF, but much of it was exported rather than used locally. Norcal is paid for garbage collection (not the city, as with DSNY). They also pay to tip at the landfill, so their recycling facilities decrease tipping fees as well as increase revenue by selling recyclables. Apparently it is still cheaper to make glass from old glass rather than from virgin silica (less heat is required), so recycled glass still has valuable markets. Next they visit Norcal's subsidiary organics composting facility, where mostly commercial food waste was shredded. These were apparently diesel-powered industrial composters with high greenhouse gas and VOC emissions – not anaerobic digesters. The final compost, or organic fertilizer, was sold to wineries, organic farms, and landscapers. She also visits a famed recycling center in Berkeley – Urban Ore, an "urban junkyard." This is a large and well-vetted re-use/re-purpose facility which favors ingenious and creative ways to divert waste into useful products and also just sorts and holds on to used stuff until new owners and uses are found for it.

In the final chapter Royte contemplates the role of an "ecological citizen." Garbage researchers have called sorting garbage a Zen-like societal experience. I can attest to once learning the Zen of dumpster diving! Her own new garbage logging, composting, and recycling habits changed her perspective about waste. It also changed her habits regarding trash placement so as to be kinder to those who take it away. She mentions conversations with a PhD sociology candidate who thinks recycling is pointless and diverts attention from the real problem – consumers and capitalists. Statistics say that municipal solid waste is just 2% of the nation's waste, that the rest is commercial, from agriculture, mining, and industry, and that much of it (~75%) is in the category of non-hazardous industrial waste. Much of all this waste is really the waste it took to create the consumer products that produce the municipal solid waste. So basically our waste would increase by 50 times if we included the cradle-to-grave waste stream of the products we buy and use. Such a 'multiplier effect' suggests that reducing one's waste stream by any amount reduces the full life-cycle waste by much more. However, that is not always the case, since some of the things we buy (like, say, gasoline) produce no waste that we can see – of course we can buy a more energy-efficient, low-emissions vehicle to decrease carbon emissions and polluting waste. All kinds of technologies are currently being explored for processing trash. Most are energy intensive, but some generate and run on their own energy, like anaerobic digesters that often run on the biogas they produce. She participates in a beach cleanup, and even though such cleanups merely slow down the trash stream and clear it temporarily, it is still dutiful for an ecological citizen to do it – which reminds me I am a bit behind on my road cleanup. It is volunteering, community service, and we should all do some.

Garbage Land is a thoughtful book. The narrative is often entertaining, and shows the personal demeanors, character, and biases of the people Royte meets to discuss and explore waste. It is rife with data, speculations, still current debates, and the realities of human waste.  
     



    




Friday, August 12, 2016

The Grand Design - by Stephen Hawking and Leonard Mlodinow



Book Review: The Grand Design – by Stephen Hawking and Leonard Mlodinow (Bantam Books, 2010)

This book was an absolute delight to read, and I never thought I would say that about a physics book. I read Hawking's A Brief History of Time a few years back and found it rather unsatisfying. This one, however, was fascinating, fairly easy to follow, well-illustrated, and even peppered with good-natured nerdy humor and comics. It also delves into the philosophical aspects of physics and the nature of reality. The book doesn't always seem to make sense, but it does offer a good non-technical explanation of the current state of physics along with the perhaps more important qualitative aspects of what it all means.
  
The authors point out that science increasingly informs and influences philosophy as new discoveries are made. Quantum physics shows that reality at the subatomic quantum level is fundamentally different and yet quite predictable. Quantum physics and classical physics are based on very different conceptions of physical reality. Richard Feynman's idea of multiple histories is one of a number of quantum physics ideas that seem to defy common sense. The authors here adopt an approach they call 'model-dependent realism.' It is based on the idea that we humans interpret sensory inputs by making a model of the world to explain reality, and it may be the case that more than one model can explain reality. The history of science is one in which better and better theories, or models, have come about to explain reality. The authors think we now have a candidate for the "ultimate theory of everything," which they call M-theory. They describe M-theory as a family of theories, more like a map than a single theory. M-theory predicts that many universes were created out of nothing and that in each there are multiple histories. Our very existence selects only those universe(s) that are compatible with it.

A short history of science is given, from Thales and other pre-Socratic philosophers through classical Greece. Laws and relationships were first tabulated there, with the string-length harmonics attributed to Pythagoras and the principles of Archimedes regarding buoyancy. Anaximander, circa 610-546 BC, reasoned that humans 'evolved' from other animals, since the first human infant would have been helpless and unable to survive. Empedocles discovered properties of air and fluids. Democritus divided matter into smaller and smaller pieces but postulated a limit to the division. Have we found it? Maybe. Maybe not. His idea of the fundamental units, or atoms, crashing into one another is a precursor to the law of inertia. Aristarchus (circa 310-230 BC) reasoned that the sun was much bigger than the earth by geometrically measuring the shadow cast by the earth on the moon during eclipses. To him are also attributed the first heliocentric model and the suspicion that stars were distant suns. Based on these ideas, and perhaps others, he was said to have thought that humans were not so special in the universe, something modern science seems to suggest often. Such ideas were often not even popular in their time, as there were many rival ideas. Was this idea that we are not special new? Probably. Of course, many ancient explanations of nature were way off the mark and entwined with mythology. Aristotle was more influenced by direct observation, but he still incorporated much nonsense. The authors go on through Kepler and his scientific (and religious) explanations of the movements of the heavenly bodies. The Church was still very powerful in Europe when, in 1277, the pope declared the idea that nature followed laws to be heretical. Indian and Arab scholars developed mathematics. Kepler and Galileo are credited with the resurgence of the idea of laws of nature, and Descartes believed that all physical phenomena could be explained by collisions of moving masses. His three laws became the precursor to Newton's laws of motion. All of these scientists also sought to reconcile their observations with God and religious dogma. After all, it was God who made the laws of nature, right?

“Today most scientists would say that a law of nature is a rule that is based upon an observed regularity and provides predictions that go beyond the immediate situations upon which it is based.”
Today, laws of nature are typically phrased in mathematics and generally hold universally, or at least under a stipulated set of conditions. Newton's laws hold until the objects in question approach the speed of light – then they must be modified. An unexplainable exception to a natural law might be considered a miracle or simply an unknown. Pierre-Simon, marquis de Laplace (1749-1827) is credited with postulating scientific determinism: "Given the state of the universe at one time, a complex set of laws fully determines both the future and the past." Others believe humans have free will, so that determinism is only a feature of the non-human universe. The authors seem to favor the view that we are basically biological machines subject wholly to the laws of nature and that free will is basically an illusion. Even if we are governed by determinism, there is such complexity that outcomes are difficult and often impossible to predict. Science and physics rely on 'effective theories' that successfully predict outcomes of experiments while including only small bits of the total system. Since we cannot document and describe all the details that lead to our behaviors, or even predict what those behaviors will be, we rely on the effective theory of free will, say the authors. They also note that most scientists would say that the laws of nature are a "mathematical reflection of an external reality that exists independent of the observer that sees it." That is, the authors say, if an objective reality even really exists.

Next, reality is explored. Ptolemy's earth-centered model of physical reality was the most accepted because it seemed sensible and correct and could be partially explained scientifically. This model ruled in Western belief for 1400 years. Copernicus revealed a heliocentric model 1700 years after Aristarchus. The authors note that in some sense both models could be considered real, although the heliocentric model has much more evidence and mathematical support. They explore the idea that we are living in some sort of simulation, like in the Matrix movies. They note that we, like beings in a simulated world, cannot see our universe from outside of it. Thus, they claim, all realities are 'model-dependent': "There is no picture- or theory-independent concept of reality." Thus, they adopt the idea of 'model-dependent realism.' Realism refers to a situation where all observers studying a system will measure the same properties of it. Quantum physics makes any realism a difficult position to defend, since on the quantum level the observer alters the observed. One theory – the holographic principle – suggests that our 4-dimensional universe is a 'shadow' on the boundary of a larger 5-dimensional space-time. The analogy is that we are like a goldfish in a curved bowl experiencing a distorted view of the reality outside of it. There are also 'anti-realists' who say that if something can't be observed it doesn't exist. Such was said of atoms when they were proposed, but the evidence for their existence has grown vastly since then. Since that evidence is so strong, we are compelled to act as if atoms exist without actually seeing them (an idea from philosopher David Hume). That is the situation with the subatomic world. In model-dependent realism the arguments over whether a model is real or not are put aside and only whether the model agrees with observation is considered. Thus, two or more models could well be correct in agreeing with observation. All knowledge really has a subjective component since we observe with our senses, which are processed by our brain, which builds models in the form of mental pictures. Regarding the subatomic world, we build models based on mathematics and on what we do know about larger units of matter through observation. We can't see individual particles such as electrons, but we can see the effects they produce. Quarks are a model to explain the properties of protons and neutrons in the nucleus of an atom, but we will never see one because free quarks theoretically cannot exist in nature. On this basis, whether quarks actually exist has been debated, since they are theoretically forever unobservable. However, the quark model has led to many correct predictions of experimental results, and thus it agrees with our observations of how subnuclear particles behave.

There are models related to the beginning of the universe as the Big Bang and ones that go back to before the Big Bang. The Big Bang beginning model is favored simply because it has observable consequences for the present and the latter does not (as of yet anyway). The authors note that a model is a good fit if it:

1. "Is elegant
2. Contains few arbitrary or adjustable elements
3. Agrees with and explains all existing observations
4. Makes detailed predictions about future observations that can disprove or falsify the model if they are not borne out."

The idea of elegance is related to having few adjustable elements, since a theory with many 'fudge factors' is not very elegant. According to many, the "standard model" itself is inelegant, having too many adjustable parameters fixed to match observations, although it has still successfully predicted results and the existence of new particles. When too many factors are added to try to rescue a theory, it is deemed inelegant. Sometimes a new model may be required. An example is given of the old model of a static universe and Edwin Hubble's research on the light emitted by galaxies, which suggested that the universe is actually expanding and not static. Hubble had expected to observe a static universe with as many galaxies moving toward us as away from us, but instead he observed nearly all galaxies moving away from us – and the farther away they were, the faster they receded. It is not just the transformation from Newtonian physics to Einsteinian and quantum physics that represents a paradigm shift; all new theories replace old ones in similar fashion with varying levels of fundamental shift. Einstein's explanation of the photoelectric effect showed that light behaves as both a wave and a particle depending on how it is observed.
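Hubble's proportionality between distance and recession speed is now written as Hubble's law (standard cosmology, not a quote from this book): v ≈ H0 × d, where v is a galaxy's recession velocity, d its distance, and H0 the Hubble constant, currently estimated at roughly 70 km/s per megaparsec.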

The authors note that thus far there is no single mathematical model or theory that can describe every aspect of the universe. The network of theories known as M-theory can describe phenomena within a certain range. While this does not satisfy the search for a single unified theory it is acceptable in terms of model-dependent realism.

Next the "alternative histories" quantum theory is explored, where the idea of quantum superposition suggests that every possible version of the universe exists simultaneously. Newtonian laws are seen as an approximation of the way macroscopic objects composed of quantum components behave. The wave/particle duality of light and the uncertainty principle are two important aspects of quantum theory. The wave/particle duality was discovered in the famous double-slit experiments. The uncertainty principle was resolved mathematically by Werner Heisenberg. Basically it says that the more precisely you measure a particle's speed, the less precisely you can measure its position, and vice versa. Because the number that sets the scale of the uncertainty principle, Planck's constant, is extremely small, it only noticeably affects particles on the subatomic level. In our big Newtonian world speed and position can be measured just fine, with the uncertainty too small for us to notice.
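In its standard textbook form (my addition, not a quote from the book), the principle reads Δx · Δp ≥ ħ/2, where Δx and Δp are the uncertainties in position and momentum (momentum being mass times velocity) and ħ ≈ 1.05 × 10⁻³⁴ J·s. For a 1 kg object located to within a millionth of a meter, that works out to a velocity uncertainty of only about 5 × 10⁻²⁹ m/s – which is why we never notice it.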

“Quantum physics might seem to undermine the idea that nature is governed by laws, but that is not the case. Instead it leads us to accept a new form of determinism: Given the state of a system at some time, the laws of nature determine the probabilities of various futures and pasts rather than determining the future and past with certainty.”

Unlike everyday probabilities, quantum probabilities reflect a basic randomness in nature. Feynman noted that no one really understands quantum physics. Even so, it agrees with observation and has been well tested. Feynman developed the alternative histories formulation of quantum physics. His mathematical expression – the Feynman sum over histories – assumes that a particle traveling from point A to point B takes all possible paths simultaneously. The shortest path between two points is a straight line in the classical physics model. In the sum over histories model it is as well, since the contributions of the nearly straight paths reinforce one another while those of the diverging paths cancel out. Thus the reality of classical physics "seems" to be correct, especially with large objects.
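Schematically (standard quantum mechanics notation, my summary rather than the book's), the amplitude for going from A to B is a sum over every conceivable path, each contributing a phase: amplitude(A → B) = Σ over paths of e^(iS[path]/ħ), where S is the action of that path. Paths near the classical straight-line path have nearly equal phases and add up, while wildly wandering paths have rapidly varying phases that cancel, which is why large objects appear to follow a single classical history.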

‘Observation alters the observed’ is the gist of the uncertainty principle. Knowing one thing in the quantum world involves unknowing others. The concept of the past in quantum theory is that it is indefinite, based on possibilities, and that it has no single history. Thus, present observations affect the past. Physicist John Wheeler demonstrated this in his “delayed-choice” version of the double-slit experiment. 

In a discussion of the laws of nature leading to a theory of everything, the authors go through the development of theories concerning the four fundamental forces: 1) the gravitational force; 2) the electromagnetic force; 3) the strong nuclear force; and 4) the weak nuclear force. Especially well explained is the electromagnetic force, which is also what characterizes light on the EM spectrum. Einstein's conclusion that the speed of light appears the same to all uniformly moving observers required a new interpretation of space and time. Thus it was discovered that space and time are intertwined, with time as a fourth dimension, and 'space-time' was born. That space-time is curved became the basis of Einstein's general theory of relativity, which is a major upgrade to Newton's theory of gravity. The curvature is non-Euclidean and resembles the way the shortest air route between two points on the Earth follows a 'great circle.' Objects move on 'geodesics' in a similar fashion rather than along straight lines, since space-time itself is curved. The authors note that both Maxwell's theory of electromagnetism and Einstein's theory of gravity are still classical theories, or models in which the universe is assumed to have a single history. These theories work in our everyday world – Einstein's general theory of relativity is used to keep GPS systems and aircraft on track by accounting for space-time curvature. However, in order to study the subatomic world we need quantum versions of these theories, which are based not on single histories but on alternate ones of varying probability. Feynman and others developed the first one, based on electromagnetism, called quantum electrodynamics, or QED. This involves interactions of particles called bosons (particles of light, or photons, are an example) and fermions (electrons and quarks are examples). Feynman developed a graphical way of representing the sum over histories, now known as Feynman diagrams. However, there is a problem with this approach, as the Feynman diagrams would give an infinite mass and charge to electrons, which is not the case when measured. To adjust for this there is a process called renormalization, where 'fudge factors' are used that can be considered mathematically dubious. Renormalization is an essential ingredient in QED, and QED has been used successfully for subatomic prediction.

The authors note that the division into the four fundamental forces may be artificial and that physicists have long sought to unify the four classes into a single law, or theory of everything. The electromagnetic and weak nuclear forces have been unified into the electro-weak force, which does not require renormalization and has successfully predicted particles later verified at CERN. The strong force can be renormalized on its own in a theory called quantum chromodynamics, or QCD. Here the quarks that make up protons, neutrons, and other particles have a property called 'color.' QCD also makes use of a property called asymptotic freedom, whereby the strong forces between quarks are small when they are close together and large when they are far apart – as if they were joined by rubber bands. Grand unified theories (GUTs) were developed to unify the electromagnetic, weak, and strong forces but were struck a hard blow in 2009 when experiments apparently showed the proton lifetime to be greater than about 10 to the 34th power years, meaning the proton decay that GUTs predict has not been observed. In the earlier adopted 'standard model' the electro-weak forces are unified but the strong force, as in QCD, is considered separate. The standard model is yet unsatisfactory because, in addition to not unifying the strong and electro-weak forces, it also fails to incorporate the gravitational force. Integrating the gravitational force is more difficult due to the uncertainty principle, which predicts that space cannot be empty, only at a state of minimum energy called the vacuum, which is subject to what are called vacuum fluctuations, or 'quantum jitters.' Now I am confused! While renormalization can remove infinities elsewhere, it cannot remove the infinities that arise in a quantum theory of gravity. The theory of supergravity, based on supersymmetry, could be an alternative explanation that relies on force and matter being two facets of the same thing. That means that each matter particle such as a quark has a corresponding partner particle that is a force particle. Although many physicists think that supergravity can unify gravity with the other forces, it is nearly impossible to verify, and the partner particles have not even been observed.

The idea of supersymmetry developed from early formulations of string theory. The extra dimensions predicted by string theory are considered to be so 'small' as to be undetectable, analogous to a very thin straw that looks like a one-dimensional line from a distance because its curved surface is too small to see. Different string theories, which curl up the extra dimensions in different ways, as well as supergravity, are suspected to be different approximations of a more fundamental theory – called M-theory. It is unknown whether M-theory exists as a single formulation or only as a network of theories that all agree with observation. M-theory predicts 11 dimensions (10 of space and one of time) and allows for different universes with different apparent laws (whatever that really means!). The 4 dimensions are the ones that are most applicable to us, with the other 7 curled up so much as to be rendered insignificant.

Hubble's evidence for an expanding universe of course strongly implied that the (current) universe had a beginning, known as the big bang. The famous analogy is that of the surface of an expanding balloon. Due to gravitational forces, things within the expanding universe do not themselves expand but keep their size. Evidence for the big bang includes the cosmic microwave background radiation (CMBR), which hints at a hot early universe, and the abundance of helium. At the beginning there was a singularity, where the temperature, density, and curvature of the universe were all infinite. General relativity does not work for the very early period of the universe's existence. The early expansion of the universe, called inflation, proceeded far faster than the speed of light. The idea is that if you go far enough back in time the universe was small – atomic-scale small – so that the relativity theory that works in the macroscopic world breaks down and the microscopic world is governed by quantum theory. Thus "creation," or the big bang, had to be a quantum event. In that early quantum-sized universe there were four dimensions of space and none (yet?!) of time; time behaved as another dimension of space due to warpage. This somewhat resolves the problem of whether time had a beginning, similar to the way a round earth resolves the question of whether the seemingly flat earth has edges. This "no boundary condition" of space-time at the beginning of the universe sort of resolves the idea of whether the universe had a beginning or not, they say – but I don't fully understand it. It also removes the idea that the universe had to be set in motion by an entity or force, say God. A uniformly expanding universe has the greatest probability in the sum over histories, but other universes in the multiverse would be more non-uniform/irregular/asymmetrical. Our slightly non-uniform universe (as inferred from slight temperature variations in the CMBR) is lucky for us, they say, since it allows for heterogeneity of matter.

“We are the product of quantum fluctuations in the very early universe.”

The sum of alternative histories of the universe (which themselves are governed by probabilities) leads to the appearance of being dominated by a single history that we tend to take as “the” single history. 

“We seem to be at a critical point in the history of science, in which we must alter our conception of goals and of what makes a physical theory acceptable. It appears that the fundamental numbers, and even the form, of the apparent laws of nature are not demanded by logic or physical principle. The parameters are free to take on many values and the laws to take on any form that leads to a self-consistent mathematical theory, and they do take on different values and different forms in different universes. That may not satisfy our human desire to be special or to discover a neat package to contain all the laws of physics, but it does seem to be the way of nature.”

A universe among other universes in which beings like us exist appears to be extremely unlikely and the rarity and special fine-tuning suggests that we could be a miracle, or more likely an apparent miracle. 

So, next is examined the so-called Goldilocks Principle – the notion that our universe is oddly “just right” for life in several ways: orbital eccentricity, axial tilt, the sun’s mass and our distance from it, our single sun, etc. These coincidences seem suspicious but we now have discovered that there are many other similarly favorable locations in the universe where life could theoretically develop. We are bound to find that the environment in which we live is required to support life. That itself leads to a principle:

“Our very existence imposes rules determining from where and at what time it is possible for us to observe the universe.”

That is known as the ‘weak anthropic principle.’ It gives ranges where life is likely. It is acceptable to most scientists.

The more controversial ‘strong anthropic principle’ “suggests that the fact that we exist imposes constraints not just on our environment but also on the possible form and content of the laws of nature themselves. The idea arose because it is not only the peculiar characteristics of our solar system that seem oddly conducive to the development of human life but also the characteristics of our entire universe, and that is much more difficult to explain.”

In order to make us possible, heavier elements like carbon had to form inside the heat of stars, and then some of those stars had to explode in supernovas, dispersing the heavier elements throughout space. The early universe (the first 200 seconds) was mostly hydrogen with some helium and a smaller amount of lithium. Heavy elements came much later. Carbon is created by the 'triple alpha process' from helium nuclei, or 'alpha particles.' There is a unique quantum state of the carbon nucleus formed, known as the carbon resonance, which vastly increases the rate of the nuclear reaction. This process requires that the fundamental forces of nature (the so-called four forces) be nearly exactly what they are – a case of serendipity? Fundamental constants and masses of particles also have to be in an extremely narrow range for life to develop, so these are further supports for the strong anthropic principle.
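For reference, the triple alpha chain (standard nuclear astrophysics, my summary rather than the book's) is: ⁴He + ⁴He ⇌ ⁸Be, then ⁸Be + ⁴He → ¹²C + γ. Beryllium-8 is extremely unstable, so the third helium nucleus must arrive almost immediately, and the resonant excited state of carbon-12 is what makes that second capture fast enough to build up carbon in stars.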

“The laws of nature form a system that is extremely fine-tuned, and very little in physical law can be altered without destroying the possibility of the development of life as we know it. Were it not for a series of startling coincidences in the precise details of physical law, it seems, humans and similar life-forms would never have come into being.”

So it seems perhaps that the universe (or at least our universe) is fine-tuned so that an observer would eventually discover its laws. The most convincing evidence for the strong anthropic principle, the authors note, is Einstein's cosmological constant. Einstein called including it in his theory the greatest blunder of his life. However, in 1998 it was resurrected after studies of distant supernovas. Its value is uncannily fine-tuned – it is related to repulsive forces, and if it were even slightly different the universe would have blown apart before the formation of galaxies. This strong anthropic principle has led some back to ideas of God and intelligent design. However, modern physics renders the remarkable unremarkable through the idea of multiple universes – the multiverse concept of different universes with different physical laws. I think the idea is that we are so bound up with the laws of our own universe that it gives the mere appearance that there was an intelligent creator – but we might ask, "who made God?"

The idea of model-dependent realism suggests that the only reality we can know is made of mental concepts, thus all tests of reality are model-dependent. In 1970 John Conway invented the Game of Life, based on a few simple laws that lead to complexities, and those complexities lead to different effective laws on different scales, not unlike the different macro and micro (quantum) laws we understand in physics (a sketch of Conway's rules appears below). The takeaway is that 'reality is dependent on the model employed.' The authors note that we say that any complex being has free will – "not as a fundamental feature, but as an effective theory, an admission of our inability to do the calculations that would enable us to predict its actions." Classical and quantum physics and other sciences as well have indicated that our universe has both fundamental laws and apparent laws. The requirement that the total energy of the universe remain constant – and zero – implies that a "body" of matter, which has positive energy, cannot simply appear out of nothing on its own.
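Conway's rules are simple enough to sketch in a few lines of code (my own illustration, not from the book; the grid, the "glider" pattern, and the function name are just for demonstration):

from collections import Counter

def step(live_cells):
    # One generation of Conway's Game of Life.
    # live_cells is a set of (x, y) coordinates of live cells.
    # A live cell survives with 2 or 3 live neighbors; a dead cell
    # becomes live with exactly 3 live neighbors; all others are dead.
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in live_cells)}

# A "glider": after 4 steps the same shape reappears shifted one cell diagonally.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = step(glider)
print(sorted(glider))

Even from rules this simple, persistent structures and "laws" of their own emerge at larger scales, which is the authors' point.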
 
“Because gravity shapes space and time, it allows space-time to be locally stable but globally unstable. On the scale of the entire universe, the positive energy of the matter can be balanced by the negative gravitational energy, and so there is no restriction on the creation of whole universes. Because there is a law like gravity, the universe can and will create itself from nothing ….. Spontaneous creation is the reason there is something rather than nothing, why the universe exists, why we exist. It is not necessary to invoke God to light the blue touch paper and set the universe going.”

“ … M-theory is the only candidate for a complete theory of the universe. If it is finite – and this has yet to be proved – it will be a model of a universe that creates itself.”

Wow, my brain hurts, but in a good way. Mind blown but yet still holding together. Because gravity, blah, blah, blah. Great book!