Arya Paliwal
April 1, 2025
Dr. Mary Naydan
FRS 142
Reflection #3
On Thursday, March 27, we went on a field trip to Princeton’s High Performance
Research Computing Center, one of the control centers that serve as a foundation for
Princeton’s technology. The building looked block-like from the outside, and the interior was a
system of steel pipes of different colors and large boxes carrying extremely high voltages. The
most notable area we visited was the white room, which consisted of large black and white
boxes containing GPUs for AI computation and systems powering other computing devices. I
was taken aback by the sheer blandness of such a computationally heavy area. This reminded
me that the “personalities” we assign to our AI technology are merely a product of humanity’s
everlasting wish to infuse humanity into a non-human being. This is evident in the works of
speculative fiction we have read: Pygmalion’s wish to breathe life into his statue, Maltzer’s
experiment to implant Deirdre’s brain into a robot body, and Victor’s wish to construct a
conscious being from lifeless parts. Princeton has only a tiny fraction of the GPUs held by Big
Tech companies such as Meta and Google. Seeing such a large room devoted to such a
relatively small number made me reflect on the scale of energy consumption that Meta and
Google undertake and raised concerns about its effects on specific parts of the world.
Oftentimes, “the cloud” is treated as a vague term for where information is
stored; however, as we saw at the HPRCC, it is actually an intricate series of wires and boxes
that store information. This field trip was a grounding experience that allowed me to see how
our information is stored and to realize that our most essential safety processes, such as PSafe
and Prox log-ins, rely on these bland-looking network systems. I found Kate Crawford’s
statements about AI and energy consumption to be modeled by the HPRCC facility: the energy
consumption of these networks is not visible in the way that manufacturing plants explicitly
showcase their energy use and pollution through the smoke they produce. Humans have
considered the prospect of AI thinking of solutions to climate change, food shortages, and
other social and environmental issues. However, I think we are straying from the reality that the
mere presence of these AI systems contributes to the very problems we hope they will solve.
Simply stated, humans are using immense power to determine how to save power. On the other
hand, I understand that AI systems are not yet computationally efficient and that we will need to
build larger white rooms until we crack the code for efficiency. It seems as if humans are willing
to pay the price for the sake of creating autonomous, responsible systems.
On this note, I also wonder how the original ChatGPT was trained and where the power
necessary for training it came from. On a broader note, where is the power in this AI arms race
most commonly drawn from, and how does this affect the people living in those areas? Because
of this field trip, I can extend the black box metaphor for AI to include environmental
accountability.
More specifically, AI’s immense power consumption creates a need to produce more power
sources. However, the harm is not visible in the software itself, allowing organizations to more
easily distance themselves from the ethical implications of their operations. I also wonder: is it
easier for AI platforms to greenwash themselves? What does the presence of this “black box”
change about how greenwashing investigations will be conducted in the future? I anticipate that
greenwashing investigations will not change in format but will become more difficult to conduct.
The high power consumption of GPUs suggests a need for a more sparsely distributed network,
some of which may be difficult to trace. I assume that greenwashing watchdogs and
ethical-practice organizations have tracing methods, at which point the problem converges to
common investigation practices. Overall, I realize that the new form of the internet, generative
AI, will take more computational power than our current internet. Part of me wonders why we
would want to take power away from different communities for this endeavor, and part of me
realizes it is simply for the sake of human development. At this point, I’ll continue to keep
updated on the AI wave and the tug-of-war to see who ultimately gains the credit for pushing
our AI development forward.