DCSUN: A Planet Forward Project

Summary

This was a final project I created with a fellow student, Leor Reef, for a class taught by Professor Frank Sesno and featured on Planetforward.org. Besides producing the video from conception to completion, we created an entire multimedia package, including social media and blog copy.

The video is the story of Anya Schoolman, a mother, member of the Mt. Pleasant community in Washington D.C., and leader of one of the largest community solar projects in the country. After learning about the grave damage to our environment caused by fossil fuels, Schoolman and her son decided to take advantage of the plummeting costs of solar power. Together with neighbors and other residents of Washington, D.C., she created DCSUN as well as the Community Power Network, which helps communities across the country build out their own solar systems for residential use.

From Planet Forward:

Today, Schoolman has taken the idea of uniting neighborhoods in pursuit of clean solar power national with the Community Power Network, an organization that brings solar cooperatives around the country together to share in techniques and methods.  We had the opportunity to meet Anya, see her solar array surrounded by all her neighbors’ panels, and learn about her work to make solar power accessible to everyone.  For Schoolman, solar power is not the energy of the future; it’s the energy of the present.

The Specs:

  • Camera: Sony NX-5
  • Format: 1080p24 AVCHD
  • Editing System: Final Cut Pro X
  • Computer: Macbook Pro 15″ 2011
  • Location: Mt. Pleasant, Washington D.C.
  • Date: November, 2013

Should 1:1 programs look to Ubuntu over pricey alternatives?

The following content was originally published in Education Dive, an education-focused publication created by Industry Dive. To view the piece in its original context, please visit Education Dive.

By Gabriel Salkin

As schools move forward with 1:1 rollouts of iPads and notebook computers for personal and academic use, the computer lab is no longer the epicenter of digital education. Students today are growing up with computers that are easier than ever to use, and navigating Windows or OS X has become second nature.

So why are some schools upsetting the apple cart by using Ubuntu — which gained some notoriety in 2009 when a Wisconsin community college student claimed it led her to drop out of classes — instead?

https://www.youtube.com/watch?v=5Qj8p-PEwbI

Ubuntu (pronounced oo-bun-too) is a version, or “distribution,” of Linux, a free and open-source operating system that underpins a vast array of computing systems used throughout the world. Ubuntu, created and maintained by Canonical Ltd., is one of the most popular distributions due to its strong support, out-of-the-box completeness, and ease of use.

Linux’s presence in the consumer space is minuscule compared to its commercial competitors. It functions differently from Windows or Mac OS X, requiring new conventions to be understood. Due to its open source nature and “geek friendly” design, many problems are solved using the command line and coding languages, which are not as easy to understand as the user interface-driven interactions most consumers are comfortable with.

In spite of this, there are several reasons more schools should not immediately dismiss Linux.

1. It is extremely cost-effective.

The most obvious reason is cost. Linux distributions like Ubuntu start and end at $0, regardless of the volume of purchase – a price Microsoft and Apple cannot match. Linux operates under the tenet of providing software for free, and often comes preloaded with many free and open source alternatives to popular commercial software like Microsoft Office.

2. It runs effectively on older machines, giving it a longer lifecycle.

Linux is also known for compatibility and stability. Linux software works on nearly every desktop distribution, and future updates rarely break that interoperability. Distributions can be specially tailored to run smoothly on older hardware than Windows or OS X can handle, thanks to far less demanding system requirements. This means hardware purchased by schools can last longer on Ubuntu – a detail that should come as welcome news considering some schools are now spending upwards of $20,000 a year on computer hardware. Why not get as much mileage out of that investment as possible?

3. It gives students the opportunity to learn high-demand coding skills.

Beyond cost, however, using Linux is a new opportunity for students who want to learn to code and create content with their computers. Linux is used overwhelmingly in data-intensive industries. It is the OS of choice for special effects and animation studios like Pixar and DreamWorks. As of 2013, 83% of enterprise applications rely on Linux. Additionally, most websites run on servers running Linux or its forefather, Unix.

Why does this matter to schools? In recent years, there has been a major call for schools to make coding a core part of the curriculum. This has been reiterated by the Obama Administration and also led to the Hour of Code in December, with several organizations challenging students to develop software in an hour. If we are to encourage students to learn to code and flourish in the digital economy, their first introduction to an important software tool and skillset should be before college.
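
To make that argument concrete, here is a minimal sketch of the kind of first exercise a student might write and run on a stock Ubuntu machine, where a Python interpreter comes preinstalled at no extra cost. The file name and the temperature-conversion exercise are illustrative choices of mine, not something prescribed by Hour of Code or any particular curriculum.

```python
# first_program.py -- the sort of short exercise used in introductory lessons.
# On a stock Ubuntu install this runs with the preinstalled interpreter:
#   python3 first_program.py

def celsius_to_fahrenheit(celsius):
    """Convert a temperature from Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

def main():
    # Print a small conversion table so a student can check the math by hand.
    for c in range(0, 101, 20):
        print(f"{c:3d} C = {celsius_to_fahrenheit(c):5.1f} F")

if __name__ == "__main__":
    main()
```

Nothing in the exercise depends on Ubuntu specifically; the point is simply that the toolchain costs nothing and is already on the machine.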

The open nature of Ubuntu gives users the ability to look “under the hood” and start to understand how the software is built and modifiable. Any part of the software can be changed by the user, encouraging them to explore and experiment. Ubuntu is both ready to go out-of-the-box and modifiable with hundreds of extensions and packages.

There are also challenges in finding instructors who can understand and teach Linux, but introducing students to the software — and breaking them out of the walled gardens of OS X and Windows — can help educators demonstrate the wide array of computer science possibilities to pupils. While it isn’t the only way to encourage coding, its potential classroom impact should not be underestimated.

All Good Things…

Friends, readers, colleagues, family. Today I announce the formal end of Digital Gravitas’s publication. It is with a heavy heart that I make this decision, and it is one that I have struggled with for many months. There are myriad reasons I can’t continue this blog in the way I once did: with passionate vigor and obsessive love. This is, without a doubt, one of the greatest things I have done in my young life, and I can only hope that the skills, acclaim and ideas I have acquired and spread outlast my active publication.

So much has changed in the four years since I began this website. We have seen the rise and fall of giants. The creation of entirely new markets and the demise of stalwart institutions. In 2009 there was no iPad, no ultrabook, no LTE, no Netflix original series. Palm was still attempting a comeback, Motorola was still separate from Google, Android was a nascent platform, Nokia still had Symbian, Steve Jobs was still alive. Little changes have amounted to vast revolutions throughout the tech industry.

In that time we have accomplished great things.  We have overthrown governments through the power of social networking.  We have built new devices that enrich our lives.  We have redefined the human condition – evolved into interconnected beings with access to unlimited knowledge and data through our devices.  We defeated SOPA; we stopped AT&T from merging with T-Mobile; we brought broadband to the masses; we overcame a brutal recession and are bringing the soul of American manufacturing back home.  The world around us has changed so drastically in this time and I have been honored to have even a meager audience for my contribution.

There will always be more to do. I’m still young (though my family has said that I was born an old man), in moderately good health, and see a lot more ahead of me. This is not an “end,” nor is it a beginning. It’s just a change that I have been working towards for quite some time. I’m not gone, I’m not done; I’ve simply grown a little older, a little wiser, and a little more realistic about the state of things around me and the nature of my work. This blog was never supposed to be a defining feature or activity. Yet it was an achievement that I stood on perhaps too long. It was a feedback loop. A place where I went to write the kind of things that I wanted to read. Sometimes I hit upon a universal truth, but more often it was a jargon-filled rant that placated the voice in my head wanting to be externalized (the outer monologue I often fall into notwithstanding).

As much as I miss writing, it is not the only path. You can still follow my work from a behind-the-scenes position by visiting The GW Hatchet, the website of my college newspaper at the George Washington University. For the last few months I have been running the website as the Senior Web Producer alongside a dynamite team of developers, designers and co-producers. It has been a fast-paced, stressful job that I love every minute of, even the parts I hate. The Hatchet team has a fine tradition of excellent reporting that I love contributing to. Plus it’s nice to stretch the more technical side of my skills with web languages (anyone know a PHP tutor?).

As for Digital Gravitas, well, it’s not going anywhere (I paid for the damn hosting, I might as well use it). Digital Gravitas will morph into a personal website and act as an online portfolio of my work and clips. If you are looking for a writer, a web producer, or a videographer, shoot me a line. And once in a while, I might still take the time to write a blog post. After all, you can’t keep a good rant down for long.

Here’s to the future, because it’s already here.

– G


Apple announces the 2013 iPhone line

Tim Cook may have said he was “doubling down” on secrecy for Apple, but that didn’t stop the world from knowing about the existence of the iPhone 5c and 5s months before their unveiling today. Yes, the rumors were true: Apple has replaced the iPhone 5 with a plastic, colorful model that is meant to be sold at a lower price and appeal to a younger market. It arrives alongside the new flagship, which attempts to bring internal innovation to an otherwise visually identical phone. Despite the fact that we knew this was an “S year,” and a poorly kept secret at that, Apple gave a good showing. Like always, their presentation was just good enough to keep me excited. So what did they show?


The Moto X: something better? New? Or just different?

The Moto X. It’s a badly kept secret. It’s a phone we knew about for months, first by name, then by specs and then its actual design. Even the ethos behind the phone, being built in the United States and customizable from the factory, was a known fact weeks ago.

The Moto X. It’s Google’s coup de grâce. The culmination of years of planning from Google’s software team with its acquired hardware team from Motorola as the company continues to rein in rampant Android fragmentation. It is the rebirth of an American standard in consumer electronics. Locally built, customizable and a great step forward for Android.

The Moto X. It’s a lot of things. A phone that combines high-end construction and midrange parts. An attempt by Google to justify its purchase of Motorola, a once-great leader in mobile phones. A phone that at once settles neatly into the midrange phone market and at the same time tries to shatter our preconceived notions about what a premium smartphone should be. A lot rides on this phone: the reputation of Motorola; Google’s attempt to repopulate the market with stock Android devices; and, most importantly, 2,000 manufacturing jobs in Texas.

Google announces new Nexus, Android and more

Not to be outdone by Apple’s impressive showing at WWDC, Google this morning showed that they still have a lot more to bring to the table for their mobile platform. The press event was nothing major, but it does give Google a more impressive lineup leading into what is sure to be a very intense and competitive fall.

To combat the increasingly competitive small-tablet market, Google has announced a revised version of the Nexus 7 tablet. The Nexus 7 has been a favorite of the sub-$200 tablet market, bringing good build quality, a high-quality screen, fast internals and, most importantly, stock Android. Google’s follow-up does little to change the external design. It’s still a black plastic rectangle, but it looks good and shows signs of strong build quality. It’s also thinner and lighter than the previous model, obviously. A 5 MP rear camera has been added for marginal utility. Powering the device is a quad-core Qualcomm Snapdragon S4 Pro, the same chip sitting in last year’s Nexus 4 smartphone. It’s a very impressive jump from the old model’s Tegra 3 processor. 2GB of RAM sits with the chip as well as 16 or 32GB of storage. All of this powers a new full HD (1920x1200) display, which gives the Nexus 7 a pixel density of about 323 pixels per inch. The Nexus 7 also adds support for LTE on AT&T, T-Mobile and Verizon’s networks. The Nexus 7 will be available online starting at $229, which is $100 less than the lower-resolution, less powerful iPad mini.
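
For anyone curious where that 323 figure comes from, here is a quick back-of-the-envelope check. It assumes the panel’s published 1920x1200 resolution and the commonly quoted 7.02-inch diagonal, then divides the diagonal pixel count by the diagonal length in inches.

```python
import math

# Rough pixel-density check for the 2013 Nexus 7.
# Assumes the published 1920x1200 panel and a 7.02-inch diagonal.
width_px, height_px = 1920, 1200
diagonal_inches = 7.02

diagonal_px = math.hypot(width_px, height_px)  # ~2264 pixels corner to corner
ppi = diagonal_px / diagonal_inches

print(f"{ppi:.0f} ppi")  # prints "323 ppi", matching the quoted spec
```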

The Nexus 7 runs the new Android 4.3, still called Jelly Bean and more of a minor update than previous .x releases. 4.3 brings a few nice improvements like multi-user profiles, allowing a device to be shared by a family with separate apps and data for each user. It also has parental controls for the little ones. Under the hood, Android now supports Bluetooth 4.0 with Low Energy, a major requirement for some accessories soon to arrive for mobile devices. Also under the hood is support for OpenGL ES 3.0, which adds a lot of new shader effects for higher-quality 3D graphics, an area Google wants to improve as games become an even larger selling point for iOS. Android 4.3 also allows for 1080p streaming from services like Netflix, an important change with more high-resolution devices coming out.

Lastly, Google showed off a cool little gizmo called Chromecast.  It’s little more than a dongle that plugs into your TV’s HDMI port and connects to your home wifi.  It allows you to stream content from any mobile device, tablet or smartphone, iOS or Android, directly to your TV.  It’s not a full media box, but rather takes advantage of the increasing popularity of technology like MHL, Miracast, WiDi and AirPlay.  For $35 it looks like a very useful little add-on to a media center.

Briefly: Microsoft reverses always-on requirements

The shit storm of criticism surrounding Microsoft’s always-on requirements and anti-used-game policy dealt a stunning blow to the company’s PR image. Sony was able to come on stage at E3 and declare that their used-game policy would be the same as on the PS3. The announcement led to uproarious applause, as if the company had announced something truly revolutionary.

Yesterday, Microsoft reneged on their earlier plans. Stating that they had listened to the “feedback” of gamers, Microsoft has announced that the Xbox One will not require a 24-hour internet check-in after its initial setup. Users can use their consoles without an internet connection to play single-player games. Hallelujah.

Microsoft has also said that physically bought games will no longer be locked into any form of DRM.  In the company’s words: “Trade-in, lend, resell, gift, and rent disk based games just like you do today. There will be no limitations to using and sharing games, it will work just as it does today on Xbox 360.”  Hallelujah hallelujah.

Of course, as the company gives, it also takes away. Because an internet connection is no longer required, disc-based games will have to have their disc in the drive while playing, even though the game automatically installs to the hard drive. It’s a small inconvenience, but what did you expect when you buy a disc-based game? Games purchased from the online store can still be accessed from the cloud and on other consoles, but disc-based games have to be physically brought over… just like today. The family sharing program, which allowed one console to have its access permissions shared between up to 10 different accounts, has been terminated. But it’s really not a big deal.

With this, Microsoft has taken away a major caveat to their next generation console.  However, it remains $100 more expensive and less powerful than its competitor.  Software remains the key.

WWDC 2013: Apple Returns in Fashion

Oh, WWDC… there is always so much riding on you. It seems like every year Apple comes into its yearly developer conference with some kind of crisis on their hands. How will they survive after Steve Jobs? How do they improve on iOS 5? Can they revive the Mac platform? This year, though, was big. Apple’s stock has been falling. People are unsatisfied with iOS. The company needed some changes in their platforms. Did they deliver?

In a word: yes

In many, many more words… well, read on.

Xbox One’s online stipulations revealed: it’s bad

One of the great questions when Microsoft announced the Xbox One was about its online requirements. With the machine referred to as an “always-on” console, consumers wondered how strict the company would be about keeping it connected. What implications would it have for playing games and the sale of used games? Microsoft has finally cleared some of that up, and it’s not great.

The Xbox One must be connected to the internet once every 24 hours. If it goes more than a day without a connection, DRM disables the console’s ability to play games. This is required even if you are playing a single-player game. If you have a good internet connection, this is not a hard requirement to meet, but many customers still buy their consoles without any intention of connecting them to the internet. They’re not into online gaming; they just want a good gaming experience. I have many friends who keep their consoles offline, either because they have no interest or because they cannot afford broadband connections.

At my university, console owners cannot connect to the internet because we use 802.1X networks, which none of Microsoft, Sony, or Nintendo supports on their machines. There are… solutions, but these are not institution-endorsed. In a school of 20,000, this makes the Xbox One essentially an impossibility without some effort. It turns a significant percentage of the possible market into dissuaded consumers.

Now yes, a very large majority of Xbox users in the past have had their consoles connected to the internet. Microsoft found that a massive share of its customers use the Xbox 360 for Netflix and other video services. They surely want to capitalize on that, but those consumers will connect regardless. Forcing everyone else to do the same is foolish. The company stated that those without broadband connections can use mobile broadband from cellular networks, but if you know the costs of hotspots and data caps, that is hardly an ideal option.

The used game market has been a hallmark of gaming since the days of the Atari.  It allowed gamers to save money on old titles, friends to trade their experiences, and for game stores to create a secondary market based on value.  Publishers were never happy about it though, since the used game market doesn’t bring them profits.  Game stores that sell and trade in used games keep the profits themselves.

Microsoft seems to have buckled to publisher pressure. While Xbox One games purchased on a disc can be traded in at stores, the game is still linked to your account. You will have to go through an as-yet-unexplained process to de-register the game from your Live account so that it can be given to another user. This process is free, but can only be done a limited number of times: once, it seems. You can give a game to a friend and transfer the license, but only if they’ve been a Live friend for more than a month. This can only be done once.

The process is arbitrary in its restrictions. It makes temporarily lending games between friends impossible. You can bring a game over and download your profile to play on their console, but if you want to even give a friend a taste of the experience, you’re SOL.

Game rentals, by the way, are gone. Completely. Microsoft says they are not supporting that at launch but will “explore future options.” I’d like to give a heartfelt apology to GameFly ahead of time, because their model is essentially done.

Third-party games? Their publishers can opt in to these restrictions. If they want, they can deny any resale or license transfer. Seeing as how that forces everyone to buy their own copy, and with the online verification DRM, they stand to make more money. But they are killing a major way that gamers educate themselves to make purchases: game discovery.

Trading games between players is part of the social experience of console gaming. Friends swap games over time, sharing in the experiences of different titles. You discover games this way, learn about the types of games you like, and become a more informed purchaser. This is something that cannot be replicated by a demo. It is the experience of a friend bringing over their copy of a game, letting you start your own file and guiding you through it. This social experience is community-building as well as a better way to learn about the titles you want. I never would have discovered the addictive worlds of Borderlands or Far Cry 3 or Dishonored, and literally dozens of other games, if it were not for this ritual.

For that, the Xbox One is a major letdown. A lack of backwards compatibility compounds the issue, meaning that the old mode of playing games is long gone. I weep for the simpler days when a game console was about just playing the software, not letting extortionist policies control our experience.

Plus, the Kinect has to be on at all times, and given the news today that Microsoft is participating in the largest domestic surveillance program in U.S. history, I don’t think I want a camera on me at all times.

Microsoft: you must reverse this policy.  It will hurt your image.  It will hurt sales.  It will ruin the community that is 40 years in the making.

Everything you need to know about Haswell


Another year, another processor release from Intel. Since 2006, Intel has followed a yearly refresh cycle it refers to as “tick-tock.” One year it releases a “tock,” a major microarchitecture enhancement that brings significant changes. The next year it releases a “tick,” which shrinks the design to a smaller process — bringing cooler operation, faster speeds and minor changes to perfect the architecture. Ivy Bridge, the 2012 release, was a “tick” of Sandy Bridge, brought down to 22nm. Now Haswell, the next tock, launches this week.

Haswell redefines what Intel is optimizing for from here on out. For years Intel has been marching towards better performance per watt and better graphics without making those specs the headline features. Now everything changes. If you are looking for a laptop this year, make sure it has a Haswell processor. Here are the things Haswell brings to the table (I’ve already gone over the architecture in-depth, so this is more about the end-consumer features).

Examining the Xbox One

One console to rule them all. In order to accomplish that goal, Microsoft unveiled the successor to its popular Xbox 360 console: the Xbox One. The name symbolizes the broad unification Microsoft’s next console brings to the TV. Game consoles are now increasingly about things other than games. The Xbox One is Microsoft’s bid to build on the 360’s increasingly potent entertainment options and create a true one-stop shop for the living room.

The battle for your living room is now in full swing. Sony laid down its cards for the eighth generation of consoles; now it’s Microsoft’s turn.


Google I/O 2013: Refinement, Enhancement, Unification


Google’s yearly developer conference has never had the same fanfare as Apple’s WWDC, but that does not mean it is any less important.  Since the rise of Android, Google has evolved from a simple search company into a mobile powerhouse and back again into an all-encompassing internet services giant.  CEO Larry Page took the reins back from Eric Schmidt with deft precision, allowing the company to become more beautiful in its presentation and more grand in its desires.

Google I/O this year focused more heavily on development tools, but there are some significant changes Google announced that consumers will enjoy all the same.

Intel Announces the Next Generation Atom Processor

In 2008, Intel clearly thought it had created something unique with the Intel Atom processor.  It was a chip not meant for speed, but efficiency.  Intel would use the Atom to carve out a whole new segment of personal computing.  The company called them Mobile Internet Devices, Ultra Mobile PCs, netbooks.  It didn’t quite work out.

The Nintendo Wii U: Too Little Too Soon?

Nintendo cannot seem to catch a break. The company announced a second annual operating loss – a far cry from the runaway profits it saw during the Wii’s heyday. The Wii U has seen sales plunge after a modestly successful launch. It sold just 57,000 consoles in January, and overall sales stand at 3.45 million, up from about 3 million at the end of last year. Now, the company has announced that it is going to pass on a major keynote during this year’s E3 trade show. Instead, they’ll just have software at the show without any pomp or circumstance.

The company is in trouble from a financial and operational standpoint.  The Wii U is not helping.

What a month it has been

My last post on this blog was March 26th, one month ago. For that hiatus I have few excuses. There are few things I enjoy more than the opportunity to write in my haphazard fashion, ruthlessly examining the latest in tech and internet culture. April was a powerhouse month, with plenty of device launches and changes in our industry. Yet I was silent. For that, let me apologize. But rest assured, I have remained quite busy.

The last time I gave a general status update on my life it was a reflection on the tumultuous realizations I had to make about my career and my passions.  This last year has been an affirmation of those choices and a refinement of my skills.  It has been an intense time because I have been tremendously hands-on: learning and refining skills that I will need to make this blog better and become a capable writer.

Last semester, I had the joy of learning under Professor Kerric Harvey of the George Washington University. I’ve mentioned her before, but I must continue to express my appreciation for her teaching. Her lectures united the world of technology with the greater social impact of media – a fusion of the two things I know best: nitty-gritty technical information and the greater role of the media in our society. With her instruction, I was able to appreciate this industry and my journalism major on a macro level: a vast, connected web of wires, signals and messages that boil down to people. Actual living people. This humanist view of the cold realm of technology greatly impacted me and has helped me understand the tremendous responsibility I have as someone learning the journalistic ethic.

That was fall. In the spring I have channeled that mission into rapid skill development. An advanced news writing class taught me that my news copy is pretty awful (thank God this isn’t AP style), but I can get better. I have learned HTML, CSS, JavaScript, and jQuery in my Online Journalism Workshop (thank you, Codecademy) and have worked on my ability to bring together multimedia to tell a story. I’ve also put together a documentary for a class. While it was a serious struggle, my skills behind the camera and in editing are a lot stronger for it.

I also joined my college newspaper. It has taken a few years of procrastination, but I am now the Senior Web Producer for The GW Hatchet, an award-winning paper that I am incredibly honored to be working with for its 110th volume. It’s a steep learning curve as I bring my WordPress, content management and multimedia chops to the next level, but I’m always excited for the challenge.

I have a lot on my plate, but I find myself truly growing. There was a time when this blog was really my only extracurricular activity, but these days it’s something I reluctantly have to put off. I love writing, and if anything, all this multimedia work has made me realize just how much. I will always take the challenge of describing a scene in words over just showing a video.

I desperately miss talking and writing about tech. I will try my hardest to keep writing and posting. I always say that more content is coming, so I won’t promise that. I will say that I haven’t forgotten this place where I write my heart out. Let it become whatever it wants, but I’m still here.
