
For 10 Years D&D Suffered From an Unplayable Initiative System. Blame the Game’s Wargaming Roots


While every version of Dungeons & Dragons has a rule for who goes first in a fight, no other rule shows as much of the game’s evolution from what the original books call rules for “wargames campaigns” into what the latest Player’s Handbook calls a roleplaying game about storytelling.

Before you old grognards rush to the comments to correct my opening line, technically the original books lacked any way to decide who goes first. For that rule, co-creators Gary Gygax and Dave Arneson supposed gamers would refer to Gary’s earlier Chainmail miniatures rules. In practice, players rarely saw those old rules. The way to play D&D spread gamer-to-gamer from Dave and Gary’s local groups and from the conventions they attended. D&D campaigns originally ran by word-of-mouth and house rules.

Gygax waited five years to present an initiative system in the Advanced Dungeons & Dragons Dungeon Master’s Guide (1979). Two things made those official rules terrible.

  • Nobody understood the system.

  • Any reasonable interpretation of the system proved too slow and complicated for play.

Some grognards insist they played the Advanced Dungeons & Dragons initiative system by the book. No they didn’t. Grognardia blogger James Maliszewski writes, “Initiative in AD&D, particularly when combined with the equally obscure rules regarding surprise, was one of those areas where, in my experience, most players back in the day simply ignored the official rules and adopted a variety of house rules. I know I did.”

Not even Gygax played with all his exceptions and complications. “We used only initiative [rolls] and casting times for determination of who went first in a round. The rest was generally ignored. We played to have fun, and in the throws of a hot melee, rules were mostly forgotten.”

With Advanced Dungeons & Dragons, the D&D story grows complicated, because original or basic D&D soldiered on with workable initiative systems. My next tale will circle back to D&D, but this one focuses on AD&D, the game Gygax treated as his own. (See Basic and Advanced—the time Dungeons & Dragons split into two games.)

Some of the blame for AD&D’s terrible initiative system falls back on Chainmail and Gygax’s love for its wargaming legacy.

Chainmail lets players enact battles with toy soldiers, each typically representing 20 fighters. The rules suggest playing on a tabletop covered in sand sculpted into hills and valleys. In Chainmail each turn represents about a minute, long enough for infantry to charge through a volley of arrows and cut down a group of archers. A clash of arms might start and resolve in the same turn. At that scale, who strikes first typically amounts to who strikes from farthest away, so archers attack, then soldiers with polearms, and finally sword swingers. Beyond that, a high roll on a die settled who moved first.

In Advanced Dungeons & Dragons, the 1-minute turns from Chainmail became 1-minute melee rounds. Such long turns made sense for a wargame that filled one turn with a decisive clash of arms between groups of 20 soldiers, but less sense for single characters trading blows.

Even though most D&D players imagined brief turns with just enough time to attack and dodge, Gygax stayed loyal to Chainmail’s long turns. In the Dungeon Master’s Guide (1979), Gygax defended the time scale. “The system assumes much activity during the course of each round. During a one-minute melee round many attacks are made, but some are mere feints, while some are blocked or parried.” Gygax cited the epic sword duel that ended The Adventures of Robin Hood (1938) as his model for AD&D’s lengthy rounds. He never explained why archers only managed a shot or two per minute.

Broadly, Advanced Dungeons & Dragons held to Chainmail’s system for deciding who goes first. Gygax also chose an option from the old wargame where players declared their actions before a round, and then had to stick to the plan as best they could. “If you are a stickler, you may require all participants to write their actions on paper.”

Why would Gygax insist on such cumbersome declarations?

In a D&D round, every character and creature acts in the same few seconds, but to resolve the actions we divide that mayhem into turns. This compromise knots time in ridiculous ways. For example, with fifth edition’s 6-second rounds, one character can end their 6-second turn next to a character about to start their turn and therefore 6 seconds in the past. If they pass a relay baton, the baton jumps 6 seconds back in time. If enough characters share the same 6 seconds running with the baton, the object outraces a jet. Now expand that absurdity across AD&D’s 1-minute round.

Years before D&D, wargamers like Gygax had wrestled with such problems. They couldn’t resolve all actions simultaneously, but players could choose actions at once. Declaring plans in advance and then letting a referee sort out the chaos yielded some of the real uncertainty of an actual battle. Wargamers loved that. Plus, no referee would let players declare that they would start their turn by taking a relay baton from someone currently across the room.

Especially when players chose to pretend that a turn took about 10 seconds, the Chainmail system for initiative worked well enough. In basic D&D, turns really lasted 10 seconds, so no one needed to pretend. Many tables kept that system for AD&D.

But nobody played the advanced system as written. Blame that on a wargamer’s urge for precision. Despite spending paragraphs arguing for 1-minute rounds, Gygax seemed to realize that a minute represented a lot of fighting. So he split a round into 10 segments lasting as long as modern D&D’s 6-second rounds. Then he piled on intricate—sometimes contradictory—rules that determined when you acted based on weapon weights and lengths, spell casting times, surprise rolls, and so on. In an interview, Wizards of the Coast founder Peter Adkison observed, “The initiative and surprise rules with the weapon speed factors was incomprehensible.”

In a minute-long turn filled with feints, parries, and maneuvering, none of that precision made sense. On page 61, Gygax seemed to say as much. “Because of the relatively long period of time, weapon length and relative speed factors are not usually a consideration.” Then he wrote a system that considered everything.

Some of the blame for this baroque system may rest on the wargaming hobby’s spirit of collaboration.

Even before D&D, Gygax had proved a zealous collaborator on wargames. Aside from teaming with other designers, he wrote a flood of articles proposing variants and additions to existing games. In the early years of D&D, Gygax brought the same spirit. He published rules and ideas from the gamers in his circle, and figured that players could use what suited their game. In the Blackmoor supplement, he wrote, “All of it is, of course, optional, for the premise of the whole game system is flexibility and personalization within the broad framework of the rules.”

I doubt all the rules filigree in AD&D came from Gygax. At his table, he ignored rules for things like weapon speed factors. Still, Gygax published such ideas from friends and fellow gamers. For example, he disliked psionics, but he bowed to his friends and included the system in AD&D. (See Gary Gygax Loved Science Fantasy, So Why Did He Want Psionics Out of D&D?.)

Weapon speed factors fit AD&D as badly as anything. In theory, a fighter could swing a lighter weapon like a dagger more quickly. Did this speed enable extra attacks? Not usually. Instead, light weapons could strike first. But that contradicted Chainmail’s observation that a fighter with a spear had to miss before an attacker with a dagger could come close enough to attack. Gygax patched that by telling players to skip the usual initiative rules after a charge.

AD&D’s initiative system resembles a jumble of ideas cobbled together in a rush to get a long-delayed Dungeon Master’s Guide to press. The system piled on complexities, and then exceptions, and still failed to add realism. In the end, AD&D owed some of its success to the way D&D’s haphazard rules trained players to ignore any text that missed the mark.

In creating D&D, Dave Arneson and Gary Gygax faced a unique challenge because no one had designed a roleplaying game before. The designers of every roleplaying game to follow D&D copied much of the original’s work. Without another model, Gygax relied on the design tools from wargames. His initiative system may be gone, but ultimately Gary’s finest and most lasting contribution to D&D came from the lore he created for spells, monsters, and especially adventures.

Next: Part 2: “It’s probably so different that even if it’s better, people would not like it.”


True experts don’t think they are.


The post True experts don’t think they are. appeared first on Indexed.


How to Build a Low-tech Website?


Low-tech Magazine was born in 2007 and has seen minimal changes ever since. Because a website redesign was long overdue — and because we try to practice what we preach — we decided to build a low-tech, self-hosted, and solar-powered version of Low-tech Magazine. The new blog is designed to radically reduce the energy use associated with accessing our content.


First prototype of the solar powered server that runs the new website.


Read this article on the solar powered website.


Why a Low-tech Website?

We were told that the Internet would “dematerialise” society and decrease energy use. Contrary to this projection, it has become a large and rapidly growing consumer of energy itself. According to the latest estimates, the entire network already consumes 10% of global electricity production, with data traffic doubling roughly every two years.

In order to offset the negative consequences associated with high energy consumption, renewable energy has been proposed as a means to lower emissions from powering data centers. For example, Greenpeace's yearly ClickClean report ranks major Internet companies based on their use of renewable power sources.

However, running data centers on renewable power sources is not enough to address the growing energy use of the Internet. To start with, the Internet already uses three times more energy than all wind and solar power sources worldwide can provide. Furthermore, manufacturing, and regularly replacing, renewable power plants also requires energy, meaning that if data traffic keeps growing, so will the use of fossil fuels.

Running data centers on renewable power sources is not enough to address the growing energy use of the Internet.

Finally, solar and wind power are not always available, which means that an Internet running on renewable power sources would require infrastructure for energy storage and/or transmission that is also dependent on fossil fuels for its manufacture and replacement. Powering websites with renewable energy is not a bad idea, however the trend towards growing energy use must also be addressed.

"Fatter" Websites

To start with, content is becoming increasingly resource-intensive. This has a lot to do with the growing importance of video, but a similar trend can be observed among websites.

The size of the average web page (defined as the average page size of the 500,000 most popular domains) increased from 0.45 megabytes (MB) in 2010 to 1.7 megabytes in June 2018. For mobile websites, the average “page weight” rose tenfold from 0.15 MB in 2011 to 1.6 MB in 2018. Using different measurement methods, other sources report average page sizes of up to 2.9 MB in 2018.

The growth in data traffic surpasses the advances in energy efficiency (the energy required to transfer 1 megabyte of data over the Internet), resulting in more and more energy use. “Heavier” or “larger” websites not only increase energy use in the network infrastructure, but they also shorten the lifetime of computers — larger websites require more powerful computers to access them. This means that more computers need to be manufactured, which is a very energy-intensive process.

Being always online doesn't combine well with renewable energy sources such as wind and solar power, which are not always available.

A second reason for growing Internet energy consumption is that we spend more and more time on-line. Before the arrival of portable computing devices and wireless network access, we were only connected to the network when we had access to a desktop computer in the office, at home, or in the library. We now live in a world in which no matter where we are, we are always on-line, including, at times, via more than one device simultaneously.

“Always-on” Internet access is accompanied by a cloud computing model – allowing more energy efficient user devices at the expense of increased energy use in data centers. Increasingly, activities that could perfectly well happen off-line – such as writing a document, filling in a spreadsheet, or storing data – now require continuous network access. This does not combine well with renewable energy sources such as wind and solar power, which are not always available.

Low-tech Web Design

Our new web design addresses both these issues. Thanks to a low-tech web design, we managed to decrease the average page size of the blog by a factor of five compared to the old design – all while making the website visually more attractive (and mobile-friendly). Secondly, our new website runs 100% on solar power, not just in words, but in reality: it has its own energy storage and will go off-line during longer periods of cloudy weather.

The Internet is not an autonomous being. Its growing energy use is the consequence of actual decisions made by software developers, web designers, marketing departments, publishers and internet users. With a lightweight, off-the-grid solar-powered website, we want to show that other decisions can be made.

With 36 of roughly 100 articles now online, the average page weight on the solar powered website is roughly five times below that of the previous design.

To start with, the new website design reverses the trend towards increasingly larger page sizes. With 36 of roughly 100 articles now online, the average page weight on the solar powered website is 0.77 MB — roughly five times below that of the previous design, and less than half the average page size of the 500,000 most popular blogs in June 2018. 


A web page speed test from the old and the new Low-tech Magazine. Page size has decreased more than sixfold, number of requests has decreased fivefold, and download speed has increased tenfold. Note that we did not design the website for speed, but for low energy use. It would be faster still if the server would be placed in a data center and/or in a more central location in the Internet infrastructure.


Source: Pingdom.


Static Site

One of the fundamental choices we made was to build a static website. Most of today’s websites use server-side programming languages that generate the website on the fly by querying a database. This means that every time someone visits a web page, it is generated on demand.

On the other hand, a static website is generated once and exists as a simple set of documents on the server’s hard disc. It's always there -- not just when someone visits the page. Static websites are thus based on file storage whereas dynamic websites depend on recurrent computation. Static websites consequently require less processing power and thus less energy.

Choosing a static site makes it possible to serve the site economically from our home office in Barcelona. Doing the same with a database-driven website would be nearly impossible, because it would require too much energy. It would also be a big security risk. Although a web server with a static site can still be hacked, there are significantly fewer attack routes and the damage is more easily repaired.
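
To make the distinction concrete, here is a minimal sketch of the static approach in Python. This is not Low-tech Magazine's actual build pipeline, and the file names are illustrative; the point is that all of the computation happens once, at publishing time:

    # Minimal static-site sketch (not the magazine's actual pipeline):
    # render all Markdown posts to HTML once, then serve plain files.
    from pathlib import Path

    import markdown  # third-party library: pip install markdown

    TEMPLATE = "<html><body><article>{body}</article></body></html>"

    def build(src_dir="posts", out_dir="public"):
        out = Path(out_dir)
        out.mkdir(exist_ok=True)
        for post in sorted(Path(src_dir).glob("*.md")):
            html = markdown.markdown(post.read_text(encoding="utf-8"))
            page = TEMPLATE.format(body=html)
            (out / (post.stem + ".html")).write_text(page, encoding="utf-8")

    if __name__ == "__main__":
        build()  # run once per edit; every later visit is just file I/O

Any web server can then hand out the files in the output folder without touching a database or an interpreter.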


Dithered Images

The main challenge was to reduce page size without making the website less attractive. Because images take up most of the bandwidth, it would be easy to obtain very small page sizes and lower energy use by eliminating images, reducing their number, or making them much smaller. However, visuals are an important part of Low-tech Magazine’s appeal, and the website would not be the same without them.

By dithering, we can make images ten times less resource-intensive, even though they are displayed much larger than on the old website.

Instead, we chose to apply an obsolete image compression technique called “dithering”. The number of colours in an image, combined with its file format and resolution, determines its file size. Thus, instead of using full-colour high-resolution images, we chose to convert all images to black and white, with four levels of grey in-between.

These black-and-white images are then coloured according to their content category via the browser’s native image manipulation capacities. Compressed through this dithering plugin, the images featured in the articles add far less weight to the pages: compared to the old website, they are roughly ten times less resource-intensive.
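
The core transform is easy to reproduce. Below is a sketch using the Pillow imaging library in Python; the website itself applies a plugin inside its site generator rather than this exact script, and the file names are placeholders:

    # Dithering sketch (pip install Pillow): reduce a photo to black,
    # white, and four greys, as described above. Floyd-Steinberg
    # dithering trades smooth gradients for a fine pattern of dots,
    # which compresses far better than a full-colour image.
    from PIL import Image

    src = Image.open("photo.jpg").convert("L")  # drop the colour first
    dithered = src.quantize(colors=6)  # Floyd-Steinberg dither is the default
    dithered.save("photo-dithered.png", optimize=True)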

Default typeface / No logo

Every resource loaded, including typefaces and logos, means an additional request to the server, plus extra storage space and energy use. Therefore, our new website does not load a custom typeface and removes the font-family declaration, meaning that visitors will see the default typeface of their browser.


We use a similar approach for the logo. In fact, Low-tech Magazine never had a real logo, just a banner image of a spear held as a low-tech weapon against prevailing high-tech claims.

Instead of a designed logotype, which would require the production and distribution of custom typefaces and imagery, Low-tech Magazine’s new identity consists of a single typographic move: using the left-facing arrow in place of the hyphen in the blog’s name: LOW←TECH MAGAZINE.

No Third-Party Tracking, No Advertising Services, No Cookies

Web analysis software such as Google Analytics records what happens on a website — which pages are most viewed, where visitors come from, and so on. These services are popular because few people host their own website. However, exchanging these data between the server and the computer of the webmaster generates extra data traffic and thus energy use.

With a self-hosted server, we can make and view these measurements on the same machine: every web server generates logs of what happens on the computer. These (anonymous) logs are only viewed by us and are not used to profile visitors.
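
As a sketch of how little is needed, assuming the server writes access logs in the common nginx/Apache layout (the log path here is an assumption), a few lines of Python can stand in for an external analytics service:

    # Count page views per URL from a standard access log.
    from collections import Counter

    views = Counter()
    with open("/var/log/nginx/access.log", errors="replace") as log:
        for line in log:
            parts = line.split('"')
            if len(parts) < 2:
                continue
            request = parts[1].split()  # e.g. ['GET', '/path', 'HTTP/1.1']
            if len(request) == 3 and request[0] == "GET":
                views[request[1]] += 1

    for url, count in views.most_common(10):
        print(f"{count:6d}  {url}")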

With a self-hosted server, there is no need for third-party tracking and cookies.

Low-tech Magazine has been running Google Adsense advertisements since the beginning in 2007. Although these are an important financial resource to maintain the blog, they have two important downsides. The first is energy use: advertising services raise data traffic and thus energy use.

Secondly, Google collects information from the blog’s visitors, which forces us to craft extensive privacy statements and cookie warnings — which also consume data, and annoy visitors. Therefore, we are replacing Adsense with other financing options (read more below). We use no cookies at all.

How often will the website be off-line?

Quite a few web hosting companies claim that their servers are running on renewable energy. However, even when they actually generate solar power on-site, and do not merely “offset” fossil fuel power use by planting trees or the like, their websites are always on-line.

This means that either they have a giant battery storage system on-site (which makes their power system unsustainable), or that they are relying on grid power when there is a shortage of solar power (which means that they do not really run on 100% solar power).


The 50W solar PV panel. On top of it is a 10W panel powering a lighting system.

In contrast, this website runs on an off-the-grid solar power system with its own energy storage, and will go off-line during longer periods of cloudy weather. Less than 100% reliability is essential for the sustainability of an off-the-grid solar system, because above a certain threshold the fossil fuel energy used for producing and replacing the batteries is higher than the fossil fuel energy saved by the solar panels.

How often the website will be off-line remains to be seen. The web server is now powered by a new 50 Wp solar panel and a two-year-old 12V 7Ah lead-acid battery. Because the solar panel is shaded during the morning, it receives direct sunlight for only 4 to 6 hours per day. Under optimal conditions, the solar panel thus generates 6 hours x 50 watts = 300 Wh of electricity.

The web server uses between 1 and 2.5 watts of power (depending on the number of visitors), meaning that it requires between 24 Wh and 60 Wh of electricity per day. Under optimal conditions, we should thus have sufficient energy to keep the web server running for 24 hours per day. Excess energy production can be used for household applications.

We expect to keep the website on-line during one or two days of bad weather, after which it will go off-line.

However, during cloudy days, especially in winter, daily energy production could be as low as 4 hours x 10 watts = 40 watt-hours per day, while the server requires between 24 and 60 Wh per day. The usable battery storage is roughly 40 Wh, taking into account 30% of charging and discharging losses and the roughly one third of capacity that stays in reserve (the solar charge controller shuts the system down when battery voltage drops to 12V).

Consequently, the solar powered server will remain on-line during one or two days of bad weather, but not for longer. However, these are estimations, and we may add a second 7 Ah battery in autumn if this is necessary. We aim for an "uptime" of 90%, meaning that the website will be off-line for an average of 35 days per year. 
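
A back-of-the-envelope check, using only the figures quoted above (and reading the reserve figure as roughly a third of capacity left in the battery at the 12V cutoff, which is what makes the 40 Wh estimate work out), lands in the same one-to-two-day range once you allow for the residual solar input on cloudy days:

    # Autonomy estimate from the quoted figures (a sketch, not measurements).
    NOMINAL_WH = 12 * 7      # 12V x 7Ah lead-acid battery = 84 Wh nominal
    USABLE_FRACTION = 0.67   # ~a third stays in reserve at the 12V cutoff
    EFFICIENCY = 0.70        # ~30% charging and discharging losses

    usable_wh = NOMINAL_WH * USABLE_FRACTION * EFFICIENCY  # ~39 Wh
    for load_w in (1.0, 2.5):  # the server draws 1 to 2.5 watts
        days = usable_wh / (load_w * 24)
        print(f"{load_w} W load: {days:.1f} days on battery alone")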


First prototype with lead-acid battery (12V 7Ah) on the left, and Li-Po UPS battery (3.7V 6600mAh) on the right. The lead-acid battery provides the bulk of the energy storage, while the Li-Po battery allows the server to shut down without damaging the hardware (it will be replaced by a much smaller Li-Po battery).

When is the best time to visit?

The accessibility of this website depends on the weather in Barcelona, Spain, where the solar-powered web server is located. To help visitors “plan” their visits to Low-tech Magazine, we provide them with several clues.

A battery meter provides crucial information because it may tell the visitor that the blog is about to go down -- or that it's "safe" to read it. The design features a background colour that indicates the capacity of the solar-charged battery that powers the website server. A decreasing height indicates that night has fallen or that the weather is bad.


In addition to the battery level, other information about the web server is shown on a statistics dashboard. This includes contextual information about the server’s location: the local time, current sky conditions, the upcoming forecast, and the time since the server last shut down due to insufficient power.

Computer Hardware

SERVER. The website runs on an Olimex A20 computer. This machine has 2 GHz of processing power, 1 GB of RAM, and 16 GB of storage. The server draws 1 to 2.5 watts of power.

INTERNET CONNECTION. The server is connected to a 100 Mbps fibre internet connection. For now, the router is powered by grid electricity and requires 10 watts of power. We are investigating how to replace the energy-hungry router with a more efficient one that can be solar-powered, too.

SOLAR PV SYSTEM. The server runs on a 50 Wp solar panel and one 12V 7Ah lead-acid battery (energy storage capacity will be doubled at the end of this month). The system is managed by a 20A solar charge controller.


What happens to the old website?

The solar powered Low-tech Magazine is a work in progress. For now, the grid-powered Low-tech Magazine remains on-line. Readers will be encouraged to visit the solar powered website if it is available. What happens later is not yet clear. There are several possibilities, but much will depend on the experience with the solar powered server.

Until we decide how to integrate the old and the new website, making and reading comments will only be possible on the grid-powered Low-tech Magazine, which is still hosted at TypePad. If you want to send a comment related to the solar powered web server itself, you can do so by commenting on this page or by sending an e-mail to solar (at) lowtechmagazine (dot) com.

Can I help?

Yes, you can.

On the one hand, we're looking for ideas and feedback to further improve the website and reduce its energy use. We will document the project extensively so that others can build low-tech websites too.

On the other hand, we're hoping for people to support this project with a financial contribution. Advertising services, which have maintained Low-tech Magazine since its start in 2007, are not compatible with our lightweight web design. Therefore, we are searching for other ways to finance the website:

  1. We will soon offer print-on-demand copies of the blog. These publications will allow you to read Low-tech Magazine on paper, on the beach, in the sun, or whenever and wherever you want.

  2. You can support us through PayPal, Patreon, and LiberaPay.

  3. We remain open to advertisements, but these can only take the form of a static banner image that links to the website of the advertiser. We do not accept advertisers who are incompatible with our mission.

The solar powered server is a project by Kris De Decker, Roel Roscam Abbing, and Marie Otsuka.

solar.lowtechmagazine.com

Related article: How to build a low-tech internet.

1 public comment
brico: Dithering while Rome burns. Seriously tho I love this.

Critical Path



Why is John Kerry going down to Antarctica just a week after the election to discuss climate change and then you have energy beams coming out of Antarctica splitting hurricanes?
-- Owen Shroyer, Infowars


We are under attack.

America is under attack.

From weather weapons and energy beams wielded by John Kerry, if you believe Alex Jones.

It’s true. Quite literally. But not in the manner described by hysterical conspiracy theorists. And nothing so simple or as silly or as easily countered.

America is under attack. It’s true. We are under attack in a war that very few Americans, most especially including those in charge, or those pushing conspiracies, understand. Worse, it’s not just that we don’t understand this war, it’s that so very few Americans are even capable of understanding this conflict.

And thus America is ill-equipped to fight off this assault.


A study published today in Research and Practice describes how Russian operators have weaponized health communication.

Specifically, the authors, David A. Broniatowski, PhD; Amelia M. Jamison, MAA, MPH; SiHua Qi, SM; Lulwah AlKulaib, SM; Tao Chen, PhD; Adrian Benton, MS; Sandra C. Quinn, PhD; and Mark Dredze, PhD, show how information warfare pushed by social media automation (i.e., “bots”) and directed by troll accounts can use wedge issues like the conspiracy theories surrounding vaccination to sow measurable chaos and division in the American population.

I have some significant experience in this field.

This is what I used to do for a living.

In the beginning I was a technical cryptologist. A codebreaker specializing in electronic signals. Back in those ancient days of the early 1980s, the world was a very different place.  As now, America faced myriad threats, but most of those took a backseat to the Soviet Union.

Over the years, the politics and the enemies changed, the tools evolved, but for most of the 20th Century intelligence work remained pretty much the same.

All that changed in the last 20 years.

And the most profound change in intelligence work is volume. Volume of information.

You see, back when I first joined the Intelligence world, bandwidth was a scarce commodity. Communication channels from out on the pointy end of the stick where I was, back to certain three-letter agencies in Washington – and then on to the decision-makers and perhaps the public – were limited.

Extremely limited.

Limited in a fashion most of you who are reading this from your smartphones are unlikely to understand.

Let me give you an example: We often reported via encrypted satellite text-based messaging systems that operated at 75 baud. 

No, that’s not a typo. 75 baud.

What’s a baud?

Exactly.

Unless you’ve worked with old electronic communications systems, you’re likely unfamiliar with that term, baud. It’s a measurement of information rate, a unit of transmission speed that describes signal state changes per second in an analog shift-key teletype system.

Huh?

Right. Huh indeed.

Let me put it in modern terms: a baud is equivalent to one bit per second.

You, of course, looking at your 4G smartphone, are much more familiar with megabits or even gigabits per second. Millions, billions, of bits of information per second. Every second.

But back in the day, information moved at a much slower rate.

Depending on the character-set/encoding system used, it takes anywhere from 5 bits (Baudot code) to 21 bits (Unicode) to make one character, i.e. a letter or number or other special character such as a period or question mark. The very symbols you are reading right now. Back then, our systems generally used 8-bit character sets (ASCII). Meaning that it took eight state changes, eight bauds, to send one character. Now, if you’re running at 75 baud, each bit (the smallest unit of information in the system) is then 13.3 milliseconds in length, or about 13 one-thousandths of a second to transmit. Multiply that by eight and you find that it takes a little over a tenth of a second to transmit one character – and in practice longer, because we were pushing those bits through high-level encryption systems, and through multiple levels of bi-directional transmission error-checking.
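
If you want to check that arithmetic yourself, a few lines of Python reproduce it (a sketch using the figures above):

    # Sanity check of the 75-baud arithmetic.
    BAUD = 75            # ~1 bit per second per baud on this kind of link
    BITS_PER_CHAR = 8    # 8-bit character set (ASCII)

    bit_ms = 1000 / BAUD           # ~13.3 milliseconds per bit
    char_s = BITS_PER_CHAR / BAUD  # ~0.107 seconds per character
    print(f"{bit_ms:.1f} ms per bit, {char_s:.3f} s per character")
    print(f"~{60 / char_s:.0f} characters per minute, before encryption overhead")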

Now, what all that technical gibberish means in practical terms is that sending data was slow.

Very slow.

In the amount of time it took to send one character back then, you could have reloaded this webpage on your smartphone half a dozen times over. Or more.

In the last decade of the 20th Century, and the first two decades of this one, communication speed and the amount of data that we can send reliably – even from a satellite cell phone in the warzone, far out on the pointy end of the stick – has increased by several orders of magnitude, i.e. hundreds of thousands of times. In some cases, millions.

Volume.

Volume of information.

And that’s not necessarily a good thing. Not when it comes to quality and reliability of information.

Not when it comes to fact and truth.

You see, back in the day, every bit was precious. So, when we gathered information, sometimes at great risk to ourselves, that raw intelligence was examined on site by analysts, specialists in that particular target. If it was deemed worthy of further examination, then it was formatted into electronic reports. And those reports had a very specific structure, they were very lean, using only the characters necessary to relay the information and no more. Before the reports were transmitted, they were examined by a very experienced senior NCO – who was typically also an analyst. Then, in many cases, the information was checked one final time by an officer. The report was sent up-echelon to a regional processing center, where it was again examined by a team of analysts and combined with other information (a process that was in those days known as “fusion”), and then that report was examined at multiple levels and forwarded up the chain of command to one of those aforementioned three-letter agencies back in Washington, where it was combined with yet more information and turned into national intelligence assessments for those in the White House and Congress.

We had plenty of people, what we didn’t have was bandwidth and computing power.

So, it was imperative that every bit sent was as accurate and as reliable as possible, so as to make the absolute best use of our resources.

And the side effect of this painstaking process was that the final intelligence product was of very high quality and presented to the president by those who were very, very familiar with the targets and who specialized in explaining this information to politicians in a fashion they could understand.

Now, don’t get me wrong here, how that information was interpreted and used by politicians at the top end of the chain of command is a different matter – and likewise when that information was declassified (in some cases) and pushed out to the general public. And it wasn’t just the intelligence community, news organizations labored under those same technological restrictions and the same biased-interpretation by the politicians and the public. Which had similar impact on the information they presented and how it entered the public consciousness.

That said, the information that arrived at the consumer was often as accurate and as reliable as is humanly possible.

All of that changed with the advent of dramatically increased bandwidth and processing power.

Over the last few decades the information cycle has become highly compressed, increasingly so.

And as the volume of raw information increases exponentially, at the same time the ability of both the news media and the intelligence community to analyze it and filter out the noise has dramatically decreased.

It has become utterly impossible to examine each piece of information in the detail we once did.

And as such, it is utterly impossible to ensure the quality, reliability, and accuracy of that information.

Now, here’s the important part, so pay attention: Our society, both the decision-makers who run it and the citizens who daily live in it, is habituated to having information processed, analyzed, and presented in a fashion where they can have reasonable confidence in that information.

And that information thus directly shapes our worldview.

Up until recently, the average politician, the average citizen, didn’t have to be an information analyst, didn’t have to have critical information processing skills, because the information system for the most part did that on the front end. The consumer very rarely received raw information about the world outside of their own immediate sphere of observation.

Almost everything you knew about the greater world was filtered through information processing systems by experts.

That is no longer true in any fashion.


And yet, we operate as if it still is.


You can see this most clearly in the older generation, many of whom still believe that “they couldn’t print it if it wasn’t true.”

This is bad enough in the general population, but it is a disaster of unmeasurable scale when government, and society itself, begins to operate on this unstable foundation.

The massive increase in information volume means that all of us are daily bombarded with a firehose of raw information, unprocessed, unfiltered.

And the vast, vast majority of you are ill-equipped to handle this in any fashion.

Most of the world lacks the training, the tools, and the support to filter bad information from good, to determine the validity of intelligence. And so, increasingly, we live in a world of malleable reality, one where politicians and media personalities tell you with a straight face “truths are not truths” and “there are facts and alternative facts.”

This problem became millions of times worse with the advent of social media.

And this situation, this world of alternative facts and shifting truth and bottomless raw unfiltered information piped directly into the minds of the population without error-checking and expert interpretation, creates new and unique vulnerabilities that can be exploited on a massive, global scale in a fashion that has never been done before.

Information warfare.

More powerful, more far-reaching, more scalable, more destructive to the very fabric of our society than any nuclear bomb.

This form of warfare is incredibly powerful, far more so than any other weapon – because it reaches directly into your mind and shapes how you see the world.

Information warfare is infinitely scalable, it can target a single individual, or the entire global population, it can target a single decision-maker, a government, a population, or alter the course of history.

For example: The president of this country watches a certain news/talk/infotainment show. Every day. Without fail. And that show, the information presented there, directly shapes how he sees the world. You can watch this happen daily in real-time. Those who control that show have direct and immediate influence on the president, and thus on the country, and thus on a global scale. It is an astounding national security vulnerability. One our enemies are well, well aware of, and one that our own counter-intelligence people cannot plug due to the very nature of their own Commander-in-Chief.

This is unprecedented in our history.

Over time, Information Warfare has had many names and been implemented in many, many ways – sometimes hilariously unsuccessful, sometimes horrifyingly effective, often somewhere in between. Deception warfare, communications warfare, electronic warfare, psychological warfare, perception management, information operations, active measures, marketing, whatever you call it, this form of weaponized intelligence really came into its own with the advent of social media and the 24/7 news cycle. 

And unlike conventional weapons, information warfare can be wielded by a handful of operators, working from a modest office with no more infrastructure than a smartphone and a social media account.

Weaponized information.

Active measures.

This was my specialty. Over time, as the intelligence community changed and technology evolved, my own career changed with it: I went from being a junior technician specializing in electronic signals to an information warfare officer, one of the first in my field to be specifically designated as such. And one of the first to go to war specifically as such. Now I'm not going to discuss the details of my own military career any further, because those specifics are still highly classified. Suffice it to say, this is a field with which I am intimately familiar. And one at which I was very, very good.

And from that experience, I will tell you this:

An educated population trained from early age in critical thinking, whose worldview is based on fact, validated evidence, and science, is the single strongest defense – the only true defense – against this form of assault.

But, we don’t live in that world.

We can’t put the genie back in the bottle. And we lack that defense, deliberately so. Because just as our own enemies benefit from a population incapable of critical thought, so do those who seek political power within our own nation.

A population skilled in critical thought is the best defense against information warfare waged by our enemies, but it is also the best defense against tyranny, against the corruption of political and religious power.

But, again, we don’t live in that world.

And as such, given the state of America, the anti-vaxxer conspiracy theory was an easy target.

It's not the only one, or the easiest one, or the wedge issue most vulnerable to manipulation, or the one most likely to be pushed from a low-grade irritant into a full-on pitched battle among the population and thus one that directly influences the decision-makers in charge of our government.

It is simply an easy target. One of many. Low hanging fruit. An obvious point of exploitation.

It’s not the conspiracy theory itself that is the point of vulnerability, it’s the conditions, the worldview, that lead to such persistently wrong-headed beliefs.

You see, it is the religious nut, the fanatical partisan, the conspiracy theorist, the uneducated, the deliberately ignorant, and the purposely obtuse who are the perfect targets for Information Warfare.




These are the perfect suckers, easily manipulated and turned into unwitting tools of the enemy.

All you have to do is tell them what they want to hear.

And in America’s case, this target is uniquely vulnerable, uniquely fertile, because they have been conditioned by centuries of first religious nonsense given equal footing with science and then decades of conspiracy theory “infotainment” media treated as fact for profit.

If you want to know how we got here, all you have to do is look for the “Infowars” bumper-sticker proudly and unashamedly displayed on the car ahead of you in traffic.


It’s not just Alex Jones.

Or Rush Limbaugh or Glenn Beck or Michael Savage and all the others who sell conspiracy theory as fact.

It’s not just Jerry Falwell and Ken Ham and Joel Osteen and all the other holy joes who push their religious fraud as truth.

It’s not just the politicians who lie to you every day for their own profit and power.

It’s a population that utterly lacks the ability to process information in any reliable fashion, lacks intellectual rigor, lacks intellectual curiosity, and worse, lacks any desire to acquire such.


This is the population described by Orwell in his novel 1984.


Truth is not truth.

Understand something here, those on the other side, the operators working for Russia, they don't really care one way or the other if you vaccinate your kids.

Not really.

An unvaccinated population is vulnerable to other kinds of warfare as well, and if campaigns like this one increase that vulnerability, well, then Russia gets more bang for its ruble and its biological weapons become that much more effective. In the military, we call this a force multiplier.


The goal here is division: to sow discord in the target population, start a fight in the target country and keep that fight going, break down unity, and create distrust at all levels of the target society.


This particular point of attack is one of hundreds.

If you watch social media for this sort of thing, you very quickly see dozens of other points of vulnerability in the population. And if you go looking, and you know what to look for, you very quickly find evidence of similar manipulation on those fronts.

Information Warfare can be, and is, used as a primary warfare area, as powerful or more so than any bomb. I've done it myself in combat. But when used in this manner, as the Russians are using it against us right now, it is a warfare support function. An enhancer. A force multiplier, one that makes other weapons, both kinetic and political, work better.

You don't need to hack election machines, if you can hack the voter.

You don't need to hack democracy if you can hack the citizen.

You don't need to physically destroy the United States if you can make Americans distrust the fundamental institutions of their own republic.

If you show that the election machines are vulnerable before the election and you do so in a manner that is purposely detectable – that you know will be detected and thus reported hysterically to the population by the target’s own media – then you don't have to hack the actual elections themselves.

You only have to show that you can.

Couple that to amplification of voter disenfranchisement, enhanced by the aforementioned division, and you directly influence the population into believing that democracy cannot be trusted. That the fundamental fabric of the Republic is unsound. And the voters will stay home, or vote in a manner that creates division, or will not trust any results of the election and thus be prone to riot and protest and resistance against the resulting government.


You don’t have to destroy America, when Americans are willing to do it for you.


Now, the most effective countermeasure in this particular example should be obvious.

Secure the election.

By whatever means necessary, secure the election: paper ballots, secure isolated non-networked machines, validated public audits, whatever methodology of validation and integrity is most provable to the population.

Instead of closing voting stations, open more. Get citizens to polls. Make it easier for the population to vote, not harder.

End by law those practices which disenfranchise voters: gerrymandering, certain types of primaries, and so on.

In other words, do what is necessary to demonstrate to all citizens that the fundamental institution of the Republic is sound and that democracy can be trusted and that their vote counts.

Of course, the only way to do that is to actually make democracy trustworthy and make every vote count.


Instead, ironically, those in charge have done exactly the opposite.


Why?

Because this is the natural tendency of those in power.

And that tendency, that weakness of our republic, is precisely the vulnerability this line of attack is designed to exploit.

Russian information warfare didn't create this vulnerability, it simply takes advantage of it so long as we do nothing to counter it.

This particular attack, the one outlined in the study linked to above, is insignificant when looked at in isolation. It is simply a target of opportunity. One of many. But when looked at as part of the larger whole, it is an assault in a much larger and far more significant war. One that we are losing.

Russia doesn't have to destroy America with bombs and missiles.

All they have to do is make us weaker and weaker while they grow stronger.

All they have to do is exploit the vulnerabilities we give them.

All they have to do is take advantage of the deliberately ignorant and the gleefully stupid.

History will do the rest.

And there is no greater student of history than a Russian like Vladimir Putin.


Information is not knowledge.
-- Albert Einstein


Merry snot season.




The post Merry snot season. appeared first on Indexed.


The Vegan Book of Permaculture


Graham Burnett has published a new book called The Vegan Book of Permaculture. The book combines ethical vegan recipes with permaculture principles for ecological living and gardening.


"Long time permaculture practitioner and activist Graham Burnett has written a very practical guide to living lightly using permaculture design within the ethical constraints and opportunities of a vegan diet. Based on lived experience rather than ideology, the strong focus on food, complete with recipes, helps vegans and omnivores alike make better use of the diversity of plant based ingredients in cool temperate climates. For vegans wanting to reduce their ecological footprint, maintain nutritional balance and increase their autonomy and resilience in a rapidly changing world, this book is the ideal introduction to permaculture living and land use."

- David Holmgren, Co-originator of the Permaculture concept

More information and reviews are available on Graham's website.

Graham is also the author of numerous other publications, including Permaculture: A Beginner's Guide, which is available in French, Spanish and Croatian.

trent comments: It's great to see a book combining these two important ideas.