[Novalug] Administrivia: don't use URL shorteners

Rich Kulawiec rsk@gsp.org
Wed Jan 14 08:00:52 EST 2015


On Mon, Jan 12, 2015 at 11:32:56AM -0600, Beartooth via Novalug wrote:
> 	Please define what you consider "baseline competence and a proper
> software tool set." Also, what about the extensions for Firefox that
> enable you to create your own (if that's what they really do).

[snip]

Tools first, then techniques:

I don't use Firefox et al. most of the time; I use w3m and mutt.

(Unfortunately, a lot of novice web designers don't realize that the web,
and indeed the entire Internet, is based around text.  This yields web
sites which are unreadable/unusable in w3m, which would merely be annoying
*except* that -- to a decent first approximation -- "how your web site
looks to a text-only browser" is "how your web site looks to the blind".
Thus their self-indulgent exercises in ego gratification lead to bloated,
graphics-heavy, bandwidth-hungry, script-laden monstrosities when in fact
a simple page of text would serve the function of communication in a vastly
more efficient and effective manner.  A good example of this is Boing Boing,
which features highly interesting content but which is now hideously
bloated with...well, crap.  Lots of crap.  Lots and LOTS of crap.
The web designer behind it should be placed up against the wall with
the marketing division of the Sirius Cybernetics Corporation.)

Anyway, snark aside:

	w3m: http://w3m.sourceforge.net
	mutt: http://mutt.org

I highly recommend both: they're quite versatile and have very small
attack surfaces.  (Also, you can use them without taking your hands off
the keyboard, which is nice for anyone with good typing skills.)
If you find yourself having to visit some of the dark corners
of the 'net and/or having to manipulate email messages from
known-hostile sources, using these (preferably on OpenBSD) (preferably
on a non-Intel architecture) (preferably via an anonymizing proxy)
will do a lot to minimize your exposure.  They're also lightweight and
resource-frugal, and if you spend a lot of time using ssh in
terminal windows, extremely useful.
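
For example, w3m can be driven non-interactively as well: its -dump
flag renders a page as plain text on stdout, which composes nicely
with the usual pipeline tools (example.com is just a placeholder):

	# render a page to plain text and page through it
	w3m -dump http://www.example.com/ | less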

Another approach is to use wget or curl to pull the relevant page without
a browser, then run it through a script that strips markup and read it
with more/less.  For particularly dangerous and/or intrusive sites, this
seems like a reasonable way to defang the content before peering at it.
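
A minimal sketch of that pipeline -- the sed expression here is a
deliberately crude markup-stripper (it ignores entities, scripts, and
style blocks), and example.com is a placeholder:

	# fetch a page without a browser, crudely strip tags, page through it
	curl -s http://www.example.com/ | sed -e 's/<[^>]*>//g' | less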

Yet another approach is to check the Wayback Machine.  This has some
serious limitations, of course, but it does provide a way to access
some of the content at some sites without actually going anywhere near them.
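
For example -- the Wayback Machine serves snapshots at
web.archive.org/web/<timestamp>/<url>, and a partial timestamp such as
a bare year redirects to the nearest capture (example.com is a
placeholder):

	# read an archived copy instead of touching the live site
	curl -sL 'https://web.archive.org/web/2015/http://www.example.com/' |
		sed -e 's/<[^>]*>//g' | less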

For those times when Firefox is indicated, some combination of

	HTTPS Everywhere
	https://www.eff.org/https-everywhere

	NoScript
	https://addons.mozilla.org/en-US/firefox/addon/noscript/

	Adblock Edge
	https://addons.mozilla.org/en-US/firefox/addon/adblock-edge/

	Certificate Patrol
	https://addons.mozilla.org/en-US/firefox/addon/certificate-patrol/

	Calomel SSL Validation
	https://addons.mozilla.org/en-US/firefox/addon/calomel-ssl-validation/

	Privacy Badger
	https://addons.mozilla.org/en-US/firefox/addon/privacy-badger-firefox/

	Beef Taco - Targeted Advertising Cookie Opt-Out
	https://addons.mozilla.org/en-US/firefox/addon/beef-taco-targeted-advertising/

is useful.  It's unfortunate that all of these are add-ons: Mozilla
should stop screwing around with the UI -- which was just fine 17 revisions
ago -- and integrate the functionality of all of these into the browser.
Equally unfortunate is that each of these has its own interface, options,
etc., so trying to configure them all in a way that makes sense is
tedious, at best.  I rather doubt that I've actually managed to get
every setting configured consistently/correctly across all of them.

That said, though, the first three are appropriate for everyone.  NoScript
does have a learning curve, but it's worth it.

Techniques:

Of course, in the browser itself, I've turned off Java, Flash and other
annoyances.  History/cookies/etc. get cleaned out frequently, including
just before/after viewing sites known to pose issues.  I use the Spamhaus
DROP/EDROP lists to stop myself from accidentally visiting (or in some
cases, even resolving the addresses of) extremely dubious sites, and then
fall back to w3m on a different system (see above).  I've used DNS RPZ
to permanently
blacklist quite a few so-called "social networks" not merely because of
the enormous privacy risks but because they're spammers.  Same for all
networks in China.  And Korea.  And a few other countries.  And so on:
the idea of most of these is to stop me, in a moment of carelessness,
from doing something that I probably don't really want to do.
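
As a sketch of the DROP-list half of that -- the Spamhaus DROP list is
published as one CIDR block per line with comments after a semicolon;
the ipset and chain names here are assumptions, so adjust for your own
setup:

	# load the Spamhaus DROP list into an ipset, then refuse to talk to it
	ipset create spamhaus-drop hash:net -exist
	curl -s https://www.spamhaus.org/drop/drop.txt |
		sed -e 's/;.*//' -e '/^[ \t]*$/d' |
		while read net; do ipset add spamhaus-drop "$net" -exist; done
	iptables -I OUTPUT -m set --match-set spamhaus-drop dst -j DROP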

I don't follow any URL without looking at it.  A great many of them
have embedded tracking information or other excess/unwanted strings.
It's often easier to just go to the home page of the site named in
the URL and navigate from there.  Or to use DuckDuckGo to find the
relevant page.  Or to edit down the URL manually until it looks like
something functional but without tracking information.
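
A crude sketch of that last trick -- utm_* parameters are the usual
analytics-style trackers, and this naive sed breaks if the tracker
happens to be the first parameter in the query string:

	# strip utm_* tracking parameters from a URL before following it
	echo 'http://www.example.com/article?id=42&utm_source=x&utm_medium=y' |
		sed -e 's/[?&]utm_[^&]*//g'
	# prints: http://www.example.com/article?id=42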

Yes, all of this is sometimes onerous.  But I'm accustomed to it: it's
become routine.  And yes, none of this is a panacea -- or even close.
I'm becoming increasingly convinced that we need a new web browser
that's designed -- from the moment it's a blank sheet of paper -- to
incorporate all of these tools/techniques (and others) in order to deal
with the contemporary threat environment.

---rsk


