Reading this: basically, CO2 levels are a measure of how well a space is ventilated, and can therefore be a handy proxy for how likely it is that infectious particles (flu, COVID, etc.) are lingering in the air. Lower CO2 = better ventilation and less chance of bugs in the air; higher CO2 = worse ventilation, stale air, and a higher chance of bugs in the air. It's not a one-to-one connection, obviously, as there are other variables, such as the number of people in the area, but it can be a good way to get a rough measure of the ventilation.
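For a rough sense of the green/yellow/red bands I keep referring to, here's a small Python sketch. The thresholds are my assumptions based on the Aranet4's default color coding, not official values; check your own monitor's settings.

```python
def co2_band(ppm: int) -> str:
    """Map a CO2 reading (ppm) to a rough ventilation band.

    Thresholds are assumptions roughly matching the Aranet4's
    default color bands, not official specifications.
    """
    if ppm < 1000:
        return "green"   # well ventilated
    elif ppm < 1400:
        return "yellow"  # getting stale
    else:
        return "red"     # poorly ventilated

# Outdoor air runs around 420 ppm; a packed airplane cabin can run far higher.
print(co2_band(450))   # green
print(co2_band(1200))  # yellow
print(co2_band(2000))  # red
```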
So here’s how my day went (all times shifted one hour from what’s shown on the graph due to the time change).
Being able to see this change over the course of the day was fascinating.
Until about 8am, I was at the hotel. Levels stayed in the green and slowly decreased through the night, then increased into the yellow as I woke up and was active and moving around, showering, packing, etc.
8-9am, outside and on the light rail to the airport. Nice and green.
9-noon, in the airport, often in the midst of lots of people as I went through the TSA lines. Even in the large, high-ceilinged airport areas, with lots of room for air to move, levels were generally in the yellow. This is part of why crowded situations, even in large or outdoor areas, are still good places to be masked.
Noon-2pm, on the airplane. Lots of people in a fairly small, confined space. Airplanes might have “good” ventilation, but there’s only so much that can be done, and it was solidly in the red the entire time. I was okay with my KN95 through the airport, but switched to an N95 from just before boarding until after disembarking in Seattle, didn’t eat on the plane, and used a straw when drinking to minimize intake of unfiltered air.
2-3pm: Getting my baggage and taking a Lyft home. Right back into the green.
This was a handy little gadget to have with me this week. That, plus masking, plus vaccination and boosters, and I’m feeling pretty confident in my safety measures.
On this last day of the 2022 Accessing Higher Ground accessibility in higher education conference, I put together a thread about the week. Originally posted on Mastodon, this is a lightly edited version for this blog. Be warned, this isn’t short. :)
Me on my way to the first day of panels.
High-level thoughts from a first-time attendee: This is a really good conference. I haven’t seen much in the way of glitches or issues (discounting the occasional technical electronic weirdness that happens anywhere). Panel content has been well selected and planned; I’ve been able to put together a full schedule with few “this or that” conflicts. Some panelists are better than others, as always, but I haven’t seen any trainwrecks or other disasters.
I do wish the conference had more of a social media presence. The @AHGround Twitter account linked from the AHG website hasn't posted since 2017, and I only found the #ahg22 hashtag on their Facebook page, where it wasn't mentioned until 10 days before the conference. Unsurprisingly, this means that there was very little hashtag use (at first I seemed to be one of the very few users other than AHG itself using the tag consistently or at all; a few more people started using it as the conference went on).
The hotel is a Hilton. My primary other hotel experience is the DoubleTree by Hilton Seattle Airport (where Norwescon is held), and I was amused that in most respects, I prefer the DoubleTree to the Hilton Denver City Center. The room is a little smaller here, and I was welcomed at check-in with a room temperature bottle of water instead of a fresh-baked chocolate chip cookie. But these are small and kind of picky distinctions; really, it’s exactly what you’d expect from a Hilton.
Looks about like any other hotel room out there.
That said: This particular hotel has excellent ventilation. I’ve been carrying around an Aranet4 air quality monitor, and it has stayed comfortably in the green nearly the entire time; it has only gone into the low end of the yellow during one standing-room-only session in a smaller room. It did get into the yellow as it sat in my room overnight as I slept, but opening the window would bring it back into the green in just a few minutes (though at 20°-40° F outside, I didn’t do this much).
Being able to keep an eye on CO2 levels was nice, and helped make me feel comfortable with COVID-era conference travel.
As noted in an earlier Mastodon post, the weirdest thing for me has been a side effect of switching from fan convention to professional conference: the lack of anything after about 5 p.m. I'm used to fan-run SF/F cons like Norwescon, with panels running until 9 p.m. or later, evening concerts or dances, 24-hour game spaces, and a general "we'll sleep when this is done" schedule. Having nothing left for the day after about 5 p.m. is odd, and it feels weird not to know that I could wander out and find things going on.
For people who come with groups and/or have been doing this for a long time and have a lot of connections, I’m sure it’s easy to find colleagues to have dinner or hang out in bars or restaurants (at or outside the hotel) and chat with. But for a new solo attendee, it meant I spent a lot of evenings watching movies on my iPad in my room. (I did find a small group of other Washington-based attendees to hang with one evening, which was very appreciated.)
Impressions of Denver: Hard to say, really. It’s been pretty cold this week (20s to 30s most days), and since a lot of panels caught my eye, I didn’t take time to go exploring beyond going to the 16th street mall to find food. The little I did see in the immediate area is nice enough; maybe I’ll see more if I get to come back to AHG in the future.
Though I haven’t taken German or been to Germany in years, my brain kept labeling this a “Fußgängerzone”.
Colorado itself, I have to say, didn’t give me the greatest first impression. The trip from the airport to downtown Denver is a 40-minute light rail ride through flat, brown, high desert with lots of scrub brush, punctuated by aesthetically unpleasing industrial and commercial areas. Maybe it’s nicer in the summer, but in the winter? The SeaTac-to-Seattle light rail ride is much prettier. (My apologies to Coloradans for snarking on their state.)
Denver has mountains in the distance, they were just out the other side of the train. All I saw was flat.
My least favorite part has been the humidity, or lack thereof. Coming from the Pacific Northwest’s pretty regular 50%+ humidity, having Colorado’s humidity hovering around the 20% level has been horrible on my skin. Even with lotion, I’m itching like crazy, to the point where it’s been difficult to sleep, and my hands are so dry that the skin of my knuckles is cracking and I look like I’ve been punching walls. Whimper, whine, yes, whatever, it’s unpleasant.
But anyway! And now, brief (500 characters or fewer) overviews of the sessions I attended while I've been here:
InDesign Accessibility (full-day pre-conference session): For a long time, I’ve had a basic impression that PDFs are crap for accessibility. Turns out that PDFs can be made quite accessible, but it takes a bit of work and the right tools, and InDesign is a powerful tool for this sort of thing. While I don’t use InDesign much, I learned a lot about PDF accessibility and how to effectively prepare documents, and many of the concepts will be translatable to other programs. Very useful.
Addendum: I’d also like to take some time to see how many of these techniques and accessibility features are also available in Affinity Publisher, since I’m a fan of Affinity’s alternatives to Adobe’s big three tools (Photoshop, Illustrator, and InDesign). I have no idea how much of a priority Affinity puts on accessibility (either within their tools or the final documents), but it could be an interesting thing to poke around with.
Using NVDA to check PDFs for Accessibility (full-day pre-conference session): Another really useful day. While I’ve known about screen readers as a concept for some time, I’ve just started experimenting with NVDA over the past year, and as a sighted user who doesn’t depend on it, it can be an overwhelming experience. This day gave me a ton of info on tips for using NVDA (including the all-important “shut up for a moment” command), and I’m going to be much more comfortable with it now.
Keynote: Oh, also: The keynote speaker, Elsa Sjunneson, was excellent, speaking about her experiences as a Deafblind person, student, parent, and author. Her statement that “disability is a multiverse” resonated with a lot of people. Plus, it was a treat to see her speak here, as I know of her from her paneling at Norwescon and her Hugo nominations and wins.
Elsa and her interpreter during her keynote speech.
Publishing and EPUB 101: An introduction to EPUBs and an overview of some of the better creation tools. I’ve experimented a bit with creating EPUBs here and there in the past, and am familiar enough with the basics that this one was slightly below my knowledge level, but it still gave me some good tips on methods and tools for preparing documents to be output as accessible EPUB files for distribution.
Math and STEM: Since I’m going to be training STEAM faculty on what they need to know to make their courses accessible, which can have some extra considerations to be aware of, this seemed like an obvious choice. It ended up being basically a demonstration of TextHelp’s Equatio equation editing product, which isn’t necessarily a bad thing, as Equatio does do a lot of neat stuff and our campus already has access to it, so I did learn a lot from the session, even with the single-product focus.
Retire the PDF: An intentionally hyperbolic title, this was a call to consider EPUBs as an alternative to PDFs when distributing documents. As long as you’re not absolutely wedded to the visual layout and presentation of a document, EPUBs do have a lot of advantages over PDFs by giving the end user more control over the display (fonts, sizes, reflow to varying screen sizes, etc.) and better screen reader compatibility (especially when compared to poorly constructed PDFs).
Educational Alt Text: A particularly good session on how to think about writing alt text for images, with an emphasis on doing so for an educational context. Thinking about not simply describing the contents of an image, but creating alt text that conveys the meaning and what information the reader needs to get from the image separate from how the image appears, and how to craft effective alt text and (when technologically possible) long descriptions with more information about the image.
Going Further with EPUB: This session got deeper into the innards of EPUBs, looking at how they’re constructed (essentially self-contained XHTML websites), examining a few different tools for creating, editing, checking, and validating EPUBs for full accessibility. Again, much of the basic info I knew, but the collection of tools and verification options will be very handy to have.
Accessible Math Roadmap: Presenting an in-progress reference document on the state of accessible math and the various tools out there for creating and interacting with equations in accessible formats. As noted above, this is an area I’m trying to learn the basics of as quickly as possible, so I’ll be digging into the reference document itself in more detail in the coming days as I continue preparing to help train faculty on how they can do all this for their classes.
Trending Tech Tools: This is apparently the latest in a recurring series of presentations at this conference, going over major developments in accessible technology over the past year, recent updates to a number of widely used tools, and a peek at things coming down the line in the coming months. Particularly for someone new to the field, this was a nice way to get a snapshot of where things stand and what to be aware of.
Advanced VPAT Techniques: Voluntary Product Accessibility Templates (VPATs) are a way for vendors to declare how accessible their products are (or aren’t); this session discussed how best to approach talking with vendors about their VPATs, particular things to look for, and ways to guide discussions with vendors to get more precise information about issues that may be noted when reviewing the VPATs during the pre-selection product investigation and review phase.
Accessible Videos: Covered what needs to be done to make videos accessible, for both the videos themselves (using high-contrast text within the videos, including correct captions, transcripts, and audio description tracks) and the video players themselves, which need to be accessible and allow full access to all features for all users (which most players, including YouTube’s, aren’t very good at doing). Got some good pointers on automated-caption correction workflows and tools as well.
Integrating Tech in Communication: Through no fault of the presenters, this ended up being the least directly useful to me, as while it was about ways to use tech to communicate with students, it was presented by people on a Microsoft-focused campus, and was essentially a rundown of many of the features built into Microsoft’s applications and how they’re using them on their campus. Not bad info at all, just not as useful for me as it obviously was for others in attendance.
So that wraps up my week at Accessing Higher Ground! It was well worth coming, and I’m very glad I was able to come. If I only get to go to one conference next year, it will probably be the big AHEAD conference (along with ATHEN, one of the two parent organizations for AHG), as they’ll be in Portland, but if we have the resources to send me to two conferences, I definitely hope to come back to AHG again. Thanks to the organizers and all the presenters and attendees for such a good conference week!
TL;DR: Avoiding Mastodon because you’ve heard it’s problematic makes as much sense as avoiding the internet because you’ve heard it’s problematic.
So.
Back in the antediluvian times before the Internet existed — you know, when great beasts like dinosaurs and, um…mastodons…roamed the earth — there were these things called Bulletin Board Systems, or BBSs.
Each BBS was a single computer sitting in someone's house, connected to a telephone line (the physical kind that came out of the wall). BBS users could use a modem (generally a little box with blinky lights that screamed at you when you started using it, though the really neat but slow early ones required you to actually place a telephone handset into them) to place a telephone call from their computer to the BBS computer to see who had posted messages since the last time they called in, respond to those messages if they wanted, and upload or download tiny, low-resolution, 256-color .bmp images, often of impolite subject matter.
The really fancy BBS systems could connect to two or three phone lines at once, so that more than one user could log in at the same time. This would let them type back and forth at each other, much like a modern chat session, only they'd have to actually use real words, because this was also before emoji were invented.
Each BBS tended to have its own particular culture and rules. Some BBSs were regional for an area; others might have a Star Trek theme, or a Star Wars theme, or a Doctor Who theme. I think that was it, because those were the only approved geek interests at the time. People with particular interests would join BBSs that supported those interests, so they could have conversations with other people who shared those interests.
Eventually, BBS systems gained the ability to dial into each other and exchange messages. Suddenly conversations could involve not just the users on an individual BBS, but also users on other BBSs. Once a day or so, one BBS would call another one, send over a batch of replies to discussions that had been posted in the past day, and receive a batch of replies in return.
Of course, even when one Star Trek BBS was talking to another Star Trek BBS, they might not have exactly the same rules. Subjects that were fine on one server might be anathema on the other. Maybe a user who had gotten into a fight with someone on one server had started using another one, but now those two servers were talking to each other. Basically, people are people, and as every good Depeche Mode listener knows, that doesn't always work out.
But still, people generally like to meet and talk to other people about things they enjoy (not to mention exchange tiny, low-resolution, 256-color .bmp images of impolite subject matter), and so these differences were dealt with, and different servers found ways to get along. Or, if there were simply too many differences to overcome, the servers would simply stop calling each other to exchange messages.
Basically, we all either figured out how to get along, or if there was a known problem server, we just stopped dealing with it.
But we didn’t say that, “Oh, I heard BBSs were a problem, so I don’t do that.”
Well, okay, sure, I’m sure there were people who had that attitude. But the rest of us knew that you didn’t have to throw the BBS out with the bathwater (there’s a risk of electrical shock when doing that anyway) — all you had to do was ignore the BBS that was the problem, not ignore BBSs altogether.
Fast forward a few decades.
Now every computer talks to every other computer. Some of those computers host discussions from a number of different people. Some of those groups of people are perfectly pleasant, reasonable people, whose only concerns are ensuring that everyone they know has a lifetime supply of puppies, kittens, and rainbows. Some of those groups of people are…otherwise interested.
They all exist on the same internet, but they're on different systems, using different software, much of which doesn't easily talk to the other kinds of systems and software out there. So when you run across a part of the internet that has all the appeal of free diving into the depths of the New York City sewer system, the easiest solution is to simply not explore that part of the internet. (And hopefully, you escape before attracting their notice, so they don't follow you wherever you go.)
So we (most of us, at least) don't avoid the entire internet, because we know that there are some parts of it that are not places you'd want to wander through late at night (or, sometimes, even in the broad light of day).
One of Twitter’s major problems is that it is a monolithic system: If you’re on Twitter, you’re in the same system as every other Twitter user. And because Twitter had dodgy and poorly enforced protocols and methods for protecting its users, there was no good way to say, “I don’t want to deal with this unpleasant group of Twitter users”. Everyone’s in the same room at the same party, and there’s no real way to escape short of leaving the party entirely, even if that means having to abandon all the partygoers that you like to get away from the partygoers with the funny little mustaches who are being jerks.
Mastodon, however, isn't monolithic. It's not a single system. This is where you start hearing the words "decentralized" and "federated" and your eyes glaze over, but all that means is that, just like the BBSs of ye olden days, it's a bunch of individual servers that can talk to each other. The biggest difference is that where in the BBS days, BBS owners had to find each other and set up the connections intentionally (opt-in), Mastodon's default is for servers to talk to each other unless they choose not to (opt-out).
Some servers are puppies and kittens and rainbows, some aren’t. But when the owner of the Mastodon server puppieskittensrainbows.social realizes that the users from newyorkcitysewer.social keep harassing people, causing problems, and being generally unpleasant, they can just decide not to talk to that server anymore. Poof! Problem solved, no more sewer rats skittering around biting people.
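That opt-out model can be sketched as a toy simulation. To be clear, this is nothing like actual Mastodon code, and the server names are just the made-up ones from the example; it only illustrates the "talk to everyone unless blocked" default.

```python
# Toy model of opt-out federation: every server accepts messages from
# every other server unless it has explicitly blocked that server.
class Server:
    def __init__(self, name: str):
        self.name = name
        self.blocked: set[str] = set()

    def defederate(self, other: "Server") -> None:
        """Stop accepting anything from the named server."""
        self.blocked.add(other.name)

    def accepts_from(self, other: "Server") -> bool:
        return other.name not in self.blocked

nice = Server("puppieskittensrainbows.social")
sewer = Server("newyorkcitysewer.social")

print(nice.accepts_from(sewer))  # True: opt-out default, talks to everyone
nice.defederate(sewer)
print(nice.accepts_from(sewer))  # False: the problem server is blocked
```

Note that the block only has to happen on one side: the puppies-and-rainbows admin doesn't need the sewer server's cooperation to cut it off.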
So, is Mastodon a problem? No more than BBSs are a problem, or the Internet is a problem. Individual Mastodon instances may be, but they can be dealt with.
And, of course, nobody can spend $44 billion to run Mastodon into the ground in two weeks.
Leave Twitter or stay*?
If you’re leaving, where are you going?
My response:
For myself, I’ll likely keep my account active (deleting an account doesn’t remove it from being followed, and means that username is up for grabs, which means a bad actor/spammer could grab it and start showing up in the feeds of anyone who was still following that username), but I may start scaling back my usage (even more than I already have). There are still a lot of people on there that I value following, but if the ratio starts to change, so it goes.
My personal preference (at least in an ideal world) is for my own personal website. I own it, I can put what I want on it, and I’ve had a blog running there for more than 20 years now. How frequently I post to it varies depending on how much I’m sucked into Facebook or Twitter at any given time, but I’ve never let it totally die off, and maybe this will (once again) be impetus to start babbling there again.
Of course, the downside to personal blogs is that for "most people", they're not as visible. You have to either go to them or have some form of RSS newsreader set up (which isn't difficult, but if you don't know about that as an option, it doesn't do any good), and because they're not being algorithmically pushed into people's faces, you get fewer readers. And without "like" buttons or similar functionality (which I've not bothered to figure out how to do on mine), if the readers you do have don't comment, then you don't have the gratification of feedback. I'm well aware that this is one of the things that keeps sucking me back to Facebook: I can post the same thing here and on my blog, and I have no idea if anyone ever sees my blog, but here I'll get reactions and comments.
I also have accounts on both Mastodon and Cohost, and will every so often check back in to see what’s going on there. As always, if more people I know use those more often, I’m more likely to participate more often.
I sincerely believe that learning, growing, examining, and often changing beliefs is an integral part of being a responsible human being. My personal journey socially and politically has been ever leftwards, and there are many posts in the archives that I would not write the same way today, if at all.
Things I know exist in my archives that I would not write today:
General mockery of Britney Spears for no real reason other than being a pop queen. (Which, honestly, she’s very good at.)
Very suburban-white-background “I listen to all kinds of music except country and rap” sentiments. Lots of at-the-time unexamined racism and classism in those statements, plus they were never really all that true (classic country and “acceptable” rap were always part of my listening habits).
Probably a fair number of other statements with then-unexamined ableist, classist, racist, sexist, homophobic, or transphobic aspects or roots.
I’m sure there is a lot more; those are just the ones that pop into my head because I’ve come across them at one point or another recently while digging into my archives.
I’ve always considered myself to be open-minded and politically liberal, and while that’s true, the older I get, the more I have realized how many ingrained societal biases still exist within that basic framework. Working through those biases, recognizing them, and endeavoring to change them is an ongoing process, and one I hope I never give up on. It’s not always comfortable; it is always necessary.
I’ve been working for the past few days on constructing a Shortcut to use for quickly sending a link and block of text to whatever blogging software I’m using on whichever device I’m on at the moment. As of today, I’ve hit a point where it does everything I wanted it to when I started playing, so I’m designating this an official “version one” release (for posterity’s sake, I suppose I can refer to the prior two versions as the alpha and beta releases).
The Shortcut is now cross-platform, with many thanks to Jason Snell for giving me exactly the final pieces I needed.
Selecting some text on a webpage and then using the Share Sheet on iOS or the Services menu on macOS will grab the webpage link and the selected text, convert it to Markdown format, convert any relative URLs in the selected text to absolute URLs, and then place the final text into a new Ulysses sheet on iOS or MarsEdit post on macOS, all ready for any final edits before publishing to your blog.
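The relative-to-absolute URL step is the fiddliest part of that pipeline. Outside of Shortcuts, the same transformation can be sketched in a few lines of Python; the page URL and the simple Markdown-link regex here are illustrative assumptions on my part, not the Shortcut's actual actions.

```python
import re
from urllib.parse import urljoin

def absolutize(markdown: str, base: str) -> str:
    """Expand relative link targets in simple Markdown links.

    Rewrites every "](target)" so the target is resolved against
    the base page URL; absolute targets pass through unchanged.
    """
    return re.sub(
        r"\]\(([^)]+)\)",
        lambda m: "](" + urljoin(base, m.group(1)) + ")",
        markdown,
    )

page_url = "https://example.com/posts/2022/"  # hypothetical source page
print(absolutize("[about](/about/)", page_url))
# → [about](https://example.com/about/)
```

`urljoin` handles all the relative forms (`/about/`, `../img.png`, `#anchor`) per the URL spec, which is why it's worth reaching for instead of string concatenation.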
If this shortcut might be of use to you, either as-is or with some modifications for your particular needs, download, tweak if necessary, use, and (hopefully) enjoy!
Updated my iOS “Blog This” shortcut (for sending selected text from Safari to Ulysses in Markdown format) to add two fixes: slightly adjusting the output if no text is selected, and expanding relative URLs (my thanks to Memory Alpha for inspiring this fix).
Figured out how to create an iOS shortcut that grabs a webpage URL, title, and any selected text from Safari, formats it into a Markdown link and block quote, and then sends it to Ulysses as a new post to be published to my blog. Pretty happy with the result!
Today I finally finished repairing my Music (iTunes) library after it got mangled when I signed up for Apple Music (the service) a few months ago.
Apple Music has its benefits, but apparently signing up automatically activated the library sync feature, which started overwriting my local metadata with data from the cloud. I caught it before it got all the way through and figured out how to turn it off, but a large chunk of my music library lost a lot of the edits I'd made over the years: everything from song titles to artist names to custom artwork, covering tracks I'd purchased from the iTunes Music Store, purchased from Bandcamp, ripped from my own CDs, or even imported from my vinyl collection. Titles and names were changed, artwork was either replaced or removed…probably somewhere between a third and a half of my 37,416-item, 285 GB music library was affected.
The only reason I was even able to repair it all was that, well, Music (and iTunes before it) has been historically tweaky for long enough that I’ve gotten into the habit of making a manual backup of my music library every so often, separate from the Time Machine backup that’s done automatically, just because I don’t trust Music not to screw something up at some point.
I also discovered that Music reads metadata from two places: the metadata embedded in the individual files, and the "Music Library" file stored in the user's Music folder (~/Music). Much of the bad data that was being displayed in Music was actually being read from the "Music Library" file; apparently that was where the data from the cloud had been written. When I opened the info window on a track to fix it, Music would then read the embedded metadata from the actual track file, and the data (some of it, at least) would switch back to the correct information.
Of course, manually going through and loading every one of my 37,416 tracks wasn't at all realistic — but the Refresh a track from its file's metadata script from Doug's AppleScripts allowed me to select a chunk (I was able to do as many as 600 tracks at a time without it timing out) and let the script repair the metadata in the background. There were still some final corrections that needed to be made (this trick didn't fix the artwork that got lost or replaced, and many of the "Album Artist" fields still needed to be corrected manually), but those were easier to do once the script handled the bulk of the work.
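The batching trick (select a chunk, let the script run, repeat) generalizes to any too-big-to-do-at-once job. Here's a hedged Python sketch of the same idea; the batch size and library count are from my situation above, but the track IDs are made up and nothing here talks to Music itself.

```python
def batches(items: list, size: int = 600):
    """Yield successive fixed-size slices of a list.

    600 was roughly the largest chunk the repair script could
    handle without timing out, in my experience.
    """
    for start in range(0, len(items), size):
        yield items[start:start + size]

track_ids = list(range(37416))  # stand-in for the real library
chunks = list(batches(track_ids))

print(len(chunks))      # 63 batches of up to 600 tracks each
print(len(chunks[-1]))  # 216 tracks in the final partial batch
```

Sixty-three rounds of "select, run, wait" is still tedious, but it beats 37,416 rounds of opening the info window by hand.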
So, a few months after signing up for Apple Music, I finally have my local library back to a useable state.
Hey, Apple? Local data should NEVER be replaced by cloud data without warning, without explanation, and without active affirmative confirmation by the user. That was years of work I could have lost, and months of work repairing it. Get this bit of your system fixed, please. This sucked.
Also, trying to write a post about my music, the application Music, the service Apple Music, and Apple the company, and make it all coherent, is not an easy thing to do. I get that iTunes was a bloated beast and needed to be split up — though, really, Music isn't that much better and is still missing a lot of features (like a usable in-app search) — but did it have to be renamed so generically?