Aaron Parecki

Articles

2018-10-27 Some OwnYourSwarm Updates
2018-10-15 Portable Wireless Live Video Rig
2018-08-01 The Steno Gherkin
2018-07-07 OAuth for the Open Web
2018-06-30 Sending your First Webmention from Scratch
2018-06-23 Your final updates for IndieWeb Summit!
2018-06-03 Improving the HTML type="url" Field
2018-05-27 Dropping Twitter Support on IndieAuth.com
2018-05-26 You're Invited to IndieWeb Summit!
2018-04-20 An IndieWeb reader: My new home on the internet
2018-04-09 A MetaWeblog to Micropub Gateway
2018-04-01 First Quarter 2018 in Review
2018-03-29 php-mf2 v0.4.3: Optional HTML5 Support
2018-03-12 Building an IndieWeb Reader
2018-02-07 IndieAuth-Client-PHP 0.3.1
2018-02-05 OwnYourGram Updates
2018-01-23 WebSub and IndieAuth Published on w3.org!
2018-01-21 Pixel Art!
2018-01-09 OAuth 2.0 Simplified Is Now Available On Kindle!
2018-01-06 Owning my Code Snippets
  • Some OwnYourSwarm Updates

    Sat, Oct 27, 2018 3:40pm -07:00

    Today I launched some updates to OwnYourSwarm, the service that sends your Swarm checkins to your own website. It does this by watching your Swarm account and sending checkins to your site via Micropub. 

    Private Posts

    I made two changes to how OwnYourSwarm can handle private posts. Private posts are currently an experimental feature in Micropub, accomplished by adding a new property to posts called "visibility". The WordPress Micropub plugin already has support for this property, so if you use WordPress you can start using this feature today!

    If you mark a checkin in Swarm as "off-the-grid", OwnYourSwarm will now include visibility=private in the Micropub request. Your website can recognize this property and handle setting the post as private.

    There is also a checkbox to have OwnYourSwarm always mark posts as private. You can use this to import all your checkins to your site as private posts if you don't want them public on your website.

    Automatically Add Tags to Posts

    As with OwnYourGram, there is now a way to have OwnYourSwarm always include a list of tags in your checkins. You can use this to automatically file your checkins into a specific page on your website, by adding the tag "checkin" for example.
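
Both of these settings just add properties to the Micropub request OwnYourSwarm sends to your site. Below is a minimal Python sketch of how such a request body could be built; the function name and content text are hypothetical examples, while "visibility" and "category" are the Micropub property names for visibility and tags.

```python
from urllib.parse import urlencode

def build_micropub_checkin(content, private=False, tags=()):
    """Build a form-encoded Micropub request body for a checkin post."""
    params = [("h", "entry"), ("content", content)]
    if private:
        # The experimental visibility property marks the post as private
        params.append(("visibility", "private"))
    for tag in tags:
        # Micropub represents tags as (possibly repeated) category values
        params.append(("category[]", tag))
    return urlencode(params)

body = build_micropub_checkin("Checked in at a coffee shop",
                              private=True, tags=["checkin"])
# POST `body` to your site's Micropub endpoint with an
# Authorization: Bearer <token> header.
```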

    Webmention Settings for Comments

    By default, OwnYourSwarm will always try to send Webmentions for any responses to your checkins, such as when someone likes or comments on your checkin, as well as when Swarm itself shows the little tidbits like "11 weeks in a row at coffee shops" awarding coins.

    There are now two additional settings you can use to decide whether you want to receive Webmentions for either type of response. If you'd prefer not to get the Swarm coins, you can disable that. If you'd prefer not to have anybody else's comments appear on your checkins, you can disable that as well.

    I also added a user agent string when OwnYourSwarm sends webmentions so that you'll be able to identify the HTTP requests in your server logs.
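
A Webmention itself is just a form-encoded POST of a source and target URL. Here is a hedged Python sketch of preparing such a request with an identifying user agent; the URLs and the user-agent string are hypothetical examples, not the exact values OwnYourSwarm uses.

```python
from urllib.parse import urlencode

def build_webmention(source, target,
                     user_agent="OwnYourSwarm (example UA string)"):
    """Prepare the body and headers for a Webmention POST request."""
    body = urlencode({"source": source, "target": target})
    headers = {
        "Content-Type": "application/x-www-form-urlencoded",
        # The User-Agent makes these requests identifiable in server logs
        "User-Agent": user_agent,
    }
    return body, headers

body, headers = build_webmention(
    "https://example.com/checkins/123",
    "https://aaronparecki.com/2018/10/27/1/some-post")
# POST `body` with `headers` to the target site's webmention endpoint.
```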

    Hope you enjoy continuing to own your checkins!

    Portland, Oregon • 62°F
    #ownyourswarm #indieweb
  • Portable Wireless Live Video Rig

    Mon, Oct 15, 2018 9:01am -07:00

    I've been looking for ways to slim down the amount of equipment I need to bring to record conference talks, both to make it easier to travel to other cities and to speed up the setup time during an event.

    This post outlines my current favorite set of hardware for recording and livestreaming conference talks and meetups. I'm able to fit all of this into a backpack and carry it on my bicycle to local events, or take it on planes to events in another country.

    Video Switcher

    The SlingStudio system has been a total game changer, packing an unbelievable amount of power into a tiny box.

    At the core of the system is the SlingStudio Hub. This is the brains of the operation: a video switcher, encoder, and recorder. This device broadcasts its own wifi hotspot, which you can then connect your cameras and controllers to. You can use any HDMI camera as a video source by using a CameraLink to wirelessly connect the HDMI device, or you can use any iOS or Android device as a camera as well. This provides a great opportunity to have a super compact rig, since iPhone cameras are actually pretty good now.

    For small productions, I will bring one iPhone camera, and one camcorder connected via HDMI. The iPhone provides a good enough picture for a wide shot of the room, and the camcorder provides good optical zoom and low light capability for a close-up of the presenter.

    The rear of the device provides a few ports, most importantly an HDMI input and audio input, which means you have a built-in way to grab the slides from the presenter as well as a good audio feed.

    I usually plug an HDMI 1x2 splitter into the input, so that I can send the presenter's computer to the house projector as well as to this device. This makes the computer show up as a camera angle in the switcher. The Hub can accept a huge variety of HDMI resolutions as input and handles the scaling itself. I haven't yet found a computer that this device couldn't handle.

    Switcher Controller

    The Hub is controlled via an iPad or Mac app. I usually use an iPad since it's nice having a dedicated device with a touch screen for this, plus it's easier to walk around with the iPad.

    You start by connecting the iPad to the wifi hotspot the Hub broadcasts. Then, when you launch the SlingStudio Console app, it connects to the Hub and provides a controller interface where you can see all the camera angles and switch between them.

    Cameras

    Since my goal is to have this pack up entirely in a backpack, I wanted to find the smallest options for cameras, even if it comes somewhat at the expense of quality. I typically use one or two iPhone SEs ($100 used), and a Canon Vixia HF R500 ($200). (The R500s are discontinued, replaced by the Canon Vixia HF R800, which is only a minor upgrade).

    Close-Up

    I use the Canon for a close-up shot of the presenter, since it has optical zoom and is pretty good at low light. The camera itself doesn't weigh much, so it can fit on a small tripod. I use a short tripod with a monopod extension, which has the benefit of having a pretty small footprint. This wouldn't hold up a DSLR when fully extended, but handles the Canon just fine.

    The Canon camera also has an audio input, so I can connect a wireless microphone receiver to it, such as the super compact Sony ECMAW4 Bluetooth microphone.

    The Canon provides a mini HDMI output, which connects to the micro HDMI input on the SlingStudio CameraLink transmitter. The CameraLink connects wirelessly to the Hub, so I can place the camera wherever. The battery in the CameraLink lasts a couple hours, long enough that I don't need to worry about it for a short talk, but if I'm going to be filming for a whole evening I'll make sure to connect it to micro USB power or at least an external battery pack.

    Wide/Rear View

    With one good close-up view of the presenter, I just need a wide view as a secondary or fallback camera angle. I usually place this camera in the back of the room so that the audience as well as the projector are visible. Since I have both a close-up of the presenter as well as their slides brought in directly, I can get away with using an iPhone as this camera angle despite its slightly reduced quality.

    The iPhone SE has a pretty decent camera. It's the same camera as in the iPhone 6s line, but you can get a used iPhone SE for about $100 now, making this the cheapest way to get another camera angle into the mix.

    To use the iPhone as a camera, you first connect the iPhone to the wifi hotspot the Hub broadcasts. Then you launch the SlingStudio Capture app and it will instantly show up as a camera angle in the Hub. You can also long-press on the iPhone screen to lock the focus and exposure, which is useful when your presenters have slides that switch between white and black backgrounds, which would otherwise confuse the phone's auto-focus and auto-exposure.

    Running the camera and wifi on an iPhone for a whole event would drain the battery in about half an hour, so I always make sure to plug in the phone during the event. The stock charger cable is usually too short to be useful, so I bring a 10' lightning cable, which gives me enough length to reach a power outlet somewhere nearby.

    Audio

    Audio is of course a huge part of getting a good quality recording of a presentation. The mics built into camcorders or the iPhone will not get good results at all, since they aren't that good to begin with, and the devices are typically 10+ feet from the presenter. Instead, you need a microphone super close to the presenter, like a lav mic or handheld mic.

    Depending on the situation, I have a few different setups I use for capturing audio. If the venue is providing amplification for the presenter, then I first try to find a way to tap into the house audio. The Hub has a 1/8" input, so I just need to make sure to place the Hub close enough to the house mixing board to run an audio cable to it.

    If I need to bring my own microphones and audio gear, or if I'm recording a discussion around a table, then I'll bring either a wireless mic or wired stage or boundary microphones.

    Wireless Microphones

    The smallest wireless mic I've found is the Sony ECMAW4 Bluetooth microphone. The transmitter and receiver are the same shape, each only slightly larger than the AAA battery that powers it. It's relatively inexpensive too, at $150. There is a microphone built in, but it also has a 1/8" jack to connect a lav mic. You definitely get better audio using a lav mic, so if you can ask the presenters to wear one, I highly recommend it.

    You can connect the receiver to either the SlingStudio Hub or the Canon camera using a 1/8" cable.

    Tabletop Microphone

    If you need to capture audio around a whole table, or if you can't get your presenters to wear a microphone, then the best option is to place a boundary microphone on the podium.

    Audio Technica makes a fantastic wireless mic system that runs on 2.4GHz rather than a dedicated wireless mic frequency. There are new FCC regulations coming that will re-allocate the frequencies many wireless mics use, so it will no longer be legal to operate many of them. This system uses 2.4GHz, the same frequencies that Bluetooth and Wi-Fi use, so it will always be safe to use.

    Pair the ATW-T1006 Boundary Mic with the ATW-R1700 Receiver and connect the receiver to the Hub. The benefit of using a wireless mic for this, of course, is that you don't need to worry about placing the Hub near the stage or podium. This also cuts down on the number of wires you need to bring, which can save a lot of packing space.

    Wired Microphones

    For one reason or another, you may find it better to use a wired microphone. A wired microphone will usually provide better audio quality and be more stable than wireless mics, though it does come at the cost of more wires to carry and more setup time to connect them.

    The Saramonic SmartRig+ 2-Channel Mixer is a small mixer that provides phantom power so you can use nice microphones with it. It plugs into the 1/8" jack on the SlingStudio hub. It does run on a 9V battery with no external power option, although I haven't been able to drain the battery during a normal one-day event yet.

    I have two microphones I usually use depending on the situation.

    • Shure Boundary Condenser Microphone - This microphone will pick up everyone sitting around a table. You can also set it on a podium to not worry about your presenters fiddling with microphones. If you have a really large room, connect two of these to the mixer.
    • Shure SM58 - The Shure SM58 is a long-time standard in stage microphones. It works best when the speaker is a few inches away from the mic, and it does an amazing job of isolating sounds to avoid background noise. You'll need to either ask your presenters to hold the microphone, or give them a stand that places the mic close to their mouth.

    Livestreaming

    Once all the audio and video sources are connected to the hub, you can press record and everything gets recorded as both the mixed track as well as individually. If you make a mistake while live cutting between camera angles, you can always recover by grabbing the original footage from the camera angle you need.

    The other amazing thing is the SlingStudio Hub also has built-in streaming capabilities. It can connect to the venue's wifi, and then connect to Facebook or YouTube to broadcast a livestream. This makes it super easy to both stream an event live, while also recording the raw video for later editing.

    There's not much more to it than that: you connect to a wifi hotspot and press stream. I've had good luck even streaming from an iPhone's wifi hotspot. You can choose the bitrate to stream at; anything from 2-5 Mbps will give you a good result.

    I'm super impressed that this device lets me pack so much into a single package instead of using separate devices for each. This all fits into a backpack, along with my computer and other electronics I normally bring.

    Published Videos

    Here are some videos I've produced with this rig so you can see the final results!

    • Donut.js April 2018
    • IndieWeb Summit 2018
    Portland, Oregon
    #video #livestream
  • The Steno Gherkin

    Wed, Aug 1, 2018 7:50pm -07:00

    Ever since Donut.js started having live captions for their talks every month, I've been curious how people can learn to type that fast. That led me down quite the rabbit hole of internet research, where I stumbled into the open source stenography community, largely pioneered by Mirabai Knight, who does the captions for Donut.js! (Small world!)

    I thought it would be fun to try to learn some basic stenography, so I set out to find a keyboard for this. There are several existing QWERTY keyboards you can use (the keyboard needs to support pressing several keys at the same time, which most keyboards, especially laptop keyboards, do not), and there are a few custom keyboards you can build as well.

    I first learned about the Gherkin keyboard from the Plover Blog. It's a tiny keyboard with 3 rows of 10 keys, where the keys are arranged in a grid rather than staggered like typical keyboards. It's commonly referred to as a "30% keyboard" in the mechanical keyboard community. This key arrangement lends itself well to the steno key layout, which looks like this.

    The steno keyboard requires 24 or 26 keys (depending on whether you count the S and * as one or two keys), so this 30-key keyboard has more than enough to handle it.

    Being brand new to stenography, as well as the mechanical keyboard community, it was a real trick to learn all the new terminology needed to piece this together. I haven't actually bought a keyboard in a long time, much less assembled one from scratch.

    The Gherkin kit is really well done, but that's still way more soldering than I am comfortable doing myself. Thankfully, Paul on the Plover chat was offering to do all the soldering if anyone would just ship all the parts to his address. I gladly took him up on that offer, and quickly placed an order from three different online stores for all the pieces.

    Electronics Part List

    • Gherkin Kit from Spacecat.design (just the PCBs)
    • Gateron Switches (30 Clear) from Flashquark (this is the actual key mechanism)
    • Arduino Pro Micro
    • 1N4148 Diodes (50)

    Eventually these all arrived at Paul's house, and he sent some in-progress pictures while he was soldering.

    In order to get the whole thing to be thinner, he even filed down the solder joints on the Arduino that's on the bottom, to save a few millimeters.

    Final Assembly

    The soldered board showed up in my mailbox, which was pretty magical. (Thanks again Paul!) 

    I then needed to attach the two plates and add the keycaps.

    Mechanical Part List

    • Blue and Grey G20 Keycaps from pimpmykeyboard.com (30 Blue BCT, 10 Grey GDE)
    • M2 x 4mm Screws (28)
    • M2 x 5mm Standoffs (14)
    • Clear rubber feet

    The keycaps were the hardest part to figure out, since there are so many options. I needed ones compatible with the switches I got, which are the "Cherry MX" style. I settled on the "G20" style, which are slimmer than normal keyboards, and perfectly square and flat. I bought two different colors, using the blue ones for the keys that are actually on the steno keyboard, and grey for the unused keys.

    It took me a few tries to get screws and standoffs of the right size, since I wanted this to be as thin as possible. I ended up with 5mm standoffs, and 4mm screws. Adding the thickness of the PCB, the screws end up almost meeting in the middle of the standoffs, and it leaves just barely enough room for the Arduino sandwiched between the two boards.

    Lastly, I wanted to find a nice case for this. Turns out there is a case for a portable hard drive that fits this surprisingly well, once you cut out the little middle flap.

    So that's been an exciting adventure so far! Now I just have to get through the somewhat daunting number of steno lessons and practice practice practice!

    Portland, Oregon
    #steno #plover #gherkin #keyboard
  • OAuth for the Open Web

    Sat, Jul 7, 2018 9:30am -07:00

    OAuth has become the de facto standard for authorization and authentication on the web. Nearly every company with an API used by third party developers has implemented OAuth to enable people to build apps on top of it.

    While OAuth is a great framework for this, the way it has ended up being used is much more centralized and closed than prior efforts like OpenID 1. Every service that spins up an OAuth-enabled API ends up being its own isolated system. For example, if I want to build an app that can read someone's step count from FitBit, I have to first go register as a developer on FitBit's website in order to get API keys to use with their OAuth API. 

    This works okay for major services like Google, Twitter, Facebook, and even FitBit, but breaks down when you start to consider use cases like having someone's personal WordPress blog be its own OAuth server. If I want to build an app that lets you upload photos to your WordPress site, I'm obviously not going to be able to register for API keys on everyone's own WordPress installation. Enabling third party clients to be built against systems like WordPress or Mastodon opens up a huge possibility for some really interesting things. The trick is always how these apps can authenticate the user and obtain a token they can use to access those APIs.

    This post details a few specific challenges preventing OAuth from being used by independent websites, as well as solutions to each.

    Client Registration

    The first major hurdle to overcome is the need for the developer to register to get API keys for the service. In a world where everyone's own website is its own OAuth server, it's obviously not practical to have an app developer register API keys at each.

    In OAuth, client registration gives us a few specific things:

    • Provides a unique ID that is used to identify the app throughout the OAuth process, called the client ID
    • Provides a place to enter the name and icon for the app which is displayed during login
    • Registers one or more redirect URLs for security
    • For "confidential clients" (web server apps), registration also provides the client with a client secret

    Note that in traditional OAuth, client secrets are not used by mobile apps or JavaScript apps, and OAuth servers will often not even issue secrets to those types of apps. Since we're trying to avoid registration entirely, we can also just avoid using client secrets at all, and leverage the same protections OAuth already has in place for clients that can't use a secret.

    In order to avoid registration, we need a solution for the first three bullet points above.

    Client ID: Every application needs a unique identifier. If we're talking about turning every website into an OAuth provider, we need a way to have globally unique identifiers for every OAuth app. It turns out we already have a mechanism for this: URLs! In this Open Web version of OAuth, client IDs can be the application's URL. For web-based apps, this is straightforward, as it's simply the website the app is running on. For native apps, this can be the application's "about" page.

    Application name and icon: Since the application's client ID is a URL, we can assume every application has a web page that talks about it, and treat that web page as the place the client defines its own metadata like name and icon. A simple way to accomplish this is with Microformats, so that the application's web page just needs to add a couple classes around the name and icon of the app. This is currently documented and implemented as the h-app microformat.

    Redirect URL registration: This one is a bit more subtle. The purpose of redirect URL registration is to prevent an attacker from tricking an authorization server into sending authorization codes to the attacker. This becomes especially important when we aren't using client secrets. The trick is that since client IDs are already URLs, we can shortcut the normal registration process by declaring a rule that redirect URLs have to be on the same domain as the client ID. This way, we can avoid a situation where an application claiming to be good.example.com sets a redirect URL to attacker.example.org and steals the authorization code. The only way to get the authorization code to attacker.example.org would be to set the client ID to that domain as well, which a user would hopefully notice.
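
This same-domain rule is straightforward to enforce in code. Here is a hedged Python sketch of the check an authorization server could perform; the function name is mine, not from any spec, and a real server may additionally support explicitly whitelisted redirect URLs.

```python
from urllib.parse import urlparse

def redirect_allowed(client_id, redirect_uri):
    """Allow the redirect only when it shares the client ID's
    scheme and host, per the same-domain rule described above."""
    c, r = urlparse(client_id), urlparse(redirect_uri)
    return (c.scheme, c.hostname) == (r.scheme, r.hostname)

redirect_allowed("https://good.example.com/",
                 "https://good.example.com/callback")   # allowed
redirect_allowed("https://good.example.com/",
                 "https://attacker.example.org/cb")     # rejected
```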

    User Accounts

    There are two different situations to consider with regards to user accounts: authentication and authorization. Authentication is the process of proving the identity of the person signing in. Authorization is how an application obtains permission to do something to someone's account.

    When we talk about authentication, we are talking about wanting to allow an unknown user to identify themselves to the site they're logging in to. Common examples of this are using your email address as your identity to sign in to a website. You bring an existing identity (your email address) and then authenticate (usually by clicking a link that was sent to your email). The original version of OpenID was created to solve this problem on the web. People identified themselves with a URL, which they were able to prove they controlled using OpenID. This allows a new user to log in to a site without needing a prior relationship with the site.

    When we talk about authorization, the situation is subtly different. In this case, we're talking about a user of a website wanting to give permission to a third-party app to access some part of their account. We're very used to this pattern now, which is the typical OAuth use case of granting an application the ability to access your Google Calendar, or logging in to a third party Twitter app.

    Authorization: There isn't really a challenge unique to OAuth on the Open Web with regard to authorization. Once the client registration problem is solved, everything else falls into place nicely. It is assumed that users are authorizing an application to access an account they already have, so the application will just end up with an access token that works with their existing account.

    Authentication: Where we need to define some new behavior is talking about authentication. In this case, we want users to be able to bring an existing identity and use it to log in to other places. This means we need a way to uniquely identify users across the entire web. We can again use URLs as the solution! Every user is identified by a URL. This can be a short URL like someone's domain name, e.g. https://aaronparecki.com/, or for a site with multiple users, can be a URL that contains a path specifying a particular user on the site, e.g. https://github.com/aaronpk. 

    Discovery

    With traditional OAuth services, discovery is not needed since the application author knows which OAuth server they're talking to before they start building the app. There is typically a "Sign in with ____" button in the application that begins the authorization process. In the case of using OAuth for authentication, the common pattern is to include buttons for several common "social login" providers such as Facebook, Google, Twitter and LinkedIn. Before the "social login" space essentially consolidated to these four, there were sometimes a dozen of these buttons on an application's login page, which eventually became known as the "NASCAR problem".

    In a world where every WordPress or Gitlab site is its own OAuth provider, there obviously can't be a button for each on a login screen. Instead, we need to find out from the user which server to use to authenticate them. 

    Since we previously stated that every user identifier is a URL, we can ask the user to enter their URL in the sign-in screen, and then fetch that URL and discover their authorization server from there.

    Once we've found the user's authorization endpoint, we can start a normal OAuth request and send them to their server to authenticate. When the server redirects back to the application, it will go and verify the authorization code with their authorization endpoint just like normally happens with OAuth.
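
The discovery step can be as simple as fetching the user's URL and looking for a link with rel="authorization_endpoint" in it. Below is a minimal Python sketch of just the HTML-parsing half of that; fetching, HTTP Link headers, and relative-URL resolution are left out, and the example page snippet is hypothetical.

```python
from html.parser import HTMLParser

class EndpointFinder(HTMLParser):
    """Find the first <link> or <a> whose rel includes authorization_endpoint."""
    def __init__(self):
        super().__init__()
        self.endpoint = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        rels = (a.get("rel") or "").split()
        if tag in ("link", "a") and "authorization_endpoint" in rels \
                and self.endpoint is None:
            self.endpoint = a.get("href")

def discover_authorization_endpoint(html):
    finder = EndpointFinder()
    finder.feed(html)
    return finder.endpoint

page = '<link rel="authorization_endpoint" href="https://aaronparecki.com/auth">'
discover_authorization_endpoint(page)  # "https://aaronparecki.com/auth"
```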

    Knowing Who Logged In

    While knowing any user identity information is technically not part of OAuth, we do need the server to return a user identifier when using OAuth for authentication. In practice, most applications want at least a unique user identifier in the authorization case as well.

    We've previously said that user identifiers are URLs, which solves the global user identity problem, and gives us a mechanism to discover the user's OAuth server. So all we need is a way to return this information to the application after the user has authenticated. 

    OAuth gives us an easy opportunity to return this to the application: in the access token response when the application sends the authorization code to obtain an access token. The server can at that point return the full user identifier of the user that logged in. As long as the domain name matches the domain that the user entered at the start, the application can consider it successful. This also gives the authorization server the opportunity to canonicalize the user identifier, correcting "http" to "https", or adding a path component to the user's profile URL.
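
That final check can be as small as comparing hostnames. A minimal Python sketch, assuming the hypothetical function name below; a stricter server or client might compare more than the hostname.

```python
from urllib.parse import urlparse

def profile_url_matches(entered_url, returned_me):
    """Accept the canonicalized profile URL from the token response as
    long as its domain matches what the user originally entered. This
    permits http->https upgrades and added path components."""
    return urlparse(entered_url).hostname == urlparse(returned_me).hostname

profile_url_matches("http://aaronparecki.com", "https://aaronparecki.com/")  # ok
profile_url_matches("http://aaronparecki.com", "https://evil.example.com/")  # rejected
```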

    Let's do this!

    By now, hopefully you're thinking "this sounds great, Aaron, someone should write this up as an OAuth extension!" I'm glad you asked!

    The IndieAuth spec, an OAuth 2.0 extension

    Earlier this year, I wrote this all up as an extension to OAuth 2.0, called IndieAuth. IndieAuth encapsulates these small additions needed for OAuth 2.0 to work in the Open Web.

    Although the spec was only published in January, it had actually been implemented for several years before that. There are many implementations of this extension on the server side, everything from standalone authorization server projects, to a WordPress plugin, and it's even implemented by a commercial service, Micro.blog. As far as consuming apps, nearly every Micropub app has implemented this for logging users in.

    For further details on implementing this extension, there are several guides available depending on whether you're writing a client, a server, or just part of a server.

    • Authenticating users with IndieAuth
    • Obtaining an access token with IndieAuth
    • Creating an Authorization Endpoint
    • Creating a Token Endpoint

    There are a few existing open source projects you can use to get started if you don't want to write your own!

    • selfauth - a standalone authorization server using a simple password login
    • IndieAuth for WordPress - a plugin that turns your WordPress install into an OAuth 2.0 server
    • IndieAuth for Drupal - a Drupal plugin that provides a built-in OAuth 2.0 server
    • Acquiescence - an authorization endpoint written in Ruby that authenticates users via GitHub

    For further reading, check out the IndieAuth spec. Feel free to drop in to the IndieWeb chat if you'd like to talk about this, or you can reach me on Twitter or from my website.

    Portland, Oregon • 72°F
    #indieauth #oauth #oauth2 #indieweb
  • Sending your First Webmention from Scratch

    Sat, Jun 30, 2018 8:35pm -07:00

    Webmention is one of the fundamental indieweb building blocks. It enables rich interactions between websites, like posting a comment or favorite on one site from another site. This post will walk you through the simplest way to get started sending webmentions to other sites so that you can use your own site to join the conversations happening on the Indie Web.

    What do you need to walk through this tutorial? Just static files and simple command line tools, so that you can easily adapt this to any environment or programming language later.

    Get started

    First, we'll create a new HTML file that we'll use to contain the comment to post. At the very minimum, that file will need to contain a link to the post we're replying to.

    <!doctype html>
    <meta charset="utf-8">
    <title>Hello World</title>
    <body>
      <p>in reply to: <a href="https://aaronparecki.com/2018/06/30/11/your-first-webmention">@aaronpk</a></p>
      <p>Trying out this guide to sending webmentions</p>
    </body>
    

    Go ahead and copy that HTML and save it into a new file on your web server, for example: https://aaronpk.com/reply.html. Take your new post's URL and paste it into the webmention form at the bottom of this post. After a few seconds, reload this page and you should see your post show up under "Other Mentions"!
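
The webmention form on this page is doing two things on your behalf: discovering the target's webmention endpoint, and POSTing your source and target URLs to it. If you'd like to do those steps yourself, here is a minimal Python sketch; the discovery regex is deliberately simplistic (a robust client also checks the HTTP Link header, handles attribute order, and resolves relative URLs).

```python
import re
import urllib.request
from urllib.parse import urlencode

def find_webmention_endpoint(html):
    """Naive discovery: look for rel="webmention" in the target's HTML."""
    m = re.search(r'<(?:link|a)[^>]+rel="webmention"[^>]+href="([^"]+)"', html)
    return m.group(1) if m else None

def send_webmention(endpoint, source, target):
    """POST source and target to the discovered endpoint."""
    data = urlencode({"source": source, "target": target}).encode()
    req = urllib.request.Request(endpoint, data=data, method="POST")
    return urllib.request.urlopen(req)  # a 2xx response means accepted

# Example discovery against a snippet of HTML:
page = '<link rel="webmention" href="https://aaronparecki.com/webmention">'
find_webmention_endpoint(page)  # "https://aaronparecki.com/webmention"
```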

    First Reply

    Making it look better

    That's a great start! But you might be wondering where your comment text is. To make your comment show up better on other people's websites, you'll need to add a little bit of HTML markup to tell the site where your comment text is, and to add your name and photo.

    Let's take the HTML from before and add a couple pieces.

    <!doctype html>
    <meta charset="utf-8">
    <title>Hello World</title>
    <body>
      <div class="h-entry">
        <p>in reply to: <a class="u-in-reply-to" href="https://aaronparecki.com/2018/06/30/11/your-first-webmention">@aaronpk</a></p>
        <p class="e-content">Trying out this guide to sending webmentions</p>
      </div>
    </body>
    

    Note the newly added class attributes. These are Microformats! They tell the site that's receiving your webmention where to find specific parts of your post. We first wrap the whole post in a <div class="h-entry"> to indicate that this is a post. Then we add a class to the <a> tag linking to the post we're replying to, as well as a class to the element that contains our reply text.

    Now, take your URL and paste it into the webmention form below again. After a few seconds, reload the page and your reply should look more complete here!

    Second Reply

    Now we see the text of the reply, and also notice that it moved out of the "Other Mentions" section and shows up along with the rest of the replies!

    Of course this web page still looks pretty plain on your own website, but it's up to you to make it look however you like for your visitors! As long as you leave the h-entry and other Microformats in your post, you can add additional markup and style the page however you like!

    Adding your name and photo

    Let's make the comment appear with your name and photo now! To do this, you'll need to add a little section to your web page that indicates who wrote the post.

    In Microformats, the author of a post is represented as an h-card. An h-card is another type of object like h-entry, but is intended to represent people or places instead of posts. Below is a simple h-card that we'll add to the post.

    <div class="h-card">
      <img src="https://aaronpk.com/images/aaronpk.jpg" class="u-photo" width="40">
      <a href="https://aaronpk.com/" class="u-url p-name">Aaron Parecki</a>
    </div>
    

    When we add this h-card into the post we've written, we need to tell it that this h-card is the author of the post. To do that, add the class u-author before the h-card class like the example below.

    <!doctype html>
    <meta charset="utf-8">
    <title>Hello World</title>
    <body>
      <div class="h-entry">
        <div class="u-author h-card">
          <img src="https://aaronpk.com/images/aaronpk.jpg" class="u-photo" width="40">
          <a href="https://aaronpk.com/" class="u-url p-name">Aaron Parecki</a>
        </div>
        <p>in reply to: <a class="u-in-reply-to" href="https://aaronparecki.com/2018/06/30/11/your-first-webmention">@aaronpk</a></p>
        <p class="e-content">Trying out this guide to sending webmentions</p>
      </div>
    </body>
    

    Now when you re-send the webmention, the receiver will find your author name, photo and URL and show it in the comment!

    Second Reply

    Great job! If you've successfully gotten this far, you're now able to comment on things and even RSVP to events using your own website!

    One more detail you'll want to include on your posts is the date the post was written, which ensures the receiving website shows the correct timestamp. If you eventually incorporate this into a static site generator or CMS where you show a list of your replies all on one page, you'll also want to add a permalink to the individual reply. Typically an easy way to solve both is with the markup below.

    <a href="https://aaronpk.com/reply.html" class="u-url">
      <time class="dt-published" datetime="2018-06-30T17:15:00-0700">June 30, 2018</time>
    </a>
    

    We can add that to the post below the content.

    <!doctype html>
    <meta charset="utf-8">
    <title>Hello World</title>
    <body>
      <div class="h-entry">
        <div class="u-author h-card">
          <img src="https://aaronpk.com/images/aaronpk.jpg" class="u-photo" width="40">
          <a href="https://aaronpk.com/" class="u-url p-name">Aaron Parecki</a>
        </div>
        <p>in reply to: <a class="u-in-reply-to" href="https://aaronparecki.com/2018/06/30/11/your-first-webmention">@aaronpk</a></p>
        <p class="e-content">Trying out this guide to sending webmentions</p>
        <p>
          <a href="https://aaronpk.com/reply.html" class="u-url">
            <time class="dt-published" datetime="2018-06-30T17:15:00-0700">June 30, 2018</time>
          </a>
        </p>
      </div>
    </body>
    

    Automatically sending webmentions

    The last piece to the puzzle is having your website send webmentions automatically when a new post is created.

    This part will require writing some code in your language of choice. You'll start by making an HTTP request to fetch the contents of the page you're replying to, then looking in the response for the webmention endpoint.

    We can simulate this on the command line using curl and grep.

    curl -si https://aaronparecki.com/2018/06/30/11/your-first-webmention | grep rel=\"webmention\"
    

    The response will include any HTTP Link headers or HTML <link> tags that have a rel value of "webmention".

    Link: <https://webmention.io/aaronpk/webmention>; rel="webmention"
    <link rel="webmention" href="https://webmention.io/aaronpk/webmention">
    

    If you get more than one, the first one wins. You'll need to extract the URL from the header or tag (resolving it against the page's URL if it's relative), and then send the webmention there.
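    To make the discovery step concrete, here's a rough sketch in JavaScript of pulling the endpoint out of a Link header or an HTML body with regular expressions. The function names are made up for illustration, and real-world HTML varies enough (attribute order, quoting) that a proper HTML parser or a webmention library is the safer choice.

    ```javascript
    // Sketch: extract a webmention endpoint from an HTTP Link header
    // or an HTML body. Endpoints may be relative, so resolve them
    // against the URL of the page you fetched.

    function endpointFromLinkHeader(header, pageUrl) {
      // Matches e.g. <https://webmention.io/aaronpk/webmention>; rel="webmention"
      const m = header.match(/<([^>]*)>\s*;\s*rel="?[^"]*\bwebmention\b/);
      return m ? new URL(m[1], pageUrl).href : null;
    }

    function endpointFromHtml(html, pageUrl) {
      // Simplified: assumes the rel attribute comes before href, as above
      const m = html.match(/<(?:link|a)\b[^>]*rel="?[^"]*\bwebmention\b[^"]*"?[^>]*href="([^"]*)"/);
      return m ? new URL(m[1], pageUrl).href : null;
    }
    ```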

    Sending a webmention is just a simple POST request to the webmention endpoint with two URLs: the URL of your post (source) and the URL of the post you're replying to (target).

    curl -si https://webmention.io/aaronpk/webmention \
      -d source=https://aaronpk.com/reply.html \
      -d target=https://aaronparecki.com/2018/06/30/11/your-first-webmention
    

    The only significant part of the response is the HTTP response code. Any 2xx response code is considered a success. You'll most often receive either a 202 which indicates that the webmention processing is happening asynchronously, or if the receiver processes webmentions synchronously and everything worked, you'll get a 201 or 200.
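    The same POST can be sketched in JavaScript (using the fetch built into Node 18+ and modern browsers); the function names here are illustrative, not from any specific library:

    ```javascript
    // Any 2xx response means the webmention was accepted:
    // 202 = queued for asynchronous processing, 200/201 = already processed.
    function isWebmentionSuccess(status) {
      return status >= 200 && status < 300;
    }

    // Sketch: POST the source and target URLs to the discovered endpoint
    // as a form-encoded body.
    async function sendWebmention(endpoint, source, target) {
      const response = await fetch(endpoint, {
        method: 'POST',
        headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
        body: new URLSearchParams({ source, target }).toString(),
      });
      return isWebmentionSuccess(response.status);
    }
    ```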

    In practice, you'll probably use a library for discovering the endpoint and sending the webmention, so here are a few pointers to start you out in a variety of languages.

    • Ruby
    • PHP
    • Node
    • Python
    • Go
    • Elixir
    • ...more on indieweb.org

    Hopefully this guide was helpful to get you going in the right direction!

    If you want to dive into the weeds, check out the Webmention spec as well as more details on reply posts.

    When you want to put your automatic webmention sending implementation to the test, try sending webmentions to all of the links on the test suite, webmention.rocks!

    If you have any questions or run into any issues, feel free to ping me or anyone else in the IndieWeb chat!

    11 likes 1 repost 5 bookmarks 13 replies 27 mentions
    #webmention #indieweb #tutorial #microformats
    Sat, Jun 30, 2018 8:35pm -07:00
  • IndieWeb Summit 2018 Attendees

    Your final updates for IndieWeb Summit!

    Sat, Jun 23, 2018 1:01pm -07:00

    Hello! Only a couple more days before we all come together for IndieWeb Summit! I am very much looking forward to the week, and I hope you all have a fantastic time while you're in Portland.

    Here are a few last details before we meet. You'll probably want to save this update so you can refer to it as you make your way to Portland for the festivities.

    Weather

    It looks like we'll have some pretty great weather, much less hot than last year! The forecast is showing a high of 75 and low of 50, with a chance of rain Monday morning and dry the rest of the week. I'm looking forward to sharing Portland's nice weather with you in contrast to last year's 100+ degree days!

    weather forecast

    Breakfast and Lunch

    We will be providing bagels, fruit, and protein bars on Tuesday and Wednesday mornings thanks to GoDaddy. Coffee will be available all day thanks to Mozilla!

    Lunch on Wednesday will be provided at the venue thanks to Okta. We'll have a taco bar with gluten free, vegetarian and vegan options! Tuesday lunch will be on your own at any of the nearby food carts.

    Monday Pre-Party

    venue map

    On Monday evening, we'll be hosting a pre-party at Pine Street Market in downtown Portland starting at 5:30pm.

    We'll have drink tickets available thanks to Name.com, and anyone is welcome to come to the pre-party!

    Pine Street Market has a variety of food and drink options, including burgers and veggie burgers, ramen, pizzas, bratwursts and German pretzels as well as Salt & Straw ice cream, excellent coffee, smoothies and fantastic cocktails.

    Getting Around

    We'll be at the Eliot Center at 1226 SW Salmon St, in the Buchan Building. The doors open at 9:00am on Tuesday and Wednesday.

    Please note the entrance is on Salmon Street. If you enter from another street you'll be quickly lost inside the rest of the First Unitarian Church. The picture below shows what our entrance looks like. Look for either "1226", or "Buchan Building". We'll also have an IndieWeb Summit sign on the door.

    Eliot Center Buchan Building

    Public Transit

    There is a streetcar that runs North/South along 10th and 11th Avenues that will drop off a couple blocks from the venue.

    TriMet runs the buses and light rail here. The cheapest way to get into town from the airport is to take the MAX red line which picks up at the airport and drops off at Pioneer Square downtown. It's $2.50 for a 2.5 hour ticket or $5 for an all-day ticket.

    If your phone supports Apple Pay or Google Pay, you can just tap your phone to buy a TriMet or Streetcar ticket using the "Hop" readers at every stop! It's by far the easiest option. Make sure you tap your phone at the stop before boarding. You don't need to tap out when you get off. Otherwise, you can use a credit card or cash to buy a ticket at most MAX and Streetcar stops.

    Remote Participation

    We will have people participating remotely who couldn't come to Portland this year! The main room will be set up for remote participation with cameras and screen sharing, and all the sessions will be recorded. The two breakout rooms will have a camera setup for remote participation as well. More information and the relevant links are on the wiki.

    Related Events

    Donut.js

    On Tuesday night, Donut.js is having their monthly meetup featuring a fun collection of talks! Despite the name, only one of this month's talks is actually about JavaScript. The event starts at 6pm and there are three 15-minute talks. Donuts and La Croix are provided. Your IndieWeb Summit ticket will let you register for Donut.js for no additional cost! Just use the coupon code INDIEWEBDONUTS when you register at donutjs.club/tickets.

    Open Source Bridge

    Friday is the 10th and final year of Open Source Bridge. If you're still in town on Friday, this will be a great event to attend! You can read the Open Source Bridge blog for more details about what to expect. You will need to register separately for this event.

    Join the Chat

    If you haven't already, join our Slack room or IRC channel (they're connected) and introduce yourself! We'll be using the chat during the event to take notes, share links, and communicate with the remote participants.

    Code of Conduct

    As a reminder, we have a Code of Conduct that applies to IndieWeb spaces both online and offline.

    Thanks to our Supporters

    I'd like to give a shout-out to our sponsors who make all of this possible! A huge thanks to GoDaddy, Name.com, Okta, Mozilla, and Bridgy, as well as our monthly supporters and everyone who contributed specifically for this event! Also special thanks to our venue sponsor Stumptown Syndicate.

    That's all for now! Looking forward to seeing everyone on Monday!

    Portland, Oregon • 68°F
    1 like 1 reply
    #indieweb #indiewebsummit
    Sat, Jun 23, 2018 1:01pm -07:00
  • Improving the HTML type="url" Field

    Sun, Jun 3, 2018 7:50am -07:00

    Using the HTML <input type="url"> field is normally a good idea when you're asking the user to enter a URL. It doesn't make a huge difference on desktop browsers, but makes it a lot easier on mobile browsers. On iOS, when you've focused on a URL input field, the keyboard switches to a slightly different layout with different keys optimized for entering URLs.

    The URL type keyboard on iOS provides easy access to special characters used in URLs.

    This is great for things like web sign-in where you're asking the user to enter their domain name to sign in. However, for some reason, browsers have implemented URL validation a bit too strictly. If you don't enter a URL scheme, you'll see an error like the one below when you try to submit the form.

    This is pretty irritating because it forces the user to enter the URL scheme http:// or https://, which, ironically, on the special iOS URL keyboard requires tapping the "123" button just to reach the screen with the ":" character. It would be nice if the URL field accepted plain domain names and defaulted to http://.

    I wrote a bit of JavaScript that will prepend http:// to the value of any <input type="url"> form fields on blur or when the enter key is pressed.

    <script>
      /* add http:// to URL fields on blur or when enter is pressed */
      document.addEventListener('DOMContentLoaded', function() {
        function addDefaultScheme(target) {
          if(target.value.match(/^(?!https?:).+\..+/)) {
            target.value = "http://"+target.value;
          }
        }
        var elements = document.querySelectorAll("input[type=url]");
        Array.prototype.forEach.call(elements, function(el, i){
          el.addEventListener("blur", function(e){
            addDefaultScheme(e.target);
          });
          el.addEventListener("keydown", function(e){
            if(e.keyCode == 13) {
              addDefaultScheme(e.target);
            }
          });
        });
      });
    </script>
    

    I wish browsers would implement this themselves, but in the meantime you can use this bit of JS to provide a slightly better user experience when asking your users for URLs.

    Portland, Oregon • 56°F
    3 likes 2 replies
    #indieweb #websignin #indielogin
    Sun, Jun 3, 2018 7:50am -07:00
  • Dropping Twitter Support on IndieAuth.com

    Sun, May 27, 2018 5:01pm -07:00

    I've made the difficult decision to drop support for Twitter authentication on IndieAuth.com. Some time last week, Twitter rolled out a change to the website which broke how IndieAuth.com verifies that a website and Twitter account belong to the same person.

    Since I am already in the process of replacing IndieAuth.com with two new websites (lots of discussion on the wiki), it is not worth the effort to do what it would take to fix this for IndieAuth.com.

    What Changed on Twitter.com

    In order to verify that you are the person behind the URL you initially type in, IndieAuth.com checks your website to find a link to a Twitter profile, then checks that Twitter profile to see if it links back to your website. If there is a match, then you'll see the green button for Twitter on IndieAuth.com.

    Twitter rolled out a change that prevents normal HTTP requests from returning actual HTML on Twitter profiles. I'm assuming this is part of their effort to fight bots, but it's unfortunate this use case got caught up in that mess. If you visit your Twitter profile in a browser and click "view source", you'll see something like this now.

    This is a delightful bit of HTML that sets a cookie via JavaScript and then reloads the page. Presumably this happens so quickly that normally you won't notice it.

    Fetching a profile URL with curl now returns an empty HTTP body.

    Even if I go through the hoops to make IndieAuth.com set cookies and refresh the page, there's no guarantee that they won't just change this again next week. I don't like playing these games, so instead I am just shutting off Twitter support in IndieAuth.com.

    Replacing IndieAuth.com

    The new version that you'll eventually use to sign in to the IndieWeb wiki is called IndieLogin.com. It is currently in beta, and is not available to other developers, but you can try signing in to the test page there right now. This new version gets around this Twitter problem by not even attempting to fetch Twitter profile pages in the first place.

    The new login flow works like this:

    • You enter your website on IndieLogin.com
    • IndieLogin.com finds your Twitter profile by checking all rel=me links for one matching twitter.com
    • IndieLogin.com shows you a button to authenticate with Twitter immediately (rather than first checking that your Twitter profile links back)
    • After you authenticate on Twitter and are redirected back to IndieLogin.com, it fetches your Twitter profile from the Twitter API
    • If your Twitter profile as reported by the API includes the initial website you started with, then you're authenticated

    This avoids the problem because IndieLogin.com never tries to fetch your Twitter profile HTML. Instead, it uses the API directly. This does mean that you can get into a situation where IndieLogin.com may prompt you with a Twitter button that can fail (if you are logged in to a different Twitter account than the one your website links to). However, it also speeds up the initial login prompt since it doesn't have to go check Twitter before showing you the login button first.
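    The rel=me discovery in the first step of that flow can be sketched like this. It's a deliberately simplified illustration (a real implementation would use a proper HTML parser and handle relative URLs):

    ```javascript
    // Sketch: find the first rel="me" link on a page that points at twitter.com.
    // Assumes double-quoted attributes and absolute URLs for simplicity.
    function findTwitterRelMe(html) {
      const tags = html.match(/<a\b[^>]*>/g) || [];
      for (const tag of tags) {
        const rel = tag.match(/rel="([^"]*)"/);
        const href = tag.match(/href="([^"]*)"/);
        if (rel && href && rel[1].split(/\s+/).includes('me')
            && new URL(href[1]).hostname === 'twitter.com') {
          return href[1];
        }
      }
      return null;
    }
    ```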

    Hopefully I'll be able to launch IndieLogin.com soon so that the lack of Twitter support on IndieAuth.com isn't too annoying. In the meantime, you can authenticate via GitHub or email on IndieAuth.com.

    Portland, Oregon • 80°F
    1 like 3 replies 1 mention
    #indieauth #indielogin #twitter
    Sun, May 27, 2018 5:01pm -07:00
  • You're Invited to IndieWeb Summit!

    Sat, May 26, 2018 6:40pm -07:00

    IndieWeb Summit is soon, and is shaping up to be an exciting event! We're hosting IndieWeb Summit the same week as the (final) Open Source Bridge, in case you needed another reason to visit Portland! IndieWeb Summit will be Tuesday-Wednesday June 26-27th, with a pre-party the Monday evening before.

    If you're at all interested in taking back ownership of your online data, decentralizing the web, independent blogging, or any aspect of having a website, you should consider joining us for this event!

    As gRegorLove said so well:

    It’s a really friendly, collaborative group of people and it is always inspiring to see what people are making.

    You don’t need to be a programmer! In fact, I would love to see more non-programmers attending. We need writers, graphic artists, designers, UX engineers, and anybody that wants to reclaim some of their online presence with a personal website.

    Keynotes

    One of the distinguishing features of IndieWeb Summit, compared to the IndieWebCamp events we run in many other cities throughout the year, is that we begin day 1 with a few keynote presentations to help set the stage for the two days. This year we're featuring a few special guests during the keynotes.

    Manton Reece will give a talk about how Micro.blog works with open standards to encourage people to own their data while also making a service that is incredibly fun and easy to use.

    William Hertling, the author of Kill Process, a book that features the IndieWeb, will talk about his inspiration for writing the book and where he sees the future of the IndieWeb heading.

    We've been seeing some exciting progress with IndieWeb readers over the last few months, between my reader Monocle, Eddie's iOS app "Indigenous", and Jonathan and Grant's app "Together". We'll be sharing the latest developments along that front as well!

    Related Events

    In addition to IndieWeb Summit, the whole week will be a great lineup of events!

    • Monday, June 25th 5:30pm - Pre-summit meetup at Pine Street Market
    • Tuesday, June 26th 9am-5:30pm - IndieWeb Summit Day 1 - Keynotes and Discussions
    • Tuesday evening 6:30pm - Donut.js
    • Wednesday, June 27th 9am-5:30pm - IndieWeb Summit Day 2 - Create, Hack, Demos!
    • Friday, June 29th - Open Source Bridge unconference and party

    I hope to see you there! You can register now at 2018.indieweb.org!

    3 likes 1 repost 7 replies
    #indieweb #indiewebsummit #indiewebcamp
    Sat, May 26, 2018 6:40pm -07:00
  • An IndieWeb reader: My new home on the internet

    Fri, Apr 20, 2018 9:00am -07:00
    This article was originally posted on the GoDaddy blog.

    I have a new home on the internet. I don’t visit the Twitter home timeline or the Facebook news feed anymore. I don’t open the Instagram app except when I post a photo. I still have accounts there — I just don’t visit those sites anymore. Instead, I have my own new space on the internet where everything I’m interested in is consolidated, and I can read and reply to things from there. But before I go too far into my new online home — an IndieWeb reader — some background.

    The problem with algorithmic timelines

    It used to be the case that when you opened Twitter, you’d see every tweet from everyone you’re following, in order, with the newest at the top.

    Over the past few years, Facebook, Twitter, Instagram and many other services have switched to what’s known as an “algorithmic timeline,” meaning posts no longer show up in chronological order. Instead, these services use proprietary algorithms to decide what to show you and when.

    They now decide what content is more important for you to see, and even interject ads into your timelines.

    You could argue that they’ve done us a favor in one sense. I stopped being able to keep up with my chronological Twitter timeline long ago. But doesn’t it seem wrong that Twitter gets to be the one to decide what to show me? Plenty of people are upset about the new algorithmic timelines, even posting articles like “Instagram is actively ruining my life with its inhumane algorithm” and “14 Ways to Outsmart the Instagram Algorithm.”

    We clearly need a way to take back control of what we’re reading online.

    Evolving RSS readers

    You might remember that RSS readers did a pretty good job of giving individuals control over what they were subscribed to. However, over the years, Twitter, Instagram and many other social media platforms have shown us that people enjoy reading and sharing short-form content, not just blog posts.

    Twitter not only provides an easy way to post content online, it also provides a single place to read what everyone else has posted. More importantly, these sites also enable you to quickly respond to the things you’re following. Whether that’s clicking the heart icon to show your support of a post, or writing a reply to something as you’re reading it.

    RSS readers have failed to adapt to the changes in how we create and consume content online.

    They are largely stuck in the blogging era, being used to consume blogs and news sites. If we want to have any hope of the open web and independent websites replacing our own use of Twitter and Facebook, we need to be able to have experiences at least as good as we have on those services.

    Building better readers

    What if you could reply to a blog post in your feed reader, and your reply would show up as a comment on the original post automatically? What if you could click a “heart” in your reader, and the author of the post would see it? What if you had one place to go to follow not just your Twitter friends, but also all of your friends’ blogs, their microblogs, and see the pictures they’re sharing? What if you could have seamless conversations in your reader the way you have seamless conversations on Twitter today?

    These are the things the IndieWeb community and I have been making huge progress on in recent years.

    Here’s a screenshot of what my current IndieWeb reader looks like:

    My IndieWeb reader looks kind of like a combination of an RSS reader and a Twitter feed. An important difference between this and a traditional RSS reader is that this interface has buttons I can click to reply to posts!

    When I click Reply, the IndieWeb reader creates a post on my website, and notifies the person I’m replying to so their site can show it as a comment.

    Here’s my reply on my website:

    And here’s my reply showing up on the original post:

    I’m pretty happy with how my current IndieWeb reader is working right now! I’ve built it as a thin interface on top of a server-side API that handles all the feed fetching. In fact, there are alternative front-ends that work with the same server. When I’m on my phone, I can use a native iPhone app to see all the same content that I see on my computer.

    How the IndieWeb Reader works

    This separation between the reader interface and the server is critical to developing a new generation of readers. This is what lets us have the choice of using multiple different reader apps, all accessing the same data behind the scenes. This separation is documented in a spec called Microsub.

    The server side of the reader is software selected by the user. This might be built into their website CMS, or could be a separate service they sign up for. Typically, the server side won’t have much in the way of a user interface, likely just enough interface to subscribe to some feeds, but it doesn’t need to be able to display any of the content itself.

    Reader apps can then be built without needing to spend any time dealing with parsing different feed formats or worrying about having enough resources to poll all the feeds people are subscribed to.

    The reader apps become simple clients talking to the user’s feed fetching server.
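    As a rough sketch of what that client/server conversation looks like: the action and channel query parameters come from the Microsub spec, while the endpoint URL and token in any real call would come from the user's own setup (the function names here are illustrative).

    ```javascript
    // Sketch: build the Microsub request URL for fetching a channel's timeline.
    function timelineUrl(microsubEndpoint, channel = 'default') {
      const url = new URL(microsubEndpoint);
      url.searchParams.set('action', 'timeline');
      url.searchParams.set('channel', channel);
      return url.href;
    }

    // A reader app then fetches it with the user's access token
    // (requires Node 18+ or a browser for the built-in fetch).
    async function fetchTimeline(microsubEndpoint, token, channel) {
      const response = await fetch(timelineUrl(microsubEndpoint, channel), {
        headers: { Authorization: `Bearer ${token}` },
      });
      return response.json(); // e.g. { items: [...], paging: {...} }
    }
    ```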

    Check out my post, Building an IndieWeb Reader, for more details on how all the pieces fit together. If you’re building a reader, check out the Microsub spec to learn how you can participate in this growing ecosystem.

    What's next?

    This part of the IndieWeb ecosystem is still in the early stages. I would love to see more development of both the reader apps and also the backend servers! If you use a CMS, consider installing or writing a plugin to add support for Webmention, Microsub and Micropub. If you’re an app developer, this would be a great time to build new Micropub apps to help people post to their websites, or build new Microsub apps with interesting and unique interfaces.

    Another fun challenge I’m looking forward to tackling soon is the ability to post and follow private content using our websites. The OAuth 2.0 extension IndieAuth provides us a solid base to work from.

    As always, I’m happy to chat about any and all of this. It’s been a lot of fun already to build this all out and see it working! You can find me in the IndieWeb chat via IRC and Slack.

    This June, we’re hosting the annual IndieWeb Summit in Portland. IndieWeb Summit is for independent web creators of all kinds — from graphic artists to designers, UX engineers, coders and hackers — and is where we brainstorm and create lots of things like the IndieWeb reader. Head over to 2018.indieweb.org for more information and to register!

    15 likes 5 reposts 15 replies 11 mentions
    #indieweb #monocle #reader #microsub
    Fri, Apr 20, 2018 9:00am -07:00
  • A MetaWeblog to Micropub Gateway

    Mon, Apr 9, 2018 9:12am -07:00

    I’m always looking for fun and better ways to publish content to my website. There are several nice writing apps now, such as Byword for MacOS, which lets you write in Markdown and then converts it into HTML. Many of these kinds of apps have an option to publish to a Wordpress site, using Wordpress’ XML-RPC interface, which is more or less the MetaWeblog API.

    I thought it would be a fun experiment to try to set up my website to handle those XML-RPC calls so that I can use these apps, at least until they support Micropub natively.

    Problems with XML-RPC

    As with most XML formats, XML-RPC is ridiculously verbose for what it’s doing. In order to send a single string value, it requires wrapping it in a surprising amount of XML:

    <param>
      <value><string>aaronparecki.com</string></value>
    </param>
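    For comparison, in Micropub the same string is just one form-encoded parameter in the POST body. A quick JavaScript illustration (the h and content property names come from the Micropub spec; the value is the same string as above):

    ```javascript
    // The same string, sent the Micropub way: a single form-encoded
    // parameter in the POST body instead of nested XML elements.
    const body = new URLSearchParams({
      h: 'entry',
      content: 'aaronparecki.com',
    }).toString();

    console.log(body); // h=entry&content=aaronparecki.com
    ```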
    

    The MetaWeblog API also requires that you give apps your WordPress password, which has been a known anti-pattern for a long time. With things like OAuth, we have better ways of authenticating against APIs without sending passwords around.

    Micropub Bridge

    To avoid needing to add actual XML-RPC support to my website, I set up a bridge that translates XML-RPC calls to their equivalent Micropub calls. The bridge also obtains an IndieAuth access token so that I can use a token instead of entering my password into these apps.

    I didn’t bother adding any styling to the gateway since it’s something you’d only interact with once to set up, so apologies for the ugly screenshots that follow.

    Here’s what it looks like to connect Byword to be able to publish to my website.

    First I visit the gateway and log in.

    That does the IndieAuth discovery and takes me to my website where I grant it access.

    Then I’m redirected back to the gateway which provides instructions on what to do next.

    The key to making Byword find the XML-RPC API is adding that EditURI tag to my website. Then I can go into Byword and add a new WordPress site, entering my domain name.

    It then asks for my username and password, which I enter by copying from the gateway.

    Then Byword is all set and thinks it’s talking to a WordPress site!

    Now when I’m ready to publish this post, I click the “Publish” button in Byword, and fill in the title and tags.

    Open Source

    If you’d like to try this out, head to xmlrpc.p3k.io and connect your website!

    The source code is available on GitHub.

    I’ve only implemented the one newPost method that Byword uses when talking to the XML-RPC API. I’ve only tested this with Byword, so it’s very likely that other apps might be expecting more of the API to be implemented. Feel free to file issues if you have trouble using other apps! Eventually I’d like to implement more of the MetaWeblog API in this gateway, even if I really don’t like writing XML-RPC code!

    Portland, Oregon • 47°F
    8 likes 1 repost 4 replies
    #micropub #indieweb #xmlrpc #metaweblog
    Mon, Apr 9, 2018 9:12am -07:00
  • First Quarter 2018 in Review

    Sun, Apr 1, 2018 4:42pm -07:00

    January

    Events

    Went to Baltimore to help put on IndieWebCamp! It was a lot of fun, and I even added a couple fun things to my website during the second day.

    I also filmed the talks at the DonutJS meetup.

    Podcasts

    I managed to publish only one episode of my podcast, Percolator, just before heading to Baltimore.

    We launched applications for the StreamPDX Podcast Fellowship Program in January! We received way more applications than we expected!

    IndieWeb Projects

    We published the final version of WebSub, and the IndieAuth note, on w3.org! Thanks to the hard work of the Social Web Working Group for all their contributions!

    My Website

    I made several improvements to my website during January!

    • Updated my Life Stack post
    • Launched the collaborative pixel art on my home page
    • Added support for posting code snippets to my site to switch off of gist.github.com
    • Added a summary of my blog post archives so the archive pages are easier to use
    • Added meta tags so my site looks better when links are posted to Twitter/Facebook/Slack

    Other Stuff

    Finally got my OAuth 2.0 book launched for Kindle! It turns out that the Kindle requirements made it a lot more work than just uploading the existing ePub version.

    February

    Events

    Okta hosted Iterate 2018, where I went and had a great time chatting with people about OAuth and giving out copies of my book.

    Podcasts

    Again I managed to publish only one episode of Percolator during the month.

    The StreamPDX team reviewed all the applications to the fellowship program, and it was really tough to narrow them down! We ended up inviting a handful of people in for interviews, and chose 4 out of that group. I began working with them on their podcasts, making pretty good progress the first few weeks!

    I also taught the first Publishing your Podcast class of the season.

    IndieWeb Projects

    I made a lot of progress on my new IndieWeb reader during February! I wanted to get it in shape enough to use it during the conferences I was attending. I decided to split it into two parts, a Microsub server (Aperture) with no UI for viewing posts, and a separate client that has no storage backend of its own (Monocle).

    I added a minor feature to OwnYourGram, made some minor changes to XRay, and released an updated version of the PHP IndieAuth client.

    March

    Big news in March! I accepted a full-time job at Okta! I've been working with Okta for quite some time now, but always part time as a contractor. I've written up more about what I'll be doing at Okta on the Okta Developers blog!

    Events

    • Co-hosted the first Homebrew Microblog Meetup with Jean
    • Went to the PDXNode Hack Night and did a lightning talk about Monocle, my IndieWeb reader
    • Filmed the talks at DonutJS, but had some technical issues with the audio, so those videos aren't nearly as good this month

    IndieWeb Projects

    I made lots of progress on Monocle, getting it to a point where I now can use it every day as my primary home on the Internet. I wrote a blog post describing how everything works, Building an IndieWeb Reader.

    Thanks to the hard work of gRegor and Martijn, we were able to get a new release of the PHP Microformats parser out the door! This is now in use by Monocle, which should improve a lot of the feeds it's seeing.

    Podcasts

    Percolator is turning into a monthly podcast, as I managed to again get only one episode out during March.

    The StreamPDX fellowship program is continuing. I've been writing some music for one of the podcasts, which has been fun, but a lot of work.

    We brought the StreamPDX trailer to the Portland Art Museum to record audio during an event!

    I taught another session of Publishing Your Podcast.

    Other Stuff

    I finally set up an account at exist.io! I went through the list of all their supported integrations, and decided to customize a bunch of them.

    Since I post my notes and photos to my own website first, rather than directly to Twitter and Instagram, my website is the canonical source of my tweets and photos, as well as the responses I get from them. I was able to use the Exist API to take over writing those values, and now my Tweet/Instagram counts in Exist actually reflect the notes and photos I post to my own website.

    I noticed they also support tracking miles biked, so since my bike rides are already on my website, I set up a script to push that data to Exist!

    I also track other kinds of transport, and decided to use their custom tracking to visualize that per day. So now I can see at a glance which days I was on a bike, in a taxi, on a train, etc! It's pretty neat looking already, and I'm hoping the data can be used in some insights later!

    The Exist API even has a section for tracking money spent, although they don't integrate with YNAB (yet!). I got beta access to the YNAB API and was able to wire it up to report my spending from certain budget categories into Exist!

    Portland, Oregon • 43°F
    1 like 3 mentions
    #review
    Sun, Apr 1, 2018 4:42pm -07:00
  • php-mf2 v0.4.3: Optional HTML5 Support

    Thu, Mar 29, 2018 11:42am -07:00

    This release includes support for using an alternative HTML parser that understands HTML5 tags.

    v0.4.3

    The built-in HTML parser does not understand some HTML5 tags such as <article>, which causes issues when those tags are adjacent to elements that can be automatically closed, such as <p>. A simple example that is incorrectly parsed with a non-HTML5 parser is below.

    <div class="h-entry">
      <p class="p-name">Hello World
      <article class="e-content">The content of the blog post</article>
    </div>
    

    Without proper knowledge of HTML5 tags such as <article>, the contents of that tag end up inside the p-name in this example. Using an HTML5 parser will properly close the <p> tag and return the expected result.
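    For the example above, an HTML5-aware parse returns roughly the following microformats2 JSON (abbreviated; property values will vary with the exact markup):

    ```json
    {
      "items": [{
        "type": ["h-entry"],
        "properties": {
          "name": ["Hello World"],
          "content": [{
            "html": "The content of the blog post",
            "value": "The content of the blog post"
          }]
        }
      }]
    }
    ```

    With the non-HTML5 parser, the unclosed <p> swallows the <article>, so the p-name comes back as "Hello World The content of the blog post" instead.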

    The php-mf2 library does not automatically install the HTML5 parser, since it does not want to impose additional dependencies on your code base. If you wish to use the HTML5 parser, you can include it in your project explicitly, either by loading it manually or using composer:

    composer require masterminds/html5
    

    If this library is present, then the php-mf2 parser will use it when parsing HTML.
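    Nothing else changes in your code; here is a sketch of a typical call (the HTML and URL are hypothetical):

    ```php
    <?php
    require 'vendor/autoload.php';

    $html = '<div class="h-entry">'
          . '<p class="p-name">Hello World'
          . '<article class="e-content">The content of the blog post</article>'
          . '</div>';

    // If masterminds/html5 has been installed via composer, php-mf2 picks it
    // up automatically; there is no configuration flag to set.
    $mf = Mf2\parse($html, 'https://example.com/');

    // With the HTML5 parser present, the name is just "Hello World"
    print_r($mf['items'][0]['properties']['name']);
    ```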

    Portland, Oregon • 51°F
    #microformats
    Thu, Mar 29, 2018 11:42am -07:00
  • Building an IndieWeb Reader

    Mon, Mar 12, 2018 5:03pm -07:00

    Over the last several months, I've been slowly putting the pieces in place to be able to build a solid indieweb reader. Today, I feel like I finally have enough in place to consider this functional enough that I am now using it every day!

    One of the major missing pieces of the IndieWeb ecosystem has been having an integrated reading and posting experience that mirrors the ease with which it's possible to post and follow on Twitter and other silo apps. 

    We've seen a few attempts at indieweb readers over the past few years, but nothing has really taken off or stuck around. Even my own attempts at readers have fallen apart, both the previous iteration of Monocle in 2016, and my fork of selfoss in 2014. My suspicion has always been that we haven't seen many people building out this part of the ecosystem because it turns out there are a whole bunch of different parts to building a reader, many of which have no overlap in skillset: managing the subscription list, polling and fetching feeds, parsing feeds, data storage, rendering posts in a UI, providing inline action buttons to be able to reply and favorite posts, etc. 

    When I'm building out the UI components of a project, the last thing I want to have to think about is the invisible backend stuff like feed polling and parsing. Similarly when I'm tackling the problems with parsing and normalizing data from feeds, the last thing I am thinking about is Javascript button interactions. Not to mention that I am barely an iOS developer, so there's no way I'd be able to build out a full indieweb reader for iOS.

    In April 2017, I started outlining a spec that draws a hard line between these very different parts of building a reader, and called it Microsub (for subscribing), as a complement to Micropub (for publishing). The basic idea is to separate the feed subscriptions from the UI parts of building a reader. 

    I started working from the ground up on building out the various aspects I knew I would need in order to eventually end up with a fully functional reader. 

    The main interface of Monocle

    I based a lot of these design decisions around my previous experience in building a reader, as well as my documentation of how I use IRC to read content across the web.

    My goal is to use this as my primary online dashboard to follow all kinds of content, as well as to interact with the content without leaving the interface. 

    Channels

    The main organization of the reader is laid out in "channels". You can also think of these as "folders" if you want. Many feeds (or sources) can be added to a channel, and the posts are all combined into a single timeline view.

    Displaying Posts

    Monocle supports displaying a few different types of content in posts. It has native support for notes, articles, photos, multi-photos, videos, audio clips, checkins, favorites, reposts, and replies.

    Since I follow my Instagram feed in the reader, I wanted to have a good display for photos, especially when there are many photos attached to a single post. I ended up doing a simple custom layout when there are two or three photos: with three photos, the first photo appears larger on the left and the other two are stacked to the right. With four or more, they just start tiling as half-size photos.

    I wanted to be able to read full articles in the reader without jumping out to the site, but also didn't want to have to scroll endlessly when I'm just skimming headlines. So if an article has content that is too tall, it gets truncated with a "read more" link to expand it.

    If the post has an audio file, such as podcasts, then there is a simple HTML audio player inline!

    Post Actions

    Each post has a set of buttons to be able to respond to the post. The quickly accessible actions are "favorite", "repost", and "reply". The three dots opens up an expanded menu with some additional options, some of which I have not yet implemented.

    Currently it's possible to remove a post from a channel, and to open up a debug view showing the raw data behind the post. Eventually this will expand to include muting the author, blocking the author, or unfollowing the source the post came from.

    Replying

    Clicking the "reply" button drops down a little text box for posting a reply.

    There is a character counter just so that I have a sense of how long the post is. Since this is posting the reply back to my website, this interface has no idea what sort of character limits there are, so it's just a simple counter. When I click the "Reply" button, the app makes a Micropub post to my website to create the post there. 
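    Under the hood, this is a plain Micropub create request; a reply from the app looks roughly like this (token and URLs hypothetical):

    ```http
    POST /micropub HTTP/1.1
    Host: aaronparecki.com
    Authorization: Bearer xxxxxxxx
    Content-Type: application/x-www-form-urlencoded

    h=entry&in-reply-to=https://example.com/some-post&content=Nice+post!
    ```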

    My website already has all the logic for adding that to my replies feed and sending webmentions to the post I'm replying to. Since this post I replied to is on Micro.blog, and Micro.blog accepts webmentions, the post showed up there within a few seconds.

    The same workflow happens for favoriting and reposting things.

    Syndicating to GitHub

    My website also recognizes when the post I'm replying to is a Twitter or GitHub permalink, and will automatically syndicate my reply or favorite appropriately! Since I added my GitHub notifications to a channel, I can actually reply to GitHub issues directly from the interface!

    Replying to a GitHub issue comment from Monocle
    My reply to the GitHub issue on my website
    My reply automatically syndicated to GitHub

    Read-State Tracking

    You may have noticed the little blue dots next to the channel names in some previous screenshots. Those indicate how many unread posts are in the channel.

     

    However, some feeds that I follow end up with so many posts in their channels that the actual number of unread posts is no longer meaningful! All I really want to know is whether there is something new or not. To account for this, I can choose whether a channel uses per-item tracking, a simple boolean read/unread indicator, or no read tracking at all.

    When I'm looking at a timeline, any new posts appear with a yellow glow around them. As the post scrolls out of view, it gets marked as read, and that state is pushed to the server so that other clients will also know it's now read.
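    In Microsub terms, marking posts as read is just another timeline action; the request the client sends is roughly like this (server, token, and entry ID hypothetical):

    ```http
    POST /microsub HTTP/1.1
    Host: aperture.example.com
    Authorization: Bearer xxxxxxxx
    Content-Type: application/x-www-form-urlencoded

    action=timeline&method=mark_read&channel=home&last_read_entry=12345
    ```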

    I really enjoy not having to manually mark things as read, instead the interface just handles it all for me without any additional interaction.

    Multiple Website Support

    Since I actually have several different websites, I wanted the response buttons to be able to post not just to my website, but also to the other websites I have. For example, my cat Dora, who has her own website, is not always the best at using the computer, so sometimes I have to favorite things for her.

    I can choose an alternate default account per channel so that the response buttons will post to the alternate website. Notice Dora's cat face in the bottom right corner of the screen. This lets me know that responses to posts in this channel will be posted to this alternate account.

    I can even temporarily switch to a different account by clicking on the profile icon and choosing another account.

    Simple Posting Interface

    You may also have noticed the little pen icon in the lower left corner. Clicking that pops up a dialog for writing a new post from the selected account. I chose to keep this interface super simple, providing just a text box and character counter. 

    If I need to write something more complicated, such as including HTML content, adding a photo, or choosing where the post is syndicated, then I'll just pop over to Quill and write the post there instead.

    Multiple Apps

    I mentioned earlier that there were many parts to this, and I haven't talked much about that yet. The most important thing about the architecture of this system is that it is not just a single monolithic app. Instead, there is a server responsible for collecting all the data from the feeds I'm following, and separate apps for displaying them! Since I was documenting everything on the wiki as I was building this out, other people were able to jump in and start writing clients from the beginning! 

    There are already two other great interfaces that work with the same backend server! 

    Here is a Javascript app called Together, written by Grant, showing the same posts you saw in a previous screenshot.

    Here is the same content rendered by the iOS app, Indigenous, written by Eddie.

    I'm pretty thrilled that we've already had two people jump in and build readers so quickly, thanks to the hard work of feed fetching being abstracted away by the server!

    The Microsub Server

    Now to start getting into the technical bits of how this works. Feel free to skip this section if specs make your eyes glaze over.

    I mentioned before that the main separation going on here is splitting off the feed fetching and parsing from rendering the posts.

    This accomplishes a few things:

    • Enables app developers to focus on the UI aspects of building a reader
    • Allows you to choose which service you want to use to manage your subscriptions
    • Enables you to use many different reader apps all talking to the same server backend that you control
    • Leaves room for servers to do experimental things with feeds and subscriptions (think "magic" or "smart" feeds) without having to bother with the UI components or needing to get clients to add support

    The main idea behind this is the Microsub spec. This is the spec that the Microsub server implements so that clients know how they can talk to it. 

    Ideally there will eventually be a large ecosystem around the spec, with many clients to choose from, and many servers as well. We'll see some projects build in Microsub support natively, so that they work out of the box, and we'll also see some dedicated feed subscription services support Microsub. The nice thing about using your website identity to tie the pieces together is that you can choose your Microsub server separately from choosing the software that powers your website.

    For example, I decided early on when building my website that I didn't want to mix the idea of following feeds into the same software that powers my website. So instead, I wrote an external Microsub server called Aperture, which is responsible for all the feed polling and parsing and storing the posts in channels. Aperture is open source, although I still consider it "in active development", so I am not officially supporting it right now. You are of course welcome to get it running yourself, but be prepared for things to change quickly.

    (Aperture actually has two components: Watchtower, a microservice that polls feeds and delivers their contents to Aperture, and Aperture itself, which does the actual feed parsing. This allows me to scale the feed polling separately from the Microsub server.)

    If you want to try to get Aperture and Watchtower running, Daniel did a pretty great writeup of his experience getting things set up in two posts: Part 1 and Part 2.

    Ideally I would love to see some more implementations of Microsub servers, so head over to the spec if that's your thing!

    Monocle

    Monocle is the Microsub client I wrote that's featured in the screenshots above. It is also open source. Since it doesn't do any feed parsing itself, it doesn't even have a storage backend! Everything is fetched on the fly, with the exception of the channel list and Micropub config, which are persisted in the session data. 

    When I click on a channel to view it, Monocle first makes a Microsub timeline request to Aperture to fetch the channel data, then renders it in the timeline view. This is analogous to the iOS app Indigenous fetching the timeline data from the Microsub server then rendering it on the phone, except Monocle is doing that server-side to generate HTML for the browser.
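    A Microsub timeline request is a simple authenticated GET, and the response is a list of jf2 entries with paging cursors. Roughly (server URL, channel, and values hypothetical):

    ```http
    GET /microsub?action=timeline&channel=indieweb HTTP/1.1
    Host: aperture.example.com
    Authorization: Bearer xxxxxxxx

    {
      "items": [
        { "type": "entry", "name": "...", "url": "...", "published": "..." }
      ],
      "paging": { "after": "xxxx", "before": "xxxx" }
    }
    ```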

    You might think I'm crazy for having written a PHP app that fetches JSON from an API and then renders --gasp-- static HTML in 2018, but guess what -- it's fast!

    Monocle is open source, but I am also hosting a version online that anyone is welcome to use at monocle.p3k.io. Since it doesn't actually store anything itself, I don't expect it to take up any significant resources any time soon! Of course in order to use it, you'll need to have your website pointing to a Microsub server of your choosing. Since that's where all the actual work is done, I am not making my hosted version of Aperture available for general signups right now. You'll need to either get that running on your own server, or build a Microsub server from scratch!

    Putting the Pieces Together

    This last section has been a bit of a wall of text, so here is a diagram showing how all the pieces fit together to make this possible!

    My website contains the IndieAuth and Micropub bits, but others have chosen to use external services for those as well. I've also chosen to outsource sending and receiving webmentions to external services, whereas other people end up handling those within their own website code as well.

    The reader apps all talk to the Microsub server that I've linked to in order to view posts, and when I tap a "like" button or write a reply from the app, they post that to my Micropub server to create a new post.

    I use webmention.io to handle my incoming webmentions, and it is configured to send posts to a channel in Aperture using Aperture's Micropub API.
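    Delivering a post into an Aperture channel this way is an ordinary Micropub request against the channel's endpoint; a sketch of what such a service might send (endpoint path, token, and values hypothetical):

    ```http
    POST /micropub HTTP/1.1
    Host: aperture.example.com
    Authorization: Bearer <channel token>
    Content-Type: application/json

    {
      "type": ["h-entry"],
      "properties": {
        "content": ["Nice post!"],
        "in-reply-to": ["https://example.com/some-post"]
      }
    }
    ```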

    Further Reading

    This has been a very long read, so congrats if you've made it this far! Here are some links if you're curious about how you can start building out various pieces of the ecosystem as well!

    Specs

    • Microsub - a standardized way for apps to consume and interact with feeds collected by a server
    • Micropub - a W3C Recommendation for creating, updating and deleting posts using external apps
    • Webmention - a W3C Recommendation enabling cross-site comments and other interactions
    • IndieAuth - an OAuth 2.0 extension that enables you to authorize third-party apps to talk to your Micropub or Microsub servers
    • mp-destination - a Micropub extension allowing a server to designate alternate destinations for creating posts

    Open Source Projects

    • Indigenous - an iOS Microsub client
    • Together - a React JS Microsub client
    • Monocle - a server-side PHP Microsub client
    • Aperture - a PHP Microsub server
    • Watchtower - a feed fetching microservice

    Future Work

    While this is all a good start, and I do actually use this as my primary online home now, there is still a lot more work to do!

    • More Microsub servers! I want to see at least two more solid Microsub server implementations in the relatively near future! That will help develop the spec further and ensure we're actually building interoperable tools. I suspect one of the implementations will end up being part of an integrated CMS such as Known or WordPress, or will be a proxy to an existing feed reader service.
    • Following private content. Private content has always been a challenge, mainly due to the fact that any time authentication is involved it complicates things a lot. With IndieAuth finally written up as a spec, we now have a solid building block to use to experiment in this area more.
    • An Android Microsub app. While both Monocle and Together work pretty well on mobile browsers, there are still many advantages to having a native Android app! And it sounds like one is already in the works.
    • Better UI for actually following people. I've kind of taken a shortcut on this front in order to move the rest forward. Right now, you still need to type someone's URL into a Microsub app in order to follow them. There are many challenges with streamlining this process further.

    As always, I'm happy to chat about any and all of this! It's been a lot of fun already to build this all out and see it working! You can find me in the IndieWeb chat via IRC and Slack, if you send me a Webmention I'll see it in my reader, or find me at an upcoming IndieWeb event!

    Portland, Oregon • 67°F
    54 likes 10 reposts 2 bookmarks 28 replies 29 mentions 1 RSVP
    #indieweb #monocle #aperture #microsub #micropub #watchtower #reader
    Mon, Mar 12, 2018 5:03pm -07:00
  • IndieAuth-Client-PHP 0.3.1

    Wed, Feb 7, 2018 11:30am -08:00

    This release includes two new methods for quickly developing an IndieAuth client.

    The library can now handle all the boilerplate work of generating a state parameter, URL canonicalization, and state management between the request and callback.

    Developing an IndieAuth client now requires just setting a few configuration variables and deciding how to show error messages in your application. See the code below for an example of using the new features!

    index.php

    <form action="/login.php" method="post">
      <input type="url" name="url">
      <input type="submit" value="Log In">
    </form>
    

    login.php

    <?php
    require('vendor/autoload.php');
    if(!isset($_POST['url'])) {
      die('Missing URL');
    }
    
    // Start a session for the library to be able to save state between requests.
    session_start();
    
    // You'll need to set up two pieces of information before you can use the client,
    // the client ID and the redirect URL.
    // The client ID should be the home page of your app.
    IndieAuth\Client::$clientID = 'https://example.com/client/';
    
    // The redirect URL is where the user will be returned to after they approve the request.
    IndieAuth\Client::$redirectURL = 'https://example.com/client/redirect.php';
    
    // Pass the user's URL and your requested scope to the client.
    // If you are writing a Micropub client, you should include at least the "create" scope.
    // If you are just trying to log the user in, you can omit the second parameter.
    list($authorizationURL, $error) = IndieAuth\Client::begin($_POST['url'], 'create');
    // or, to log in only: list($authorizationURL, $error) = IndieAuth\Client::begin($_POST['url']);
    
    // Check whether the library was able to discover the necessary endpoints
    if($error) {
      echo "<p>Error: ".$error['error']."</p>";
      echo "<p>".$error['error_description']."</p>";
    } else {
      // Redirect the user to their authorization endpoint
      header('Location: '.$authorizationURL);
    }
    

    redirect.php

    <?php
    require('vendor/autoload.php');
    
    session_start();
    IndieAuth\Client::$clientID = 'https://example.com/client/';
    IndieAuth\Client::$redirectURL = 'https://example.com/client/redirect.php';
    
    list($user, $error) = IndieAuth\Client::complete($_GET);
    
    if($error) {
      echo "<p>Error: ".$error['error']."</p>";
      echo "<p>".$error['error_description']."</p>";
    } else {
      // Login succeeded!
      // If you requested a scope, then there will be an access token in the response.
      // Otherwise there will just be the user's URL.
      echo "URL: ".$user['me']."<br>";
      if(isset($user['access_token'])) {
        echo "Access Token: ".$user['access_token']."<br>";
        echo "Scope: ".$user['scope']."<br>";
      }
    }
    
    Portland, Oregon • 55°F
    2 replies 1 mention
    #indieauth #indieweb
    Wed, Feb 7, 2018 11:30am -08:00
  • OwnYourGram Updates

    Mon, Feb 5, 2018 10:12am -08:00

    Just pushed a few changes to OwnYourGram this morning! Here's what changed:

    You can now see the schedule of your account so you know when to expect photos! You'll see which tier you're at, and how long until your account is next polled.

    You can now add a list of tags that will be sent along with every photo! This is useful if you want all your photos to be tagged "photo" or "instagram" for example.

    I also updated the documentation to include an example of the JSON post format that OwnYourGram can send. 

    If your site has a Media Endpoint, then OwnYourGram will first upload your photos to your Media Endpoint, and include those URLs in the Micropub request. 

    That's it for now! Happy OwnYourGramming!

    Portland, Oregon • 49°F
    3 likes 1 mention
    #ownyourgram #indieweb #changelog
    Mon, Feb 5, 2018 10:12am -08:00
  • WebSub and IndieAuth Published on w3.org!

    Tue, Jan 23, 2018 6:28pm -08:00

    Today, we published the last two of the W3C specs I've been editing! WebSub was published as a W3C Recommendation, and IndieAuth was published as a Working Group Note.

    WebSub

    WebSub is a standardized way for publishers to notify subscribers when new content is available. It was formerly known as PubSubHubbub, which was hard to say, so I'm glad we renamed it. 

    https://www.w3.org/TR/websub/

    One of our goals with WebSub was to ensure that existing PubSubHubbub implementations would still be compliant, so there are already lots of implementations in the wild! If you're publishing content online, want to receive realtime updates when feeds are updated, or are building a tool to facilitate either of these, WebSub is a great fit! You can use the tool at websub.rocks while building your implementation to get immediate feedback!
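    The publisher side of WebSub is tiny: advertise a hub and the topic's canonical URL via link relations, either as HTTP headers or <link> tags. For example (hub and feed URLs hypothetical):

    ```http
    Link: <https://hub.example.com/>; rel="hub"
    Link: <https://example.com/feed.xml>; rel="self"
    ```

    Subscribers discover the hub from these links and then subscribe there to receive pushes whenever the feed updates.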

    Thanks to everyone who contributed to the spec, and especially my co-editor Julien!

    IndieAuth

    IndieAuth is an identity layer on top of OAuth 2.0, and used by Micropub clients. Micropub was published as a W3C Recommendation in May of last year. Today, IndieAuth was published as a W3C Working Group Note. 

    https://www.w3.org/TR/indieauth/

    The Social Web Working Group did not set out with the goal to standardize any sort of authentication mechanism, but since almost all of the Micropub implementations already supported the same mechanism, we decided to publish a "Note" to that effect. (The Micropub implementations that don't use IndieAuth use hard-coded tokens as a shortcut.) Notes are quite different from Recommendations in the eyes of the W3C, as described by this sentence: "The publication of a NOTE by the Consortium implies no endorsement of any kind." The goal of publishing this Note was to capture the current state of interoperable implementations.

    One of the things I like most about the W3C standardization process is that specs are published after they describe things that are working, rather than published as an aspirational blueprint. We kind of pushed that definition to an extreme with IndieAuth, since there have been live interoperable IndieAuth implementations for several years now. Previously, I had written up several guides for how to implement the various roles in the IndieAuth flow, but never written it down as its own spec. The guides were certainly useful, as was clearly demonstrated by the fact that people were following them to build out various parts of the ecosystem. But there is also a need for a spec to lay things out and remove any ambiguities along the way.

    Thanks to everyone who helped iron out the details of the language in the spec! We made a lot of good progress over the last few months!

    What's Next?

    So with these two specs published today, we've taken quite a lot of the IndieWeb building blocks through the W3C process!

    • Webmention - enables direct site-to-site commenting and other interactions
    • Post Type Discovery - once you receive a Webmention, this algorithm helps your site know what to do with the contents
    • Micropub - enables apps to create content on a website
    • WebSub - enables real-time subscriptions to web pages and feeds
    • IndieAuth - a way to log in to sites with your domain name, and allow Micropub apps to post to your site
    • jf2 - a post serialization format used by some Webmention services and other tools

    So here are a few specs and tools that I'm working on in the immediate future:

    IndieAuth test suite - Like webmention.rocks, micropub.rocks and websub.rocks, I plan to make a tool that will help test your IndieAuth implementation. It will do things like throw tricky situations at your client to ensure you're handling the edge cases properly.

    Finally finish renaming indieauth.com - I never should have called it that, since it's doing two completely separate things. You can read about the details here.

    Microsub - Microsub is currently an early draft spec. The goal of Microsub is to do for reader interfaces what Micropub did for publishing interfaces. A Microsub server provides a standardized API that reader clients can use to show content. This will help make developing IndieWeb readers a lot easier, and also allows you to keep your subscription list in a server that you control rather than letting the reader own the list.

    Monocle - Monocle is my Microsub implementation. It subscribes to feeds and presents them as a Microsub server so that I can use any Microsub client to view everything. There's still a lot of work ahead for me here, but it's my goal to finally stop using IRC as a reader by getting Monocle to the point of being functional enough to cover all my use cases.

    As always, you can help me and the rest of the IndieWeb out by adding support for any of these specs on your own website! We are always excited to welcome new people to the IndieWeb chat if you have any questions!

    Portland, Oregon • 47°F
    22 likes 23 reposts 1 bookmark 7 replies 4 mentions
    #websub #indieauth #w3c #standards
    Tue, Jan 23, 2018 6:28pm -08:00
  • Pixel Art!

    Sun, Jan 21, 2018 2:11pm -05:00

    I just finished my IndieWebCamp hack day project, and I'm pretty excited about it!

    A long time ago, my website used to have this 7x7 grid of pixels on the home page, which visitors could toggle between blue and green. It saved the state after you'd click them, so you could leave little pictures for the rest of my website visitors. 

    I eventually abandoned that version of my site, and that feature disappeared as well. I decided that it would be fun to add it back to my current website today!

    So now, my home page has a similar section at the top with a little grid of pixels again!

    There were a few differences in my approach this time around. I decided to make the grid 20 pixels wide by 3 pixels tall, in order to reduce the chances of people being able to spell things or draw anything inappropriate. 

    I wanted the grid to be responsive as well, so that the cells shrink appropriately when the width of the column shrinks. I found this nice answer on StackOverflow, "Grid of Responsive Squares", which pointed me at a technique I hadn't known about: using a percentage for the padding-bottom property. Each cell in my grid is calc(5% - 1px) wide, with padding-bottom: calc(5% - 1px) as well. This makes the height match the width, which is based on the relative size of the container.
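    As a sketch, the relevant CSS for each cell looks something like this (class name hypothetical):

    ```css
    /* 20 columns: each cell is 5% of the container width, minus 1px of gap */
    .pixel {
      display: inline-block;
      width: calc(5% - 1px);
      height: 0;
      /* percentage padding is computed from the container's WIDTH,
         so this makes each cell as tall as it is wide */
      padding-bottom: calc(5% - 1px);
    }
    ```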

    I also made the grid realtime! If you open the home page in two browsers, you'll see one browser update when you click a pixel in the other! I was able to do this without any complicated server-side support thanks to the nginx push-stream module that I already have installed. It lets a browser subscribe to an endpoint using the EventSource API, and then from my server I can send a POST request to the nginx module to broadcast data to anyone listening.
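    On the nginx side, the push-stream module only needs a subscriber and a publisher location; a minimal sketch, assuming the module is compiled in and the paths are named like this:

    ```nginx
    # Browsers subscribe here with EventSource (e.g. GET /sub/pixels)
    location ~ /sub/(.*) {
        push_stream_subscriber eventsource;
        push_stream_channels_path $1;
    }

    # The server broadcasts by POSTing the payload here (e.g. POST /pub?id=pixels)
    location = /pub {
        push_stream_publisher admin;
        push_stream_channels_path $arg_id;
    }
    ```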

    Maybe my next project will be to get some Neopixels and make a little thing for my desk that always shows the current pattern!

    Baltimore, Maryland • 55°F
    1 like 2 mentions
    #indiewebcamp #pixel #art #p3k
    Sun, Jan 21, 2018 2:11pm -05:00
  • OAuth 2.0 Simplified Is Now Available On Kindle!

    Tue, Jan 9, 2018 9:30am -08:00

    OAuth 2.0 Simplified is now available on Kindle!

    I know you've been waiting for the Kindle version, and I'm happy to say it's finally available!

    Buy for Kindle

    While the ePub and PDF have been available for a while, it took a bit more work than I initially thought to prepare the Kindle version.

    This version is formatted specifically for Kindle, so that you can browse the table of contents properly, as well as highlight and share sections of the content.

    Of course, if the Kindle isn't your thing, you can always get the PDF or ePub versions as well! The print edition is also available on Amazon now!

    Portland, Oregon • 42°F
    8 likes 3 reposts 1 mention
    #oauth2simplified #oauth2 #kindle #oauth
    Tue, Jan 9, 2018 9:30am -08:00
  • Owning my Code Snippets

    Sat, Jan 6, 2018 2:49pm -08:00

    It's very convenient to be able to quickly share a link to a code snippet, which is something I've mostly done using gist.github.com. While the snippets are often throwaway, I still feel bad posting them in a place I don't control.

    Today I added support to my site for natively posting code snippets. It wasn't nearly as hard as I'd imagined it would be! Since my site doesn't have a posting interface of its own, that also meant adding support to Quill to create these posts via Micropub.

    Since this is still relatively experimental, I didn't add a button to it in Quill, but the page is live.

    The only required field is the code box. The interface lets you optionally set a filename, which automatically sets the language if it matches a list of known languages. The language is included so my site knows which syntax highlighter to use. Here's what that ends up looking like on my site.
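    Since these posts are created via Micropub, the request Quill sends is an ordinary authenticated POST. One plausible shape, sketched below with the body line-wrapped for readability; note that Micropub's core vocabulary covers h, name, content, and category, so any property for the language would be a site-specific extension, and the one shown here is made up:

    ```http
    POST /micropub HTTP/1.1
    Host: aaronparecki.com
    Authorization: Bearer XXXXXXXX
    Content-Type: application/x-www-form-urlencoded

    h=entry
    &name=example.php
    &content=<?php echo "hello"; ?>
    &category[]=code
    ```

    Here the optional filename travels as the post name, which also fits the fixed-width styling of the post title described below.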

    I use the GeSHi syntax highlighter, which covers a wide range of languages and works very well, but I'm not super happy with the colors it uses. I'd like to find a stylesheet or even a different syntax highlighter that looks closer to the colors on GitHub. In the meantime, I'm happy enough with this to use it.

    I ended up doing quite a bit of fiddling with my CSS to make these posts look good. You'll notice the grey background of the code block extends to the edges of the post, whereas normally there would be some padding. I also changed the font of the post name to a fixed-width font. If there is no filename, then I wanted the grey background to extend to the top of the frame, which is also something unique to these posts.

    The last step in making this actually useful is integrating it into my browser workflow, including editing the posts easily. There is a bookmarklet that will either open up a new posting window or launch the editing interface, depending on whether I'm looking at an existing code post.
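    The decision the bookmarklet makes could be sketched like this; the URL pattern for recognizing a code post and the Quill endpoints are assumptions for illustration, not the real ones:

    ```javascript
    // Hypothetical sketch of the bookmarklet's decision logic.
    // Assumes code posts are identifiable by a "/code/" path segment,
    // and that Quill exposes /new/code and /edit endpoints.
    function snippetAction(currentUrl) {
      if (/^https:\/\/aaronparecki\.com\/.*\/code\//.test(currentUrl)) {
        // Looking at an existing code post: open the editing interface.
        return 'https://quill.p3k.io/edit?url=' + encodeURIComponent(currentUrl);
      }
      // Otherwise: open a fresh posting window.
      return 'https://quill.p3k.io/new/code';
    }

    // Bookmarklet wrapper (collapsed to one line in practice):
    // javascript:window.open(snippetAction(location.href))
    ```

    Keeping the logic in a plain function like this makes it easy to test outside the browser before minifying it into the bookmarklet.
    
    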

    I'm pretty happy with this, and I hope I don't post on GitHub anymore unless I'm specifically using the Gist fork feature!

    Portland, Oregon • 49°F
    1 reply 2 mentions
    #p3k #indieweb #code
    Sat, Jan 6, 2018 2:49pm -08:00

Hi, I'm Aaron Parecki, co-founder of IndieWebCamp. I maintain oauth.net, write and consult about OAuth, and am the editor of several W3C specifications. I record videos for local conferences and help run a podcast studio in Portland.

I wrote 100 songs in 100 days! I've been tracking my location since 2008, and write down everything I eat and drink. I've spoken at conferences around the world about owning your data, OAuth, quantified self, and explained why R is a vowel. Read more.

© 1999-2018 by Aaron Parecki. Powered by p3k. This site supports Webmention.
Except where otherwise noted, text content on this site is licensed under a Creative Commons Attribution 3.0 License.