
Aaron Parecki

  • Eclipse 2017 Day Trip

    Tue, Aug 15, 2017 4:24pm -07:00

    Note: I will be keeping this post up to date based on the latest plans.

    From Hollywood: 4:15am

    Bike 3.7mi (~30m) to Gateway TC, arrive by 4:45.

    From Gateway, attempt to get on any of the following Green Line trains to Clackamas TC: 4:56, 5:19, 5:31, 5:44, 5:59

    • If all trains are full, bike to Oregon City (14.3mi, 1h 45m) 6:00am-7:45am goo.gl/maps/FYUDCiRcu442

    If all goes well catching the train, will arrive at Clackamas TC before 6:45

    Clackamas TC to Oregon City: 7.5mi ~1 hour

    • goo.gl/maps/vZEnXjvwhLF2 (warning: Google does not know about 205 path all the way)
    • ridewithgps.com/routes/24258104

    Oregon City

    Arrival time in Oregon City will be between 6:20 and 7:20 by train+bike, or 7:45 by bike. Wait in Oregon City for people to meet up.

    • Singer Hill Cafe opens at 6:30

    From Oregon City: 7:45am

    Leave Oregon City before 7:45am. It will be about 14 miles to the riverbank, ~1h 45m. There are two options, either Hwy 213 or back roads.


    Back Roads (14 miles)

    Most of the route is a single lane carrying traffic in both directions with no shoulder, and the roads are quite winding. There is almost no reason cars would take this route.

    • Summary: goo.gl/maps/c1N9R2LfgRU2
    • Details: ridewithgps.com/routes/24258326

    Hwy 213 (12 miles)

    This is the route advertised on biketotheeclipse.com. It is very straight, has a shoulder almost the entire way, and has slightly less elevation gain. It is likely to have a lot of car traffic since it's the main route from Oregon City to Molalla. However, there is something to be said for riding alongside other bikers.

    • Summary: goo.gl/maps/1fpEcWumFr82
    • Details: ridewithgps.com/routes/24391698

    The two routes meet up for the last mile and a half at this intersection.

    The Destination

    Arrive at the riverbank of Wagonwheel Park at 9:30am, during the partial eclipse. There will still be 45 minutes of partial eclipse until the peak.


    Eclipse Times

    • Start of partial: 09:06:02
    • Start of total: 10:18:40
    • Peak: 10:19:01
    • End of total: 10:19:25
    • End of partial: 11:38:37
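    The durations implied by these times can be checked directly; a quick sketch (times hardcoded from the list above):

```python
from datetime import datetime

fmt = "%H:%M:%S"
partial_start = datetime.strptime("09:06:02", fmt)
total_start = datetime.strptime("10:18:40", fmt)
total_end = datetime.strptime("10:19:25", fmt)
partial_end = datetime.strptime("11:38:37", fmt)

# Totality is very short at this location; the partial phase spans hours.
totality = (total_end - total_start).total_seconds()
partial = (partial_end - partial_start).total_seconds()

print(f"Totality lasts {totality:.0f} seconds")       # 45 seconds
print(f"Partial phase spans {partial / 60:.0f} minutes")
```

    So arriving at 9:30am leaves a comfortable margin before the 45 seconds of totality.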
    Portland, Oregon
    18 likes 6 replies 1 mention
    #eclipse
  • Seriously Simple Podcasting: Including all Episodes in the Podcast RSS Feed

    Mon, Jul 24, 2017 8:42pm -07:00

    Seriously Simple Podcasting is a great WordPress plugin for hosting your own podcast. However, the number of episodes it includes in the RSS feed is tied to the number of items you include in all your other RSS feeds as well.

    If you want your podcast listeners to be able to find old episodes, you'll need to make sure all of your episodes appear in the feed all the time, since some clients like iTunes treat your RSS feed as authoritative and will delete episodes from their directory when they disappear from the feed.

    This podcast has almost 30 episodes, not just 10!

    Tonight I took a plunge into the WordPress plugin to see why it was only including the last 10 posts. It turns out the plugin uses the same number of posts as all your other RSS feeds, but it also provides a hook to override that. Why they don't expose that in the interface, I don't know.

    So, rather than making all your RSS feeds show all of your posts (which will slow down your site and use a lot more of your server's bandwidth), I wrote a plugin that overrides the default setting!

    Download this plugin to your plugins folder, and enable it in the WordPress panel! You can download the zip linked below, or just copy and paste this super tiny amount of code into a file in your plugins folder.

    Download Zip
    <?php
    /*
    Plugin Name: Include All Podcast Episodes in RSS Feed
    Plugin URI: https://aaronparecki.com/ssp-include-all-episodes-in-rss-feed
    Description: This plugin enables displaying all podcast episodes in the Seriously Simple Podcast feed
    Version: 1.0
    Author: Aaron Parecki
    Author URI: https://aaronparecki.com
    License: CC0
    */
    
    add_filter('ssp_feed_number_of_posts', 'ssp_modify_number_of_posts_in_feed');
    
    function ssp_modify_number_of_posts_in_feed( $n ) {
      // Ignore the site-wide feed limit and include up to 10,000 episodes.
      return 10000;
    }
    

    Note that after you install the plugin, you'll have to change something about your podcast (upload a new episode, change the description of an episode, whatever) in order to get the plugin to regenerate the RSS feed since it seems to cache it.

    Portland, Oregon
    anomalily.net
    2 likes
    #wordpress #podcast
  • IndieWeb Summit 2017 Attendees

    IndieWeb Summit 2017 Wrap-Up!

    Fri, Jun 30, 2017 1:13pm -07:00

    Thanks to everyone for coming to IndieWeb Summit 2017! We had a fantastic and productive weekend!

    This year was the best documented IndieWeb event yet! Thanks to everyone who contributed to documenting the sessions and demos!

    Saturday

    We started off with a few keynotes, videos of which you can find below:

    • What is the IndieWeb (6min) - Aaron Parecki
    • State of the IndieWeb (15min) - Tantek Çelik
    • A Brief History of My Website (19min) - Lillian Karabaic
    • Indie Map (26min) - Ryan Barrett

    We then continued with everyone in the room introducing themselves in under a minute each, and demoing something on their website. (Video 32min)

    We had a variety of unconference sessions on Saturday, all of which have video archives as well as thorough notes! You can find the sessions and videos linked from the Saturday schedule page!

    Sunday

    On Sunday we started with two workshops: 

    • Intro to IndieWeb Building Blocks, a quick overview of Microformats, Webmention, WebSub and Micropub
    • IndieWebifying your WordPress

    After an afternoon of creating things, we ended the day with demos! We even have a full transcript and video of the demos!

    Photos and Blog Posts

    You can also find more photos from the weekend, and a few people have already written some great wrap-up posts on their websites!

    • IndieWeb Summit 2017 Recap by Jonathan LaCour
    • Martijn van der Ven
    • IndieWeb Appreciation by Gregor Morrill

    More IndieWeb Summit wrap-up posts will be posted on IndieNews as they happen!

    Thanks to our Supporters

    I'd like to give a huge thanks to all of our supporters who make this possible! Mozilla, Bridgy, Donut.js, CSVConf, as well as our monthly supporters on our Open Collective page, and everyone who contributed specifically for this event during registration!

    Stay in Touch!

    If you haven't yet, join our Slack room or IRC channel (they're connected) and introduce yourself! We stay in touch year-round to continue the excitement and the work we're doing, as well as organize future events!

    Speaking of future events, we have more events coming up!

    • Homebrew Website Club on July 12 in Berlin, Nürnberg, Brighton, London, Baltimore, Bellingham, San Francisco and Portland
    • Check our Events page for future Homebrew Website Club meetups as well! They happen every 2-4 weeks in several cities!
    • IndieWebCamp Berlin - November 4th and 5th
    Portland, Oregon
    6 likes 1 mention
    #indieweb #indiewebcamp #iws2017
  • Planning for Data Access when Choosing a QS Tool

    Tue, Jun 27, 2017 11:29am -07:00

    Below are notes from my workshop at the Quantified Self 2017 conference in Amsterdam. Thanks to everyone who participated in the workshop!


    These are some criteria I use when determining whether a tool will actually be useful to me, and whether I want to invest money and time in the product.

    Effort vs Value

    How much active effort does it take to use? On a scale of:

    • Completely automatic
      • FitBit is always on, tracks sleep automatically
    • Easily accessible
      • Jawbone UP requires a button press to track sleep, but is always on my wrist so it's easy
      • Withings scale requires no additional effort to upload/sync after stepping on the scale
    • App on a phone
      • Requires extra work to unlock the phone and find the app
      • e.g. Sleep Cycle app requires launching the app and setting the phone on your bed
    • Manual data entry
      • e.g. in a paper notebook, or manually adding to Google Sheets

    Once you determine the level of effort the tool requires, ask yourself whether you are willing to put in that effort. Sometimes the value of a tool is high enough that you're willing to go to great lengths to use the tool.

    Syncing

    How does the device sync? Does it require that a company run servers? Does the device download directly to your own computer?

    For example, the Withings scale only talks to the Withings servers. You can download the data from their servers later.

    The Eye-Fi card can download directly from the card to your computer, which doesn't require a third party service to be running in order for the device to work.

    Exportability

    What kind of export options are available from the tool? If you anticipate the service shutting down at some point, or plan to move to a new tool later, you'll want to choose a tool that ranks higher on this list.

    • Full data portability - The tool provides a complete export and import, and you can expect to be able to import the data into another tool later. Almost no services support this, likely because there aren't very good standards defined for what the raw data should look like.
    • API to export data continuously - The tool provides an API that exposes the data and can be used to connect with things like IFTTT.
    • Export in machine-readable format - The tool does not provide an API, but does allow manual exporting in a machine-readable format. Things like exporting CSV files, JSON files, etc. This is the minimum bar a service must hit before I will consider using it.
    • Export in non-machine-readable format - The tool provides an export, but it is in a format that is not machine readable, e.g. PDFs or images.
    • No data export - The tool does not support exporting data at all.

    How does the tool make money?

    There are several ways tools make money, and you should be aware of the profit motive of any tool you use. There is no "right answer" and no "wrong answer" for whether you should use a tool based on this, but it's important to be aware of.

    Did you pay just one time to buy a device?

    • Things like FitBit charge far more for the device than it costs to manufacture, because they don't charge you for membership later.
    • If you've paid only once but continue to cost the company money because they have to maintain servers for the device to work, then the company will likely try to sell you a new device in order to continue making money from you.
    • Customer acquisition is often a huge cost to companies, and if each customer only pays once (how many wifi scales do you really need?), then each customer is not going to be very profitable, especially if the customers also incur ongoing costs for the company.

    Are you paying a monthly subscription?

    • If you are paying monthly for a tool, then chances are the company is working in your best interests since they will want to retain you as a customer.

    Is the company making money off your data?

    • Oftentimes a company will be making money off the data they collect from their customers. The data can be useful in aggregate, so the company might want to encourage lots of people to use the tool to be able to collect enough data to make it valuable.
    • This is not necessarily bad, but you need to be aware of this tradeoff.
    • I've never paid any money to Foursquare, but I continue to use the app and allow them to use my data because I get a lot of value out of their tool.

    Battery Life

    For portable devices, battery life is often a huge concern, and people have different preferences here as well. Some people are willing to charge a device every night, while others don't want to think about it for months.

    My FitBit battery lasts around 5-7 days. That's enough to go out of town for the weekend and not worry about bringing my charger. I also can't charge my FitBit at night, since I wear it to track my sleep.

    Related to battery life is what connector the device uses to charge. Is it a proprietary connector? That's usually less ideal since it can be more expensive to replace, and is harder to borrow chargers if you need. Thankfully most devices are moving towards charging via Micro USB or USB-C, so it's usually not too hard to find connectors.

    Competition

    Another thing to keep in mind is whether the tool you're considering has competition in the market. Are there other options you can switch to if this service shuts down, or if your device breaks?

    Are there other services that provide this or similar functionality? If not, you might end up "stuck" as a customer, and the company will have little incentive to improve things or reduce costs.

    Longevity

    What will happen to the device if the company shuts down?

    A Jawbone UP is completely tied to the company existing. The band has no interface, and requires syncing with their iOS or Android app to continue working. If the company shuts down their servers and unpublishes the app from the stores, the device will be nothing more than a decorative bracelet.

    A Withings scale will still be a scale if the servers shut down, since it actually has a visual display on the device.

    Portland, Oregon
    1 like 2 replies
    #qs #qs17 #quantifiedself
  • IndieWeb Summit 2017 Attendees

    Your final updates for IndieWeb Summit

    Thu, Jun 22, 2017 12:01pm -07:00

    Hello! Only a couple more days before we all come together for IndieWeb Summit! I am very much looking forward to the weekend, and I hope you all have a fantastic time while you're in Portland.

    Our guest list is all full up! If you won't be able to make it this weekend, please reply to this message and let me know. That will free up some space for people on the waiting list.

    Here are a few last pieces of information before we meet. You'll probably want to save this update so you can refer to it as you make your way to Portland for the festivities.

    Weather

    The good news is there's almost 0% chance it will rain this weekend. The bad news is it's going to be one of the hottest weekends of the year. The forecast is predicting a high of 95°F on Saturday and 98°F on Sunday! It does cool down to the low 60s in the evenings, so keep that in mind as you're packing.

    weather forecast for Portland

    Breakfast and Lunch

    We will be providing hot breakfast on Saturday and Sunday mornings thanks to our sponsors, and we'll have coffee available all day as well!

    Lunch on Sunday will be provided at Mozilla. We'll have a taco bar with gluten free, vegetarian and vegan options! Saturday lunch will be on your own at any of the nearby food carts.

    Friday Pre-Party

    On Friday evening, we'll be hosting a pre-party at Pine Street Market in downtown Portland starting at 5:30pm. Pine Street Market has a variety of food and drink options, including burgers and veggie burgers, ramen, breads and pizzas, as well as Salt & Straw ice cream, excellent coffee, and fantastic cocktails.

    We'll have drink tickets for everyone who has registered, but anyone is welcome to come to the pre-party!

    Getting Around

    Mozilla Portland is at 1120 NW Couch St. on the third floor. There will be a building security guard who will let people in in the mornings and can direct you to the right floor. The doors open at 9:00am on Saturday and Sunday.

    Public Transit

    There is a streetcar that runs North/South along 10th and 11th Avenues that will drop off right at Mozilla. You can buy a ticket with a credit card at any streetcar stop, or buy a ticket with cash on board. The fare is $2.

    TriMet runs the buses and light rail here. The cheapest way to get into town from the airport is to take the MAX red line which picks up at the airport and drops off at Pioneer Square downtown. It's $2.50 for a 2.5 hour ticket or $5 for an all-day ticket. Your TriMet ticket will also let you ride the streetcar!

    Remote Participation

    We will have people participating remotely who couldn't come to Portland this year! The main room will be set up for remote participation with cameras and screen sharing. More information and the relevant links are on the wiki.

    Join the Chat

    If you haven't already, join our Slack room or IRC channel (they're connected) and introduce yourself! We'll be using the chat during the event to take notes, share links, and communicate with the remote participants.

    Code of Conduct

    As a reminder, we have a Code of Conduct that applies to IndieWeb spaces both online and offline. Since the IndieWeb Summit is hosted at Mozilla, their Community Participation Guidelines apply to this event as well.

    Thanks to our Supporters

    I'd like to give a shout-out to our sponsors who make all of this possible! A huge thanks to Mozilla, Bridgy, Donut.js, CSVConf, as well as our monthly supporters and everyone who contributed specifically for this event!

    That's all for now! Looking forward to seeing everyone on Saturday!

    -- aaronpk

    Portland, Oregon
    1 like 1 reply 1 mention
    #indiewebcamp #indiewebsummit
  • For Sale: 1910 Charnelton St

    Fri, Jun 2, 2017 12:32pm -07:00

    The listing for my house in Eugene just went live! If you know anyone looking for a house in a great location, feel free to send them this listing!

    http://www.barbarawest.com/listing/OR/Eugene/1910-Charnelton-St-97405/58441596

    Portland, Oregon
    #charnelton
  • Micropub is a W3C Recommendation

    Tue, May 30, 2017 9:00am -07:00

    I'm excited to announce that Micropub is now a W3C Recommendation! It's been a long road, but we made it! This is the final stage in the W3C spec lifecycle, and means that the spec has gone through several stages of review, and all parts of the spec have been implemented by at least two people.

    Incubation and Selfdogfooding

    Micropub began in 2013 when I outlined a simple API to create blog posts and short notes for my website, and published it on the IndieWeb wiki. I implemented it on my server as well as in a few new client applications, and was quickly using it day-to-day. The main design goal with Micropub is that it should be easy to implement, and it should build on top of existing standards: OAuth 2.0 and the Microformats 2 vocabulary.

    Micropub is also intended to be implemented incrementally. Rather than having to read, understand, and implement the whole spec, you can start by implementing the basics of just creating simple text posts. You can later expand your implementation to support more complex objects, as well as editing and deleting posts.
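    Per the spec, creating a simple text post is just a form-encoded POST to the site's Micropub endpoint with an OAuth 2.0 bearer token. A minimal sketch (the endpoint URL and token here are placeholders; a real client discovers the endpoint from the site's micropub rel link):

```python
from urllib.parse import urlencode
from urllib.request import Request

# Placeholders: discover the real endpoint via the `micropub` rel link,
# and obtain the token through an OAuth 2.0 flow.
MICROPUB_ENDPOINT = "https://example.com/micropub"
TOKEN = "xxxxxxxx"

# h=entry selects the Microformats 2 h-entry vocabulary; the remaining
# keys are mf2 properties of the new post.
body = urlencode({
    "h": "entry",
    "content": "Hello World! Posting via Micropub.",
    "category[]": "indieweb",
})

req = Request(
    MICROPUB_ENDPOINT,
    data=body.encode(),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/x-www-form-urlencoded",
    },
    method="POST",
)
# urlopen(req) would return 201 Created with a Location header
# pointing at the URL of the newly created post.
```

    A client supporting only this request already implements a useful subset; editing and deleting can be layered on later.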

    Implementations and Interoperability

    One of the benefits of supporting Micropub on your server is that it allows you to leverage other people's work in building an interface to create posts on your own website. By 2014, there were already six independent server implementations, and four client implementations other than my own.

    Over the next few years, more and more people built out Micropub support in their blogging systems, including plugins for WordPress, Known, Kirby and more. Many client implementations have popped up as well, for a variety of platforms, including the Micro.blog iOS application, a Ruby web app, an XMPP chat bot, and more. I also wrote a Micropub proxy service, OwnYourGram, which imports your Instagram photos to your website, and another called OwnYourSwarm that does the same for Foursquare checkins.

    Micropub at the W3C

    In 2015, the Social Web Working Group decided to adopt the Micropub spec to fulfill the client-to-server API aspect of our charter. 

    After taking the initial spec from the IndieWeb wiki and writing it up in W3C format, I brought it to the working group as an Editor's Draft. After working on it within the group for a few months, we resolved to publish it as a Working Draft in January 2016. 

    W3C Spec Lifecycle

    Publishing a Working Draft is typically the first time the spec will be noticed by other groups within the W3C, as it's the first time a working group signals that a spec is intending to reach Recommendation status. Micropub went through 5 revisions of Working Drafts from January to July 2016, gathering feedback from implementers and incorporating it into the spec. The changes ranged from editorial clarifications, to changes that affect the implementations such as names of properties or response formats. You can see all of the changes in the Micropub working drafts here.

    At the point the working group is satisfied that the spec is in good shape, the group can request that the spec be promoted to a Candidate Recommendation. To do so, the group must demonstrate that the spec has received "wide review" from people outside of the working group. We used GitHub Issues to gather and track feedback, making it easy to show what feedback was received and how it was incorporated into the spec. Publishing a Candidate Recommendation requires the approval of the W3C director.

    Test Suite and Implementation Reports

    The spec then stays in the Candidate Recommendation stage for a minimum of four weeks. This time is used to gather feedback from other W3C member companies, as well as gathering implementation reports from anyone interested in the spec. Implementation reports are meant to collect information on which parts of the spec are being implemented, in order to get a sense of how mature the spec is and how well it's written. While the implementation report may look like a checklist you have to complete, it's totally fine to submit it with only a few boxes checked. It's more of an evaluation of the spec than an evaluation of your implementation.

    In the W3C Social Web Working Group, we set an intentionally high bar for our specs to graduate out of Candidate Recommendation. In order to be promoted from Candidate Recommendation to Proposed Recommendation, our group decided that each feature of the spec must be implemented by at least two independent implementations. This helps ensure that the spec really does encourage interop between implementations.

    I created a test suite, micropub.rocks, that you can use to test your client and server implementations. The test suite helps you fill out the implementation report, and also is a great tool for debugging your application as you're building it out.

    micropub.rocks

    If you're building a server, micropub.rocks will pretend to be a client and will send you requests that hit all the edge cases of the spec. If you're building a client, you can use micropub.rocks as a server to test how well your client can create and edit posts.

    Proposed Recommendation

    At the point the working group has determined that all features of the spec have been implemented, and that the spec has been reviewed by people outside the working group, the group can decide to request that the spec be promoted to Proposed Recommendation. The spec can't have any substantive changes between Candidate Recommendation and Proposed Recommendation, but things like typo fixes are okay. The W3C director must approve the request to transition to Proposed Recommendation. The Proposed Recommendation step is one last chance to broadcast widely that the W3C believes the spec is ready to publish as a Recommendation, and gives the W3C Advisory Committee an opportunity to formally object to any aspect of the spec.

    Micropub reached Proposed Recommendation status in April 2017.

    Recommendation

    The final step is to publish the spec as a W3C Recommendation. In order to be promoted from Proposed Recommendation to Recommendation, the W3C Advisory Committee and the Director must approve the request. Once a Recommendation, the spec is finished, and can only be changed by submitting Errata.

    Getting Started

    The micro.blog iOS application posting to my website via Micropub

    In addition to the test suite, there are already many clients and servers you can try out yourself! We've been documenting Micropub clients and servers on the IndieWeb wiki, so have a look around!

    If you're building a blogging platform, you can let your users choose from a wide variety of posting clients by implementing the Micropub spec.

    If you're building a posting client and want it to work with many different server backends instead of hard-coding it to Twitter or other proprietary APIs, implement the Micropub spec and you'll quickly have people eager to start using the app!

    I hope this post has been a useful overview of what I've been working on for the past several years, and gives you a bit of a sense of what it's like to work on specs at the W3C!

    74 likes 33 reposts 1 bookmark 10 replies 11 mentions
    #w3c #micropub #specs #indieweb
  • I'm Still Here!

    Fri, May 19, 2017 6:31pm +02:00

    I just added my favorite new relatively unimportant feature to my website!

    I added my Swarm checkins to my website a few weeks ago, which has been super fun. One thing I always thought was missing from Swarm was knowing whether a friend was still at a venue they checked in to. Since I have checkins on my own website, I can do that now!

    So now, when you're viewing a checkin on my site, if I am still at that location, you will see a pulsing blue dot next to the venue name!

    This works because my phone always tracks my GPS location and reports it to my website already. My website compares my last location with the last checkin, and if I'm still nearby, will show the dot!
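    The "still here" check boils down to a distance comparison between the last reported GPS point and the checkin venue. A hypothetical sketch (the haversine formula and the 500 m threshold are my assumptions, not necessarily what the site actually uses):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))  # Earth radius ~6371 km

def still_at_venue(last_gps, venue, threshold_m=500):
    """True if the most recent GPS point is within threshold of the venue."""
    return haversine_m(last_gps[0], last_gps[1], venue[0], venue[1]) <= threshold_m

venue = (45.5231, -122.6765)  # a spot in downtown Portland
print(still_at_venue((45.5245, -122.6750), venue))  # a few blocks away: True
print(still_at_venue((47.6062, -122.3321), venue))  # Seattle: False
```

    If the check passes, the page renders the pulsing blue dot next to the venue name.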

    Nürnberg, Bayern
    1 like
    #p3k #indieweb #checkins #gps
  • Maniac Morning May 4

    Thu, May 4, 2017 9:27pm -07:00

    I am on the fence about whether to count today's sprint towards this goal. I started this morning by diving into catching up on accounting in my three QuickBooks files, which is something I've been trying to get better about doing regularly since I finally filed my taxes on time for the first time in 6 years. I spent a little over 2.5 hours on the accounting, and managed to get all caught up. (Of course I didn't screenshot that for obvious reasons.) So on that front, it was a successful morning. However, it was also over 80 degrees today which is way hotter than my comfort level and we don't have A/C in the apartment. This led to a relatively unproductive afternoon, and I never got enough energy up to do anything more creative. So on that front, I didn't make progress on my more interesting projects today. Overall, at least I have another data point to reaffirm that I am definitely more productive in the mornings!

    Portland, Oregon
    #mmm
  • Maniac May Mornings

    Mon, May 1, 2017 2:34pm -07:00

    At the end of March, I finished my #100DaysOfIndieWeb and #100DaysOfMusic projects. It was a lot of fun to focus on those during the winter months, but I'm also glad to free up my days now that they are done. It turns out 100 days is a lot of days, and I don't think I'll do such a hard #100days project again. However I did enjoy the focus I was able to achieve, and enjoyed knowing every day what I needed to work on. So for the month of May, I'm starting a new, slightly different experiment.

    Along with @anomalily and @beeminder, I'm dedicating May to "Maniac Mornings". We've done some "Maniac Weekends" in the past, which is essentially a 2-day sprint trying to cram in as much focused time as possible, and is a shorter version of "Maniac Week" which is like locking yourself in the basement for a whole week with no outside distractions. With my upcoming travel to IndieWebCamp Düsseldorf and Nürnberg, and with the goal of not feeling completely overwhelmed or burnt out at the end, "Maniac Mornings" is an attempt to scale back the idea to a sustainable level.

    For the month of May, the goal is to spend 2+ hours every morning on focused and high-impact work, with no distractions. High-impact work is things that have a direct impact on my goals, which at the moment are getting the WebSub spec through the W3C process, updating and printing my OAuth 2.0 book, and a new as-yet-unnamed secret project.

    Part of the "Maniac" aspect is that I will also be recording a timelapse of my computer screen including a picture from my webcam!

    I'll try to post these every day, and I'll also stitch them together at the end of the month into one long video!

    The large clock on the screen is a small Electron app that embeds a tiny bit of JavaScript and CSS to render the clock. It also sets a background timer that runs the OSX "screenshot" command every 30 seconds, and saves the images to a folder. You can download "Maniac Clock" here!
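    The screenshot timer could be approximated outside Electron too. A rough sketch, assuming macOS's `screencapture` CLI (presumably what the "screenshot" command refers to); the folder name and filename scheme are my own invention:

```python
import subprocess
import time
from datetime import datetime
from pathlib import Path

OUT_DIR = Path("timelapse")  # hypothetical folder for the captured frames
INTERVAL_SECONDS = 30        # matches the post's 30-second timer

def frame_path(now=None):
    """Timestamped filename so the frames sort chronologically."""
    now = now or datetime.now()
    return OUT_DIR / now.strftime("frame-%Y%m%d-%H%M%S.png")

def capture_loop():
    """Capture the screen every INTERVAL_SECONDS, forever."""
    OUT_DIR.mkdir(exist_ok=True)
    while True:
        # -x suppresses the shutter sound; macOS only.
        subprocess.run(["screencapture", "-x", str(frame_path())], check=True)
        time.sleep(INTERVAL_SECONDS)
```

    The resulting frames can then be stitched into a timelapse video with a tool like ffmpeg.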

    Portland, Oregon
    #maniac #mmm #productivity
  • In reply to tantek's GitHub issue #8, "Need use-cases section" (https://github.com/tantek) · Jun 22

    Some thoughts on the XRay and jf2 JSON formats

    Mon, Apr 24, 2017 8:59pm -07:00

    Since beginning the jf2 spec, I've continued developing XRay, and its format has diverged from the original jf2. Tonight I spent a while trying to reconcile the changes to submit a PR to the spec. I was unable to come up with a short PR, and instead got drawn into thinking about the motivations behind a simpler mf2 JSON format in the first place.

    I use XRay in a number of projects for various purposes.

    • My website runs every external URL through XRay to handle consuming the Microformats on the page, converting it to a simplified form. This is used whenever I reply to a post to display the reply context, as well as to fetch the post contents when I make a repost.
    • Loqi uses XRay to create a one-line summary of URLs pasted into IRC.
    • webmention.io uses XRay to parse the source URL of webmentions to extract useful data about the webmention, and makes this data available via an API.
    • IndieNews uses XRay to parse submitted URLs to display the name and author of the posts.
    • Quill uses XRay to show a preview of in-reply-to URLs.
    • My rudimentary reader uses XRay to extract the h-entry data from posts to display in my reader.

    There are a number of things that XRay does when extracting the mf2 data.

    • Finds the author of a post following the authorship algorithm
    • Follows the comments presentation algorithm to remove the name property if it's a duplicate of the content.
    • Figures out the primary object on the page, or whether the page represents a list of posts, which is sometimes tricky. (some discussion on representative object)
    • Is vocabulary-aware, so always returns a consistent set of properties, and doesn't return unknown properties. e.g. published is always a single string, and category is always an array.
    • Sanitizes all HTML, allowing only a small subset of HTML tags and Microformats classes on the HTML elements.
    • For any values that might be embedded objects, e.g. a person-tag or in-reply-to property, always returns the URL in the value and moves the embedded object to a refs object, making it easier to consume.
    • The author property is a simplified h-card containing only name/photo/url properties that are single values.
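    A toy illustration of that kind of vocabulary-aware cleanup (the property sets and the function are my own sketch of the idea, not XRay's actual code):

```python
# Which properties are single-valued vs. always an array is a hard
# vocabulary decision; these sets follow the examples in the text above.
SINGLE = {"published", "uid", "location"}
PLURAL = {"category", "syndication"}

def normalize(properties):
    """Flatten mf2 JSON's always-array values into a consistent shape."""
    out = {}
    for prop, values in properties.items():
        if prop in SINGLE:
            out[prop] = values[0]     # e.g. published is always one string
        elif prop in PLURAL:
            out[prop] = list(values)  # e.g. category is always an array
        # unknown properties are dropped, keeping the output predictable
    return out

mf2 = {
    "published": ["2017-04-24T20:59:00-07:00"],
    "category": ["indieweb"],
    "x-unknown": ["ignored"],
}
print(normalize(mf2))
# {'published': '2017-04-24T20:59:00-07:00', 'category': ['indieweb']}
```

    Consumers then never have to check whether a property is a one-element array or a bare value.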

    As you can see, a lot of what XRay is doing is cleaning up some of the "messy" parts of Microformats JSON. It's not necessarily about the specific JSON format, but more about the overall structure, such as how the author of a post can be in many different places in a parsed Microformats JSON object. This is not to place blame on Microformats, since what it's doing is creating a JSON representation of the original HTML, and allowing authors flexibility in how they publish HTML rather than prescribing specific formats is a core principle.

    What this means is XRay is actually acting more as an interpreter of the Microformats JSON, in order to deliver a cleaned-up version to consumers. Most of my projects that use XRay could actually be considered "clients", such as how I use XRay to parse posts for my reader, whether that's output to me in IRC or re-rendered as a post on IndieNews.

    My primary need for an alternative Microformats JSON format is actually a client-to-server serialization, where the client is getting a cleaned up version of external posts, and can assume that the server it's talking to is responsible for taking the messy data and normalizing it to something it expects. In this sense, the use case of jf2 is a client-to-server serialization, whereas the Microformats JSON is a server-to-server serialization. This would then be a core building block for Microsub, a spec that provides a standardized way for clients to consume and interact with feeds collected by a server.

    The main current challenge in defining a spec for this use case is how tied to specific vocabularies it should be. For example, Microformats JSON says that every value should always be an array. However, there are a few properties for which it never makes sense to have multiple values, e.g. published, uid, and location, and wrapping them in arrays adds complexity for consumers. These are easier to consume when they can be relied upon to always be a single value. Similarly, the author of an h-entry may be either an object or a string, which makes it more complicated to consume since the shape can vary, so XRay's format always returns a consistent value. However, this is tied to the h-entry vocabulary, since other Microformats vocabularies don't have an author property. In general, the success I've had with XRay's format is due to the fact that it makes hard decisions about which properties it returns, and is consistent about whether those properties are single- or multi-valued, in order to provide a consistent API to consumers.
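
    As a rough sketch of what that normalization looks like (the property lists here are hypothetical examples, not XRay's actual rules), always-single properties collapse to one value, always-multi properties stay lists, and unknown properties are dropped:

```python
# Hypothetical property lists for illustration; the real rules live in XRay's source.
SINGLE_VALUED = {"published", "uid", "name", "url"}
MULTI_VALUED = {"category", "photo", "syndication"}

def normalize(mf2_properties):
    """Collapse always-single properties to one value, force always-multi
    properties to lists, and drop unknown properties entirely."""
    out = {}
    for prop, values in mf2_properties.items():
        if prop in SINGLE_VALUED:
            out[prop] = values[0]        # e.g. published is always one string
        elif prop in MULTI_VALUED:
            out[prop] = list(values)     # e.g. category is always an array
        # unknown properties are intentionally not returned
    return out

post = normalize({
    "published": ["2017-04-24T20:59:00-07:00"],
    "category": ["jf2", "xray"],
    "x-unknown": ["ignored"],
})
```

    The cost of this approach is exactly the vocabulary coupling described above: the SINGLE_VALUED/MULTI_VALUED decision has to be made per property, per vocabulary.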

    I am just not sure how to balance providing that simplicity for consuming clients with allowing flexibility in publishing, all without hard-coding so much into the spec that it gets obsoleted later.

    Portland, Oregon
    1 mention
    #jf2 #xray #indieweb
    Mon, Apr 24, 2017 8:59pm -07:00
  • Homebrew Website Club PDX 2017

    Mon, Apr 3, 2017 3:48pm -07:00

    I'm super excited to announce that DreamHost will be hosting our Homebrew Website Club PDX meetups for the rest of 2017! We've got all the dates planned out, so put them on your calendar! 

    • April 5th
    • May 10th
    • June 7th
    • July 12th
    • August 2nd
    • September 13th
    • October 4th
    • November 8th
    • December 6th

    The meetups are from 5:30-7:30pm at the DreamHost Portland office, at 621 Southwest Morrison St, 14th floor.

    You can see all the events on the IndieWeb wiki and on Calagator.

    Portland, Oregon
    #indieweb #hwc
    Mon, Apr 3, 2017 3:48pm -07:00
  • Day 100: A Website for my 100 Days of Music #100DaysOfIndieWeb

    Thu, Mar 30, 2017 2:42pm -07:00

    I wanted a way to quickly browse and share songs from my 100 Days of Music, so I thought I would make a page with links to each song. Clicking on any of the tracks opens up a video player with the description of the song.

    I am curious to find out which songs people like the best, so to start with, I added some Google Analytics code to the page. I track events for each time a video is started, paused, when the video finishes playing completely, and if a video was interrupted by starting another. Hopefully this will provide some interesting data over time.

    I want to add a more robust feedback mechanism, possibly even a simple "heart" button people can click, but I'm not sure how I want to handle that yet so that will have to wait until later.

    I've made download links available for each track, in case you want to use these songs in your own projects! At the bottom of the page you'll see the copyright notice. I've decided to make these all available via a Creative Commons Attribution license, so feel free to use them for various projects! All I ask is that you let me know when you've used a song, preferably by writing a post about it and linking to the track on my website! That way it will show up as a comment on my post, like Marty's podcasts!

    https://aaronparecki.com/2016/12/29/21/day-9

    The URL of the website is:

    100.aaronparecki.com

    and because emoji are fun, I made an emoji subdomain redirect to it as well, for no practical purpose:

    💯🎶.aaronpk.com
    Portland, Oregon
    1 repost 1 reply 1 mention
    #100daysofindieweb #100daysofmusic
    Thu, Mar 30, 2017 2:42pm -07:00
  • Day 99: Making Micropub implementation reports more discoverable #100DaysOfIndieWeb

    Wed, Mar 29, 2017 9:46pm -07:00

    The Micropub implementation report summaries had gotten kind of scattered around various URLs, so today I cleaned it up and consolidated everything. I also added the number of submitted reports to the home page, along with links, so that they are much easier to find.

    I am making the report summaries all live under micropub.rocks, rather than be split between micropub.rocks and micropub.net. I updated the URL structure for the summaries on micropub.rocks to be more consistent:

    • https://micropub.rocks/implementation-reports/servers/
    • https://micropub.rocks/implementation-reports/clients/

    The corresponding URLs on micropub.net now redirect to micropub.rocks. I also added a header bar on the spreadsheet views so that you can navigate back to the home page as well as to the other set of reports while viewing the spreadsheet.

    Hopefully this makes things a little easier to find!

    Portland, Oregon
    1 mention
    #100daysofindieweb #micropubrocks
    Wed, Mar 29, 2017 9:46pm -07:00
  • Day 98: Importing Past Checkins with OwnYourSwarm #100DaysOfIndieWeb

    Tue, Mar 28, 2017 4:58pm -07:00

    I normally don't like to launch a feature that's this rough around the edges, but I decided to anyway. I added a section to the OwnYourSwarm dashboard that will let you import a specific checkin by its Foursquare checkin ID. 

    When you click "Import", the processing flow for that checkin is started in the background. There is unfortunately no feedback in the UI on its progress yet, but in a couple of seconds you should see the checkin appear at your website. Shortly after, any coins, likes, and comments are sent via Webmention as well.

    This is mostly laying the groundwork for adding the ability to backfill checkins that were made via the "offline" checkin feature, as well as the ability to do a mass import of your checkin history.

    In the meantime, you can at least enter a checkin ID manually to trigger an import if any were missed because they were "offline" checkins.

    Portland, Oregon
    1 reply 1 mention
    #100daysofindieweb #ownyourswarm #swarm
    Tue, Mar 28, 2017 4:58pm -07:00
  • Day 97: Updated Known's Micropub Support #100DaysOfIndieWeb

    Mon, Mar 27, 2017 3:33pm -07:00

    I think this is the first time in the 100days project that I've worked on a project that is not my own! Today I added support for JSON requests to Known's Micropub endpoint. I also added support for JSON checkins that OwnYourSwarm sends.

    I tried writing as little code as I could, and changing as little as possible about how it worked, so essentially I am just extracting the properties it knows about from the JSON request into the variables the plugin expects to find. This does mean that a few Micropub JSON features are still not supported, such as HTML content (Known seems to strip HTML tags from all content sent to it) and arbitrary nested JSON objects, which Known has no mechanism for storing. However, I was able to get it to pass tests 200, 201 and 203 from the micropub.rocks test suite, which is enough for basic support.
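
    The general approach can be sketched like this (a simplified illustration, not Known's actual code; the property list and function name are made up): take the Micropub JSON payload, where every value is wrapped in an array, and flatten the known properties into the variables a form-encoded-only code path expects.

```python
def micropub_json_to_form_vars(request):
    """Map a Micropub JSON payload onto the flat variables a
    form-encoded-only plugin expects (first value of each known property)."""
    props = request.get("properties", {})
    known = ("name", "content", "category", "photo")
    flat = {}
    for prop in known:
        if prop in props:
            values = props[prop]
            # Micropub JSON wraps every value in an array; category stays a list
            flat[prop] = values if prop == "category" else values[0]
    return flat

flat = micropub_json_to_form_vars({
    "type": ["h-entry"],
    "properties": {"content": ["Hello world"], "category": ["test"]},
})
```

    Anything outside the known list is simply ignored, which is why nested objects and other JSON-only features fall through.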

    It can also create checkins from the payload that OwnYourSwarm sends! I made it download the photo attached to a checkin rather than hotlink the Foursquare image URL.

    Since I don't have commit access to the Known repo, I sent a pull request to Known with these changes. I tested everything with a local Known installation. Hopefully benwerd or mapkyca can merge the PR soon!

    Hopefully this improves people's experience using tools like OwnYourSwarm and OwnYourGram with Known!

    Portland, Oregon
    2 likes 2 replies 1 mention
    #100daysofindieweb #known #micropub #ownyourswarm
    Mon, Mar 27, 2017 3:33pm -07:00
  • Day 96: Documentation for OwnYourSwarm #100DaysOfIndieWeb

    Sun, Mar 26, 2017 1:52pm -07:00

    Today I wrote up documentation on OwnYourSwarm. It actually took quite a bit longer than I expected to write everything up. The documentation walks through each component:

    • Authentication
    • Checkins
    • Photos as a Micropub update
    • Receiving coins via Webmention
    • Receiving backfed likes and comments

    Rather than repeat any of the information here, I will just send you off to read the docs! Please let me know if you have any questions! I hope to see some more implementations of people receiving checkins via Micropub soon!

    Portland, Oregon
    1 like 1 mention
    #100daysofindieweb #micropub #ownyourswarm
    Sun, Mar 26, 2017 1:52pm -07:00
  • Day 95: Backfeeding Comments and Likes from Swarm #100DaysOfIndieWeb

    Sat, Mar 25, 2017 10:45am -07:00

    I'm pretty excited to say that OwnYourSwarm is now backfeeding likes and comments from Swarm checkins!

    Thankfully, the Foursquare API is well documented, and has quite reasonable rate limits. It also seems to have a well-documented change policy, so is unlikely to arbitrarily change out from under me. I'm hoping this backfeed feature will be relatively stable.

    Like Bridgy, I implemented proxy pages for individual likes and comments on Swarm. The page is marked up with h-entry and includes the author name, photo, and URL, as well as the comment text. Swarm also has the ability to send "sticker comments", which I render as an <img> tag in the comment body.

    Regular comments look like you'd expect.

    Likes look similar, and have fallback text in the comment body.

    I took advantage of specific behavior I've seen on my checkins in order to build a polling schedule that won't overload my server. For the most part, people only like and comment on recent checkins. After a couple days, a checkin is unlikely to get any new comments.

    When a new checkin is posted, the user's polling interval is reset. OwnYourSwarm will check for new responses after 30 seconds. If none are found, it will wait 60 seconds, then 2 minutes, 5 minutes, 30 minutes, 1 hour, then finally a few more long-term tiers: 1 day, 2 days, 7 days, 14 days, 30 days. Of course as soon as you post a new checkin, your polling interval will be reset to 30 seconds and will start the cycle over. This hopefully will provide a good balance between quickly sending feedback for recent checkins, while also finding feedback on older checkins as well.
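
    The schedule above can be sketched as a simple tiered backoff (a minimal illustration, not OwnYourSwarm's actual code): each poll moves to the next, longer interval, and a new checkin resets to the start.

```python
# Polling tiers from the post: 30s, 60s, 2m, 5m, 30m, 1h, 1d, 2d, 7d, 14d, 30d
TIERS = [30, 60, 120, 300, 1800, 3600,
         86400, 2 * 86400, 7 * 86400, 14 * 86400, 30 * 86400]

class PollSchedule:
    """Back off through the tiers; a new checkin resets to the start."""
    def __init__(self):
        self.tier = 0

    def next_delay(self):
        delay = TIERS[self.tier]
        if self.tier < len(TIERS) - 1:
            self.tier += 1          # move to the next, longer interval
        return delay

    def reset(self):
        self.tier = 0               # a new checkin restarts the cycle

s = PollSchedule()
first, second = s.next_delay(), s.next_delay()
s.reset()
after_reset = s.next_delay()
```
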

    The nice thing about the Foursquare API is they provide an endpoint for retrieving the last N checkins for a user, and the data returned includes the number of likes and comments. This means I only need to hit one API endpoint to retrieve the last 100 checkins and can tell if there is new activity on any of them. I then make another API request to retrieve the checkin details only when there are new comments.
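
    In pseudocode form (a hypothetical sketch of the diffing step, not the real implementation), the cheap check is just comparing the like/comment counts against what was seen last time, and only the changed checkins get a detail request:

```python
def checkins_with_new_activity(recent, seen_counts):
    """Given the last N checkins (each with like/comment counts) and the
    counts we saw last time, return the IDs worth fetching in detail."""
    changed = []
    for c in recent:
        total = c["likes"] + c["comments"]
        if total > seen_counts.get(c["id"], 0):
            changed.append(c["id"])   # new activity since last poll
        seen_counts[c["id"]] = total
    return changed

seen = {"abc": 2}
new = checkins_with_new_activity(
    [{"id": "abc", "likes": 2, "comments": 1},   # one new comment
     {"id": "def", "likes": 0, "comments": 0}],  # no activity
    seen,
)
```
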

    Portland, Oregon
    3 replies 1 mention
    #100daysofindieweb #ownyourswarm #swarm #backfeed
    Sat, Mar 25, 2017 10:45am -07:00
  • Day 94: Webmentions for Coins in OwnYourSwarm #100DaysOfIndieWeb

    Fri, Mar 24, 2017 2:01pm -07:00

    OwnYourSwarm will now send webmentions for all the coins that Swarm awards to your checkins!

    Here's a checkin on Swarm:

    Here's how it looks on my website:

    OwnYourSwarm creates a web page for each coin award on your checkin, then sends a webmention to your post!

    Here's what one of the comments above looks like on the OwnYourSwarm web page:

    Of course it's marked up with the Microformats2 h-entry, so that my website can parse out the icon, text and number of coins!

    To get my website to recognize the number of coins awarded, I used a vendor-specific Microformats2 property, "p-swarm-coins". Based on the recommendations for vendor-specific properties in Microformats2, I chose the prefix "swarm" and the property "coins".

    This is consumed by p3k, and added to a new property of the comment. I had to also add support for this property to XRay and webmention.io since they are part of the chain of how I receive comments.

    Now I'm excited about getting points on Swarm again!

    Portland, Oregon
    2 replies 1 mention
    #100daysofindieweb #checkins #swarm #ownyourswarm
    Fri, Mar 24, 2017 2:01pm -07:00
  • Day 93: Polling for Photos in OwnYourSwarm #100DaysOfIndieWeb

    Thu, Mar 23, 2017 11:51am -07:00

    An interesting feature of the Swarm app is how it handles photos uploaded to checkins. If you check in and attach a photo, the checkin is actually created before the photo is uploaded. If you're on a spotty Internet connection, you'll see this because your checkin will exist and you'll get points for it, but there won't be a photo yet. The app will then continue to upload the photo separately, retrying if it fails. This is actually a really great app design on the part of Foursquare, but it does make working with the API a bit tricky.

    Since OwnYourSwarm uses Foursquare's realtime API, it will receive a POST request almost immediately, often before the photo exists at the API. This means the initial Micropub request might be missing the photo.

    Today I made OwnYourSwarm send a Micropub update request to update your post after the photo is uploaded. When you post a checkin, if there is no photo, then OwnYourSwarm queues a background job on a 15-second delay. It will then check after 15 seconds to see if a photo exists, and sends an update request with the photo URL if so. If there is still no photo after 15 seconds, it will wait another 30 seconds and try again. This continues for the following schedule: 15 seconds, 30 seconds, 1 minute, 2 minutes, 5 minutes, 10 minutes, 30 minutes, 1 hour. We'll see if this is too much polling, but the rate limits on Foursquare are relatively high (500 requests per user per hour). This does mean that every checkin that intentionally has no photo will be requested from Foursquare 8 times.
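
    One run of that background job can be sketched like this (a simplified illustration, not the actual implementation; the Micropub update payload is abbreviated and would normally also carry the post's url): either a photo now exists and an update is sent, or the job re-queues itself with the next delay until the schedule runs out.

```python
PHOTO_RETRY_DELAYS = [15, 30, 60, 120, 300, 600, 1800, 3600]  # seconds

def process_photo_job(attempt, fetch_photo_url):
    """One run of the background job: if a photo now exists, return a
    (simplified) Micropub update payload; otherwise return the next
    delay to re-queue with, or None when the schedule is exhausted."""
    url = fetch_photo_url()
    if url:
        return {"action": "update", "add": {"photo": [url]}}
    if attempt + 1 < len(PHOTO_RETRY_DELAYS):
        return PHOTO_RETRY_DELAYS[attempt + 1]   # re-queue with next delay
    return None                                   # give up after 8 tries

no_photo = process_photo_job(0, lambda: None)
found = process_photo_job(3, lambda: "https://example.com/photo.jpg")
```
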

    I had originally planned on using this same polling schedule to later pull back responses to your checkins (likes, and comments), but Ryan pointed out that I can probably use a simpler and more efficient polling schedule since the Foursquare API provides a method to return the last N checkins.

    Portland, Oregon
    1 like 1 mention
    #100daysofindieweb #micropub #ownyourswarm #foursquare #swarm
    Thu, Mar 23, 2017 11:51am -07:00

Hi, I'm Aaron Parecki, co-founder of IndieWebCamp. I maintain oauth.net, write and consult about OAuth, and am the editor of the W3C Webmention and Micropub specifications, and co-editor of WebSub.

I wrote 100 songs in 100 days! I've been tracking my location since 2008, and write down everything I eat and drink. I've spoken at conferences around the world about owning your data, OAuth, quantified self, and explained why R is a vowel.

© 1999-2017 by Aaron Parecki. Powered by p3k. This site supports Webmention.
Except where otherwise noted, text content on this site is licensed under a Creative Commons Attribution 3.0 License.