Interaction with foreign sites
Subscribe to arbitrary RSS/Atom feeds
For now, it seems that I can subscribe only to Fediverse sites. Although Plume exports local feeds as Atom feeds for use by others, I cannot see a way to subscribe to an arbitrary feed.
What I want is a simple form where I could cut’n’paste or drag’n’drop a URL to any page which has
<link rel=alternate> in its header, and entries from this feed would appear in my subscription feed.
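The discovery step can be sketched with nothing but the standard library: scan a page for <link rel="alternate"> elements pointing at RSS/Atom feeds. A minimal sketch, not how Plume would actually implement it:

```python
# Minimal feed autodiscovery: collect <link rel="alternate"> feed URLs
# from an HTML page. Uses only the Python standard library.
from html.parser import HTMLParser

FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

class FeedLinkFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        if a.get("rel") == "alternate" and a.get("type") in FEED_TYPES:
            self.feeds.append(a.get("href"))

def discover_feeds(html: str) -> list:
    parser = FeedLinkFinder()
    parser.feed(html)
    return parser.feeds

page = '<html><head><link rel="alternate" type="application/atom+xml" href="/feed.atom"></head></html>'
print(discover_feeds(page))  # ['/feed.atom']
```

A real implementation would also resolve relative `href` values against the page URL and handle pages with several alternate feeds.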
Cross-post to LJ/DW
Although LiveJournal and Dreamwidth can import arbitrary RSS feeds, these feeds are not first-class citizens there. So, to keep my current audience, I want all my posts to the Plume blogs I’ve configured this way to be automatically cross-posted to my accounts on these sites.
Probably it can be done as a completely separate piece of software, where I’d enter my LJ login/password and subscribe to my blogs, as if this piece of software were another fediverse node. But see the next point.
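The cross-posting side of such a tool could use LiveJournal’s old XML-RPC interface. A sketch under assumptions: the endpoint path (`interface/xmlrpc`) and the `LJ.XMLRPC.postevent` method come from LJ’s historical API docs and should be verified against current documentation before use.

```python
# Sketch of cross-posting via LiveJournal's XML-RPC interface.
# Endpoint and method names are assumptions from historical LJ docs.
import datetime
import xmlrpc.client

def build_postevent_args(user, password, subject, body, when=None):
    """Build the argument struct for LJ.XMLRPC.postevent."""
    when = when or datetime.datetime.now()
    return {
        "username": user,
        "password": password,  # LJ also supports challenge-response auth
        "ver": 1,
        "event": body,
        "subject": subject,
        "year": when.year, "mon": when.month, "day": when.day,
        "hour": when.hour, "min": when.minute,
    }

def crosspost(args):
    # Network call, shown for illustration only; not executed here.
    server = xmlrpc.client.ServerProxy("https://www.livejournal.com/interface/xmlrpc")
    return server.LJ.XMLRPC.postevent(args)

args = build_postevent_args("someuser", "secret", "Hello", "First cross-posted entry")
print(args["subject"])  # Hello
```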
Import comments from LJ/DW
LJ provides an API to download all the comments in a blog. Of course I want to federate the discussion under my posts in one place. So I think it is a good idea to periodically poll LJ for new comments and then put them into the Plume instance as comments by users from the livejournal.com site (or dreamwidth.org), pretending that these are just other fediverse nodes. Things can get more complicated when people write comments in LJ from their Google or Facebook accounts, because LJ creates proxy accounts named extNNNN for them, and it requires some effort to find out which external user
@extNNNN@livejournal.com really is.
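The mapping from exported comments to fediverse-style handles could look like the sketch below. The XML shape here is a simplified stand-in for LJ’s real comment export format, and the field names are assumptions for illustration:

```python
# Sketch: turn exported LJ comment XML into fediverse-style actor
# handles, flagging extNNNN proxy accounts for later resolution.
# The XML schema below is invented for the demo.
import re
import xml.etree.ElementTree as ET

PROXY_RE = re.compile(r"^ext_?\d+$")

def comments_to_actors(xml_text, host="livejournal.com"):
    out = []
    for c in ET.fromstring(xml_text).iter("comment"):
        user = c.get("poster")
        out.append({
            "actor": f"@{user}@{host}",
            "is_proxy": bool(PROXY_RE.match(user)),  # Google/Facebook proxy account
            "body": (c.text or "").strip(),
        })
    return out

sample = """
<comments>
  <comment poster="goodfriend">Nice post!</comment>
  <comment poster="ext_12345">Commented via Google</comment>
</comments>
"""
actors = comments_to_actors(sample)
print([a["actor"] for a in actors])
```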
Allow commenting with OpenID or OAuth
Of course OpenID is outdated and broken by the deliberate effort of Google and some other big companies. But it is still supported on some big sites, and I think it is a nice thing to let people who do not have fediverse accounts, but have accounts on these sites, comment in Plume blogs. Or rather, give the blog owner the ability to decide whether one wants such commenters in one’s blog.
With OAuth things are more complicated. OAuth requires the application to register with the authentication provider. And since I don’t have a Facebook account, and have no intention to create one, I cannot just let Facebook users comment in my blog.
But there are OAuth integrators, such as Loginza.
Import old posts from LiveJournal/DW
I have a backup of my LiveJournal posts since I first created the journal (thankfully, there is an API to download posts). So eventually I want to move my blog with all its history to my server. This requires the ability to create articles with creation dates in the past and some tool to bulk-load thousands of articles with hundreds of thousands of comments.
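From the database side, the key requirement is that the creation date comes from the backup, not from now(). A tiny sketch with an invented schema (not Plume’s actual tables):

```python
# Sketch of bulk-importing backdated posts. Table and column names
# are made up for the demo, not Plume's actual schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (title TEXT, creation_date TEXT)")

# A real importer would loop over thousands of entries from the LJ
# backup; creation_date is taken from the backup, not the current time.
backup_entries = [
    ("My first LJ post", "2003-04-01T12:00:00"),
    ("A later post", "2010-09-15T08:30:00"),
]
conn.executemany("INSERT INTO posts VALUES (?, ?)", backup_entries)

rows = conn.execute(
    "SELECT title FROM posts ORDER BY creation_date").fetchall()
print([r[0] for r in rows])  # ['My first LJ post', 'A later post']
```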
With such a load, SQLite doesn’t seem to be a good choice of database. So a migration path from SQLite to PostgreSQL is also needed. I have some experience with migrating projects from SQLite to PostgreSQL, and I know that there are some traps there. In my case it was a project initially developed for SQLite, and moving it to Postgres, with its much stricter type system, involved some code fixes. Plume is developed for both PostgreSQL and SQLite from the start, so it might already be prepared.
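One concrete example of such a trap: SQLite column types are only affinities, so it happily stores a string in an INTEGER column, while PostgreSQL would reject the same INSERT with a type error. Such rows have to be found and cleaned up before migration:

```python
# SQLite accepts type-mismatched values that PostgreSQL would reject.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE likes (post_id INTEGER, count INTEGER)")
conn.execute("INSERT INTO likes VALUES (1, 'not a number')")  # accepted!

value = conn.execute("SELECT count FROM likes").fetchone()[0]
print(repr(value))  # 'not a number'
```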
Feed page with full article text
Currently the feed of my blog displays only the titles and subtitles of articles. In LiveJournal the full text of a post is displayed, but the user has the option to hide some lengthy parts using
<lj-cut> tags. Spoilers can be implemented by putting part of the text into a collapsible element such as HTML <details>.
With current internet bandwidth, even on mobile devices, downloading a few more kilobytes of text is not a problem; the problem is scrolling through dozens of paragraphs.
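A converter from lj-cut markup to collapsible HTML could be as small as a regular expression. A sketch that handles only the paired `<lj-cut>…</lj-cut>` form (old standalone cuts would need extra handling):

```python
# Sketch: convert paired <lj-cut> markers into the standard HTML
# <details>/<summary> element, which browsers render as collapsible text.
import re

CUT_RE = re.compile(
    r'<lj-cut(?:\s+text="([^"]*)")?>(.*?)</lj-cut>', re.DOTALL)

def convert_cuts(html: str) -> str:
    def repl(m):
        summary = m.group(1) or "Read more"
        return f"<details><summary>{summary}</summary>{m.group(2)}</details>"
    return CUT_RE.sub(repl, html)

src = 'Intro. <lj-cut text="Long part">Many paragraphs...</lj-cut>'
print(convert_cuts(src))
```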
More tunable theme
As far as I have read the source of Plume themes, there are too few options which a theme author can adjust. For instance, there is only one background and one foreground color. At the very least I want to see margins painted in a color other than the text background, and maybe comments styled differently from the post text, and so on.
Interaction with external crawlers
For now Plume doesn’t serve a
robots.txt file at all. But it can be essential.
Suppose you are running your instance on a Raspberry Pi or something alike, and then Bingbot comes and brings your system to its knees.
Or you might want to not allow your blog to be indexed by big search engines at all.
So there should be a way to configure robots.txt. A nice web interface is optimal, but at the very least there should be the ability to upload it as a media file and then serve it from the site root.
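For the Raspberry Pi scenario above, the served file could look something like this (note that `Crawl-delay` is honoured by Bing but ignored by Google, and user-agent names should be checked against each crawler’s documentation):

```text
# Slow down Bing's crawler on a small instance
User-agent: bingbot
Crawl-delay: 10

# Or opt out of indexing by a particular bot entirely
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
```

Whatever the configuration mechanism, the file must end up reachable at /robots.txt, since that is the only location crawlers look at.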
Support for external search engines
It seems that the built-in search engine of Plume is quite simplistic. But there are sophisticated open-source search engines, such as Apache Lucene or Xapian Omega, which can be installed alongside Plume.
It should be simple to add a configuration option to Plume to delegate search and indexing operations to an external tool.
As of version 0.6.0, only Japanese is specially supported by Plume’s search. As a Russian, I of course want Russian supported. As Russian is an inflected language, a word normalizer is needed. I’m new to Rust and don’t know how well the Rust tokenizer library handles Russian word normalization, but as far as I know it is not so simple. So maybe a port of Snowball, or something which reads Hunspell affix files, is needed. (These two approaches are used in PostgreSQL, which I’m quite familiar with.)
Better media gallery
It seems that the media gallery is just a plain list of all uploaded images, and when setting a blog banner or blog icon I have to scroll through the entire list. What if I write to this blog for ten or more years and have thousands of images in my gallery?
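The fix for the gallery problem above is ordinary pagination: fetch images one page at a time instead of listing everything. A sketch with an invented schema (not Plume’s actual tables):

```python
# Sketch: paginate a media gallery with LIMIT/OFFSET instead of
# listing all rows at once. Schema is invented for the demo.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE medias (id INTEGER PRIMARY KEY, filename TEXT)")
conn.executemany("INSERT INTO medias (filename) VALUES (?)",
                 [(f"img{i:04}.jpg",) for i in range(1, 101)])

PAGE_SIZE = 12

def gallery_page(page: int):
    return [r[0] for r in conn.execute(
        "SELECT filename FROM medias ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, (page - 1) * PAGE_SIZE))]

print(gallery_page(1)[0], gallery_page(9)[-1])  # img0001.jpg img0100.jpg
```

For very large galleries, keyset pagination (`WHERE id > last_seen`) scales better than OFFSET, but the idea is the same.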
The features listed above require a lot of time to implement. I’m not sure I alone would be able to write them all. But it is worth trying.