momijizukamori: (dreamsheep | styles)
[personal profile] momijizukamori
Writing this down for future documentation purposes because I don't feel like fighting with the wiki today.

1) Create a new folder in $LJHOME/htdocs/img/mood with a short but relevant name
2) Put all the images for the new moodtheme in the new folder. Each image filename should be in the format <emotion>.<file extension>, with the emotion in lowercase, e.g. 'cheerful.png'. There is a list of valid emotion names on the wiki
3) In a commandline, run $LJHOME/bin/misc/mood-maker.pl --dir <dir name from step 1> --name "<visible title for mood theme>" --desc "<short description of mood theme>". Neither the name nor the description should contain any : characters. This will output formatted text, or an error if there's a file that doesn't match one of the available preset moods.
4) Take the formatted output from (3), including the line beginning with MOODTHEME, and paste it at the end of $LJHOME/bin/upgrading/moods.dat
5) Update the database, following the regular dev maintenance update steps.
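The steps above, sketched as a shell session. The theme folder name "sheep", the image source path, and the theme title/description are all made up for illustration; the paths and flags are from the steps above:

```shell
# 1-2) Create the folder and copy in the images (cheerful.png, happy.png, ...)
mkdir "$LJHOME/htdocs/img/mood/sheep"
cp ~/sheep-icons/*.png "$LJHOME/htdocs/img/mood/sheep/"

# 3) Generate the moods.dat entry; writing to a temp file first lets you
#    inspect the output (and catch any errors) before touching moods.dat
"$LJHOME/bin/misc/mood-maker.pl" --dir sheep \
    --name "Dreamsheep" --desc "Sheep for every mood" > /tmp/sheep-moods.txt

# 4) Append the formatted output (including the MOODTHEME line)
cat /tmp/sheep-moods.txt >> "$LJHOME/bin/upgrading/moods.dat"
```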
momijizukamori: Grey tabby cat with paws on keyboard and mouse. The text reads 'code cat is on the job', lolcats-style (CODE CAT)
[personal profile] momijizukamori
So we've required files to be tidied using perltidy since 2019, buuuuut I am terrible at remembering to manually do it, so! I threw together a quick'n'dirty way of automatically running tidy on commit. It's based on this example from the Prettier docs (Prettier is like perltidy but for CSS/JS/HTML).

Here's the perl variant of the script:
#!/bin/sh
FILES=$(git diff --cached --name-only --diff-filter=ACMR | sed 's| |\\ |g')
[ -z "$FILES" ] && exit 0

# Prettify all selected files
echo "$FILES" | xargs tidyall

# Add back the modified/prettified files to staging
echo "$FILES" | xargs git add

exit 0


This also has the bonus that it only tidies files you added to your commit, so you don't end up with rogue changes in files you didn't touch because someone else forgot to tidy before a branch got merged!

EDIT: [personal profile] kareila rightfully pointed out that I didn't actually specify the install instructions (they're in the linked Prettier docs but, uh, that's not that intuitive). This script should go in the file .git/hooks/pre-commit, and that file needs to be executable, which you can do on the commandline by running chmod +x .git/hooks/pre-commit
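Putting that together, here's a self-contained demo of installing a hook in a scratch repository. The hook body here is just a stand-in echo rather than the real tidyall script, so you can see the mechanics without a Dreamwidth checkout:

```shell
# Make a throwaway repo to demonstrate hook installation
tmp=$(mktemp -d)
git init -q "$tmp/repo"

# Write a minimal pre-commit hook (stand-in for the tidyall script above)
cat > "$tmp/repo/.git/hooks/pre-commit" <<'EOF'
#!/bin/sh
echo "hook ran"
exit 0
EOF

# The hook must be executable or git will silently skip it
chmod +x "$tmp/repo/.git/hooks/pre-commit"
```

For the real thing, the hook body is the tidyall script from this post, and the repo is your $LJHOME checkout.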
mark: A photo of Mark kneeling on top of the Taal Volcano in the Philippines. It was a long hike. (Default)
[staff profile] mark

Hi all!

As of today, I've made a few fairly broad changes to the way our repositories are set up. You will likely need to make some changes to your development workflow. But! They should be simple.

First, the changes:

  1. The dw-free repository has been renamed to dreamwidth -- because,

  2. The dw-nonfree repository is no more: it has been merged into dreamwidth under the path ext/dw-nonfree/.

  3. Finally, the master branch has been renamed to main.

The first two changes mean we no longer have to track and coordinate changes across two repositories; changes can be made atomically in one place. Since we don't really support or test running Dreamwidth on its own (without the branding), this will also allow us to simplify development by ultimately collapsing the code, too.

To update your checkout of Dreamwidth, you will need to do two separate things. First, go to Github, open your existing dw-free repository, go to its settings, and:

  1. Rename the repository to dreamwidth

  2. Go to branches and rename the master branch to main

Once that's done, you need to update your local checkout like this:

 cd $LJHOME

 # WARNING: Only do this if you have no code here that you care about!
 rm -rf ext/dw-nonfree

 # Update the repository root (for your "origin" remote, **set your username here**)
 git remote set-url origin git@github.com:YOUR_GITHUB_USERNAME/dreamwidth.git

 # Update the repository root (for your "upstream" remote, please change as needed)
 git remote set-url upstream git@github.com:dreamwidth/dreamwidth.git

 # Fetch only to get the new main branch
 git fetch

 # Switch over to it
 git checkout main

 # Get rid of local master
 git branch -d master

 # Redirect HEAD
 git symbolic-ref refs/remotes/origin/HEAD refs/remotes/origin/main


That's it; you should be good to go now. I'll work on updating the Dreamhack scripts shortly, but wanted to get these instructions out. Please let me know if you have any questions/comments/issues.

mark: A photo of Mark kneeling on top of the Taal Volcano in the Philippines. It was a long hike. (Default)
[staff profile] mark

Hi all,

I've gone ahead and run perltidy on our codebase and added a test that will run and tell you if things are tidy or not. This is now required for all PRs and code we're merging.

To set up and run it locally, first make sure your modules are up to date:

bin/checkconfig.pl

This will probably ask you to install some modules; do so! Then you can run tidyall and it will automatically reformat all of your files as necessary:

perl -I$LJHOME/extlib/lib/perl5 $LJHOME/extlib/bin/tidyall -a

The first time you run this, it will take a while. It caches the results, though, so further runs will be much, much faster.

That should be it -- let me know if this works for you or if you have any issues!

karzilla: a green fist above the word SMASH! (Default)
[staff profile] karzilla
We're about to deploy a new backend interface for file storage, called BlobStore, which [staff profile] mark wrote over the past few months with the intention of standardizing how file storage is handled in our code and making it work with any number of possible underlying technologies. It currently supports MogileFS and local disk, and we plan to add support for S3 in the future.

At this point, MogileFS is considered legacy technology. If your site is set up to use MogileFS, that configuration will continue to work under BlobStore for now. However, no new code that requires MogileFS will be accepted.

What you need to know if you are writing code: the new methods are implemented in cgi-bin/DW/BlobStore.pm and are pretty straightforward. For the most part they serve as drop-in replacements for the MogileFS file methods.

What you need to know if you are running a server: if you try to do anything related to uploading images, including userpics, you will get a fatal error unless you have defined either %LJ::BLOBSTORE or %LJ::MOGILEFS_CONFIG. So if you were already using MogileFS, you're fine, but if not, you will need to set up local disk storage in one of your local config files. The stock etc/config-private.pl in dw-free will have an example %LJ::BLOBSTORE that you can uncomment and use.
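As a purely hypothetical sketch of what a local-disk config might look like (the keys and structure here are invented for illustration; the commented-out example in the stock etc/config-private.pl is the authoritative reference):

```perl
# HYPOTHETICAL shape only -- copy the real example from etc/config-private.pl.
%LJ::BLOBSTORE = (
    localdisk => {
        path => "$ENV{LJHOME}/var/blobstore",    # where files land on disk
    },
);
```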

What you need to know if your existing userpics disappear: If you were running a server without MogileFS, all of your system's userpics were stored in a database table, and use of that table is no longer supported. I'm working on a new version of the migrate-userpics.pl script that can be used to move the images into your BlobStore once you've got that configured. (Update: this is available in bin/upgrading/migrate-userpics.pl.)

Obviously this will all need to be documented on the wiki somewhere, but I've got my hands full right now making sure everything is nailed down to push these changes into production in a few days. Let me know if there's anything I didn't cover here that needs to be addressed.
kareila: Rosie the Riveter "We Can Do It!" with a DW swirl (dw)
[personal profile] kareila
I am excited to announce that I have just published to the wiki a new Template Toolkit / Foundation Conversion Guide! Covering such thrilling and educational topics as:

  • Which page components belong in a controller vs. a template?

  • What do I need to do to convert a page to Foundation?

  • How is a BML file like a plate of spaghetti?

AND MORE!

As usual, let me know here if you have any questions or comments!
denise: Image: Me, facing away from camera, on top of the Castel Sant'Angelo in Rome (Default)
[staff profile] denise
If your dev environment is publicly reachable on the internet, it will get spam! I've written a quick guide to spam prevention.
kareila: "PERL!" (perl)
[personal profile] kareila
I updated http://wiki.dreamwidth.net/wiki/index.php/Dev_Testing with information about setting up the new test config files and initializing the test database. If you give it a go and run into trouble, leave a comment here. ♥
denise: Image: Me, facing away from camera, on top of the Castel Sant'Angelo in Rome (Default)
[staff profile] denise
I created the Explanations wiki page after seeing [staff profile] mark waving the code machete tonight: he took out a lot of old and unused stuff, which is great, but a few (very out of date) old server-admin documentation files that were casualties had some explanations of things that I don't think are documented anywhere else.

That does not mean the entire server-admin documentation should survive -- out of date documentation is often worse than no documentation! -- but between that and the talk I gave at LCA, "When Your Codebase Is Nearly Old Enough To Vote", it occurred to me that we should probably document a few of those things that live in our heads that we find ourselves explaining to new contributors -- things like why an entry URL uses the anum and not the jitemid, why you can't just change a text string in a patch and have it update the text on the live site, why you occasionally hear oldtimers calling the Recent Entries page 'lastn', stuff like that.

So, if you've been around for a while and can think of a few things that wouldn't necessarily make sense right off the top of your head -- or if you're new and ran into one or more of those things at some point and had to have someone explain them to you -- please add them to that page! As I said in my talk: institutional memory is a great thing to have, but a lousy thing to have to rely on.
fu: Close-up of Fu, bringing a scoop of water to her mouth (Default)
[personal profile] fu

With [staff profile] denise's help (the bulk of this was from her really!), we've made major changes to the dev-facing wiki documentation for clarity.

Among other things:

  • merged multiple/redundant pages and sections

  • improved linking

  • (hopefully) reduced the complexity of paths through the wiki for someone just getting started

The biggest change is to Dev Getting Started, which is now greatly expanded, with a much clearer flow and more focus on someone totally new to DW/development. The resources for someone more experienced have been moved to Dev Quick Start.

The contents of Version Control have been merged with Newbie Guide: How To in Git and the latter is the canonical page for git info -- though now I'm tempted to go rename it to Version Control because it's shorter. Git How To? ;)

Git instructions on some pages have been updated to be much simpler, with a pointer to the appropriate section of the git commands page in case more detail is needed.

And the Directory Structure has been expanded to cover more subdirectories.

Beginner Dev Checklist needs some more effort to pull it apart: the plan is to integrate it into other pages as appropriate and then get rid of it (since it's not sufficiently different from Dev Getting Started to warrant its own page).

Would appreciate if you poked around through the various pages and let me know if there's anything still left unclear, or if you're aware of similar pages that can be merged into these existing ones!

kaberett: A sleeping koalasheep (Avatar: the Last Airbender), with the dreamwidth logo above. (dreamkoalasheep)
[personal profile] kaberett
I've synthesised discussions about and practical use of GHI into a single wiki page. Please shout if you want anything added or changed!

I've also gone through and replaced references to Bugzilla with references to GHI wherever (1) it makes sense and (2) I was immediately able to do so. There are about 20 pages still on my hit-list, not all of which I have the knowledge to deal with - if you feel like pitching in, please take one and let me know when it's done (or, if you don't have wiki access, let me know what the appropriate edits would be).

( List of pages outstanding )
rax: (Silver whaaaaaaaaaaaaaaaaaaaaaaaat)
[personal profile] rax
Hi!

I'm trying to crawl and parse comments on a community for a fandom event (http://hs-worldcup.dreamwidth.org , if you're curious). I've run into a bunch of issues and lack of API documentation, and talked to DW Support a couple of times, and feel like I am further away from successfully doing anything than when I started. Before I say anything else, here is What I Am Really Trying To Do:
  • take an individual community post (example: http://hs-worldcup.dreamwidth.org/3493.html#comments) 
  • download all of the comments with some sort of threading information --- the data I need in particular is comment subject, comment author, comment content, whether or not it's a reply and if so to what
  • parse out that data and do transformations to it and add it to a database (which is not super relevant to this question I don't think but I can go into more detail if necessary)
I looked into the API for getting comments which led me in a roundabout way to www.livejournal.com/developer/exporting.bml . I'm probably missing something obvious here, but I don't actually see how this tells me how to make an API call? It gives me a GET request, but not what to send the GET request to? Also, support told me the only action DW supports here is "Get all comments for a community," not "Get all comments for a page," and I should probably just crawl the pages. Is that what other folks have done when doing this?

If that is what I should do, how do I get around the adult content warning? Is there a flag I can pass with the URL or something? Do I need to do something more complicated than just using curl to grab the pages? Is there something I can pass to say "just give me one piece of HTML with all 5000 comments on it it will be easier for both of us probably?"

Thank you for any suggestions or advice you might have.

fu: Close-up of Fu, bringing a scoop of water to her mouth (Default)
[personal profile] fu

SCSS directory structure

All our SCSS files are in htdocs/scss/; compiled versions are automatically placed in htdocs/stc/css. We care about the former when editing, and the latter when including them in the webpage itself.

One of the things that SCSS is really good for is making it easy to organize CSS. I'd like us to move away from CSS for individual pages, and instead concentrate on having components that are used on multiple pages.

  • htdocs/scss/foundation - stylesheets from the Foundation library. The main files to touch here are foundation.scss, which specifies the components we'll be using on all pages, and _variables.scss, which sets default behavior / appearance variables that are common to all site skins.

    _variables.scss should not contain any colors, because we have to take into account dark-on-light vs light-on-dark skins. Sizes / measurements are perfectly acceptable here!

  • htdocs/scss/components - stylesheets for individual components. These are included only on the pages that need them, but contain all the styling that's required within. Everything here should have a corresponding example of the HTML structure over on /dev/style-guide

    This folder also contains Foundation components that aren't used on enough pages to warrant being included on all of them (htdocs/scss/foundation/foundation.scss), but that we still need on individual pages. htdocs/scss/components/progress-bars is an example.

    For components where you want access to variables, useful for things like line-height, include the following code at the top:

    $include-html-classes: false;
    @import "foundation/variables", "foundation/components/global";

    That won't work for colors though! Things that specify colors will have to go into the site skins.

    To use a component from a .tt file:

    [% dw.need_res( { group => "foundation" }, "stc/css/components/component-name.css" ); %]
    
  • htdocs/scss/mixins - mixins are fragments of code that can be used by multiple elements. Currently we have one mixin, which hides elements visually but keeps them visible to screenreaders. This replaces the... three or four different ways we were doing it before

    To use a mixin, from an SCSS file:

    @import "mixins/screenreader-friendly";
    .element-name {
        @extend %screenreader-friendly;
    }
  • htdocs/scss/pages - stylesheets for individual pages. Directory should mirror the URL structure of your pages as much as possible. But really we should try not to have anything here...

  • htdocs/scss/skins - stylesheets for site skins. Includes the actual skins and shared files, e.g., skiplinks, alert colors

JS directory structure

There's a lot more of the old JS hanging around in htdocs/js, but new files should follow the same basic structure as above:

  • htdocs/js/foundation - scripts from the Foundation library

  • htdocs/js/jquery - scripts from the jQuery / jQuery UI library

  • htdocs/js/components - individual components, suitable for inclusion on multiple pages

  • htdocs/js/pages - onload for individual pages; no examples yet. Preferably just the bare minimum of

    $(document).ready( function() {
        $(".element").someComponent();
    } );

fu: Close-up of Fu, bringing a scoop of water to her mouth (Default)
[personal profile] fu

So I'm here to present to you all our new style guide. That's on my public dev server, not yet on dreamwidth.org, so don't worry if it's a bit slow.

That page contains the components that we'll be using across the site. The goal is to have every component documented there and to have as little per-page styling as possible. That does two things: it makes appearance / interactions consistent across the site (good for users), and it makes it easy to refer to design decisions when writing pages (good for developers, especially those of us who aren't really frontend people).

We've used the Foundation framework as our basis for our redesign. That gives us a clean, responsive design that is really easy to make work on large and small screens -- we've had a huge problem with our site headers and certain pages on phones / tablets; it's about time to fix that.

I recommend going over the Foundation documentation, by the way -- it's excellent, and a good way to get started.

All our Foundation work uses SCSS. SCSS is a CSS preprocessor and it works like CSS but with additional features. You can have variables, if statements, mixins (fragments of code that are used by more than one thing).
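As a toy illustration of those features (all the names here are invented for illustration, not from our codebase):

```scss
// All names here are made up.
$accent-color: #369;                  // a variable

@mixin emphasized($size: 1.1em) {     // a mixin with a default argument
    font-size: $size;
    font-weight: bold;
}

.example-note {
    color: $accent-color;
    @include emphasized;
}

// an @if statement, evaluated at compile time
@if lightness($accent-color) < 50% {
    .example-note-on-dark { color: lighten($accent-color, 30%); }
}
```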

It requires an additional compilation step to get all this goodness, but that's possible with one command. Try the following:

  • pull in the latest changes on develop

  • Go to /dev/style-guide on your 'hack. It'll be completely unstyled

  • Run this command:

    compass compile
    

    A couple lines will scroll by, looking like this:

    ...
    create htdocs/stc/css/foundation.css
    create htdocs/stc/css/normalize.css
    ...
    create htdocs/stc/css/skins/celerity.css
    create htdocs/stc/css/skins/gradation/gradation-horizontal.css
    create htdocs/stc/css/skins/gradation/gradation-vertical.css
    create htdocs/stc/css/skins/lynx.css
    

    That's good: that means your SCSS files have been turned into CSS, which you can now use.

The SCSS files themselves are in htdocs/scss. These are the files you'll be touching. After they've been compiled, the generated files are placed in htdocs/stc/css. These are the files that you'll be including on the page.

So if you have a file:

    htdocs/scss/components/foo.scss

You include it by doing this:

    [% dw.need_res( "stc/css/components/foo.css" ) %]

One last thing: the compass compile command is good, but when you're developing, the last thing you want is to have to constantly switch from the file you're editing to your terminal window to run a command. Luckily, someone's thought of that. What you can do instead is this:

     compass watch

That watches for any changes in the SCSS files, and compiles them into .css files automatically. Leave that running in a separate window and it'll just do its thing. Just make sure you stop it after you're done developing for a session (just like you do with your Apache servers), because it does use up some resources!

Note: it's magical but not completely so. If you're working on something in dw-nonfree, you'll have to run compass watch separately in that directory:

    cd $LJHOME/ext/dw-nonfree
    compass watch

But if you're not touching anything in dw-nonfree, then you only need to run it the once.

Please jump in, poke around! Play with things. I'm happy to answer any questions :)

I plan on making several entries on the ongoing conversion. Soon to come: error handling for forms, directory structure for CSS / JS, easier way to format messages for email, etc.

kareila: IT prepares you for a life of fighting with PCs nonstop. (sysadmin)
[personal profile] kareila
http://wiki.dwscoalition.org/wiki/index.php/MogileFS_setup

All of the information that was previously there assumed you had root on a Debian box. I've updated the page to document the steps needed to get it working for dreamhacks as well. (Until the next time things break, anyway.)

YOU'RE WELCOME.

Ahem.

That is to say, if you try using my instructions and run into problems, let me know!
fu: Close-up of Fu, bringing a scoop of water to her mouth (Default)
[personal profile] fu
I started with a desire to document how to submit changes to a release branch, and somehow ended up trying to figure out how we could organize the wiki to make it easy for us to figure out what we have and what we need.

Please read and comment on my entry in dw-wiki. I'm locking comments here so we can centralize all discussion in one place.
foxfirefey: A picture of GIR. (gir)
[personal profile] foxfirefey
So, as far as I can tell from my research over the past few days, the only real way to get a pull request locally so you can review it is to:

* Manually add the submitter's dw-* repo as a remote in the repo on your hack (or Github I suppose, but there is no GUI advantage here).
* Manually pull the branch in question

Is there a better way to do this? Am I missing something? This seems really, well, annoying, with manually crafting URLs and whatnot, and not very user friendly. (I'm trying to make some documents on reviewing pull requests for people who are not [personal profile] fu, since we need to try and spread out that work a little.)
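In concrete terms, the manual flow above looks something like this (the username, repo, and branch names are hypothetical):

```shell
# Add the submitter's fork as a remote (hypothetical user "somedev")
git remote add somedev git@github.com:somedev/dw-free.git

# Fetch their branches and check out the one from the pull request
git fetch somedev
git checkout -b review-bug123 somedev/bug123

# When done reviewing, clean up
git checkout develop
git branch -D review-bug123
git remote rm somedev
```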
foxfirefey: Dreamwidth: social content with dimension. (dreamwidth)
[personal profile] foxfirefey

As of several weeks ago, all commits to the Dreamwidth codebases have gone to our new repositories on Github.

There are a few relevant wiki documents that have been fully revised to account for this change:

  • Moving your Dreamwidth installation to use Github -- these instructions will tell you how to move your current Dreamhack/dev environment over to a Github based installation
  • Dev Maintenance -- this document tells you how to keep your Github Dreamwidth based installation (and your Dreamwidth forks) updated with the code from the Dreamwidth repositories
  • Draft: Github development process -- This is the document in the least refined state, so keep in mind that it is in greater need of suggestions and revisions than the others. It goes through the very basics of Git workflow. This and Dev Maintenance might eventually end up merged into one document.

We are working on getting the rest of the wiki development documentation updated (see the dw_wiki post). Feel free to comment to this post with your questions/concerns about the move!

foxfirefey: Dreamwidth: social content with dimension. (dreamwidth)
[personal profile] foxfirefey
As part of our move to Github we have a lot of development documentation to update.

If you want to be part of the vanguard, take a gander at the Moving your Dreamwidth installation to use Github instructions. People following them are encouraged to log into the #dreamwidth-dev channel on IRC for real time assistance.
denise: Image: Me, facing away from camera, on top of the Castel Sant'Angelo in Rome (Default)
[staff profile] denise
[personal profile] crschmidt pointed out in his bug walkthrough that the process of running the automated test suite is not very well documented! And I agree; I know I don't run the tests when I'm patching things since I don't really know much about it, and I don't contribute tests along with my code because I actually don't know how to write them or what they should do. I am probably not the only one! (The only thing I know about the test suite is that you shouldn't run it in production because it does weird and wonky things to the database.)

I do know that our test suite coverage is not extensive at all, and we've talked before about moving to a more test-driven development mindset. If anybody wanted to write up some documentation on the wiki about both how to run the tests (and what you should look for) and how to write new tests (and when you should), that would be awesome and I would love you forever. (Well, I mean, I love you all forever already. But I'd love you more forever.)
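For anyone poking at this in the meantime, the conventional way of running a Perl test suite presumably applies; this is a hedged sketch (the t/ location and flags are the usual Perl conventions, not something I've confirmed for our setup, and the single-test filename is made up):

```shell
cd $LJHOME

# Run the whole suite recursively; -l adds lib/ to the include path
prove -lr t/

# Or run one test file verbosely while working on it (hypothetical name)
prove -lv t/example.t
```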

Profile

dw_dev: Dreamwidth Open Source Development