Planet CDOT

November 27, 2014

Gary Deng

Build Firefox Browser

I really wanted to build Chrome, but my laptop doesn’t have enough RAM: the build ran for 24 hours and then failed. As a result, I decided to go for Firefox. Surprisingly, the Mozilla build instructions are much easier to follow. I have an Ubuntu 12.04 OS running on VMware, so I first tried to build Firefox on my virtual machine; unfortunately, that attempt failed with a couple of errors and 75 warning messages. Finally, I tried it on my Windows 8.1 OS, and it only took me an hour to build.

Step one: Install build prerequisites (Visual Studio, mozilla-build)
Step two: Run start-shell-msvc2013.bat on my Windows Command Prompt

Even if you are on 64-bit Windows, do not use the start-shell-msvcNNNN-x64.bat files (unless you know what you’re doing). Those files are experimental and unsupported.

Step three: Get the source code using the command

hg clone

Step four: build it and run it


If you want to save time and avoid unnecessary mistakes before diving into a big project, read the documentation carefully.

by garybbb at November 27, 2014 04:20 AM

Hosung Hwang


Issue : Android Webview Shell example crashes on Android 4.0.4

Today, I built the source code as a debug build.

1. Debug build

Release build : hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ time ninja -C out/Release android_webview_apk

Debug build : hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ time ninja -C out/Debug android_webview_apk

This seemed weird, because out/Release and out/Debug look like configuration folders. But the documentation clearly says: “$ ninja -C out/Debug chrome. For a release build, replace out/Debug with out/Release”.

Debug build and run:

hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ time ninja -C out/Debug android_webview_apk
ninja: Entering directory `out/Debug’
[4482/11953] ACTION Compiling media_java java sources
real 137m52.642s
user 517m22.844s
sys 24m53.369s
hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ build/android/ --apk AndroidWebView.apk --apk_package --debug
hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ build/android/adb_run_android_webview_shell
Starting: Intent { act=android.intent.action.VIEW dat= }

It still crashed.

2. Try debug

hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ build/android/adb_gdb_android_webview_shell --start
Starting: Intent { }
Extracting system libraries into: /tmp/hosung-adb-gdb-libs
Extracting system libraries into: /tmp/hosung-adb-gdb-libs
Pulling from device: /system/bin/linker
Pulling from device: /system/lib/egl/
Pulling from device: /system/lib/
Pulling from device: /system/lib/
Pulling from device: /system/lib/[…]Pulling device build.prop
130 KB/s (11056 bytes in 0.082s)
/system/bin/app_process.real: No such file or directory
GNU gdb (GDB) 7.6
Copyright (C) 2013 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <;
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law. Type “show copying”
and “show warranty” for details.
This GDB was configured as "--host=x86_64-linux-gnu --target=arm-linux-android".
For bug reporting instructions, please see:
Attaching and reading symbols, this may take a while..
warning: Unable to find dynamic linker breakpoint function.
GDB will be unable to debug shared library initializers
and track explicitly loaded dynamic code.
0x400527bc in __ioctl () from /tmp/hosung-adb-gdb-libs/system/lib/

(gdb) bt
#0 0x400527bc in __ioctl () from /tmp/hosung-adb-gdb-libs/system/lib/
#1 0x4006df40 in ioctl () from /tmp/hosung-adb-gdb-libs/system/lib/
#2 0x4013face in android::IPCThreadState::talkWithDriver(bool) () from /tmp/hosung-adb-gdb-libs/system/lib/
#3 0x4013fe80 in android::IPCThreadState::waitForResponse(android::Parcel*, int*) ()
from /tmp/hosung-adb-gdb-libs/system/lib/
#4 0x401404e0 in android::IPCThreadState::transact(int, unsigned int, android::Parcel const&, android::Parcel*, unsigned int) ()
from /tmp/hosung-adb-gdb-libs/system/lib/
#5 0x4013d4d6 in android::BpBinder::transact(unsigned int, android::Parcel const&, android::Parcel*, unsigned int) ()
from /tmp/hosung-adb-gdb-libs/system/lib/
#6 0x401b0f32 in ?? () from /tmp/hosung-adb-gdb-libs/system/lib/
#7 0x40854df4 in dvmPlatformInvoke () from /tmp/hosung-adb-gdb-libs/system/lib/
#8 0x4088f2be in dvmCallJNIMethod(unsigned int const*, JValue*, Method const*, Thread*) ()
from /tmp/hosung-adb-gdb-libs/system/lib/
#9 0x40866c50 in dvmJitToInterpNoChain () from /tmp/hosung-adb-gdb-libs/system/lib/
#10 0x40866c50 in dvmJitToInterpNoChain () from /tmp/hosung-adb-gdb-libs/system/lib/
Backtrace stopped: previous frame identical to this frame (corrupt stack?)

After the error dialog popped up, gdb commands were possible, but the call stack at that moment did not seem meaningful.

3. Try in the emulator

I started an Android 4.4.2 emulator and tried to launch it there, because with Eclipse I can usually see more meaningful messages.

In the Android 4.4.2 emulator, the crash happened the same way. This is more serious: it means the fact that it worked well on my Galaxy S4 running Android 4.4.2 proves nothing.

And the log says :

I/DEBUG(54): b6f39f6c 28006800 e02cd1e6 46294630 f00d4622
I/DEBUG(54): b6f39f7c 1c43e8f8 d11e4607 f9c4f001 29046801
W/ActivityManager(388): Process has crashed too many times: killing!
W/ActivityManager(388): Force finishing activity
D/Zygote(57): Process 1160 terminated by signal (6)

AwShellActivity is the test activity that uses this WebView, so the crash could be the test program’s problem. The path is ‘/src/android_webview/test/shell/src/org/chromium/android_webview/shell/’

4. Chrome Shell
I built and ran the Chrome shell on the Android 4.0.4 device.

hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ ninja -C out/Release chrome_shell_apk
ninja: Entering directory `out/Release’
[1383/2469] ACTION Compiling chrome_java java sources
../chrome/android/java/src/org/chromium/chrome/browser/ warning: [deprecation] onError(String) in UtteranceProgressListener has been deprecated
public void onError(final String utteranceId) {
../chrome/android/java/src/org/chromium/chrome/browser/ warning: [deprecation] speak(String,int,HashMap<String,String>) in TextToSpeech has been deprecated
int result = mTextToSpeech.speak(text, TextToSpeech.QUEUE_FLUSH, params);
2 warnings
[2469/2469] STAMP obj/chrome/chrome_shell_apk.actions_rules_copies.stamp
hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ build/android/ --apk ChromeShell.apk --release

hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ build/android/adb_run_chrome_shell
Starting: Intent { act=android.intent.action.VIEW dat= }

Chrome Shell worked very well.

5. Content Shell
And I built and ran the Content shell on the Android 4.0.4 device.

hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ ninja -C out/Release content_shell_apk

ninja: Entering directory `out/Release’
[991/991] STAMP obj/content/content_shell_apk.actions_rules_copies.stamp
hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ build/android/ --apk ContentShell.apk --release
hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ build/android/adb_run_content_shell
Starting: Intent { act=android.intent.action.VIEW dat= cmp=org.chromium.content_shell_apk/.ContentShellActivity }

Content Shell worked very well.

6. Conclusion 

Unfortunately, only the WebView shell has problems. However, this is not a problem in the rendering engine itself; it is either in the test program or in the code for WebView functionality. I hope it is the former.

The next step would be making a test app using the Chromium WebView. If it has no problem, the problem is in the test program. If it has the same problem, I need to look at the WebView code, which will be tough.

Or maybe I can port the WebView shell to Eclipse for more comfortable debugging.

by Hosung at November 27, 2014 01:00 AM

November 26, 2014

Linpei Fan

Building Open Source Browser – Firefox on Windows

I followed the instructions to build Firefox on my laptop, an x64 system running Windows 8.1.

First of all, check the prerequisites. 
Check that all of the Windows build prerequisites are installed on the computer. At this point, I had to install the MozillaBuild package. I downloaded and installed the latest version of MozillaBuild in the folder c:\mozilla-build.

Second, get the source code.
Mozilla uses a Mercurial repository to hold the source code.
To get the source code, I had to have Mercurial installed on my laptop. I downloaded and installed TortoiseHg, which is a Windows shell extension and a set of applications for the Mercurial distributed revision control system, similar to TortoiseGit and TortoiseSVN. It saved me from getting lost in the Windows install instructions on the Mercurial website.

Next, start the building environment.
In the folder c:\mozilla-build, run start-shell-msvc2013.bat to set up the build environment (I have Visual Studio 2013 installed on my laptop). This opens a Linux-like shell in a command window.
Running start-shell-msvc2013-x64.bat instead generated an error during the build (4:21.90 configure: error: You are targeting i386 but using the 64-bit compiler).

Following, build the project.
cd into the mozilla-central subdirectory in the shell above, and run the command ./mach build. Building takes some time, about 3 to 4 hours on my machine, and shows a success message if the build finishes and succeeds.

Finally, run the project

Run the command ./mach run in the same folder as above, and the Firefox Nightly browser starts.

by Lily Fan at November 26, 2014 09:44 PM

Ryan Dang

Release 0.4

I worked on mobile appmaker again for my release 0.4. In this release, I picked issue UI – Implement Updated Colorpicker

My job was to update the UI for the current color picker: display a check mark on the selected color, and only allow one color block to be selected at a time. To display the check mark on the selected color, I used a Font Awesome check-mark icon nested inside the color block. In the Font Awesome tag, I check whether the block is selected and display the check mark accordingly. To allow only one color to be selected at a time, I added one extra variable tracking whether the main color is selected, and one extra boolean parameter to the onSelect() function to distinguish clicks on the main color from clicks on a subcolor.

The hardest part of this release was vertically aligning the check mark inside the color block; it took me at least three hours to figure out.

The pull request for this release can be found here: [#320] Update UI for Colorpicker

by byebyebyezzz at November 26, 2014 08:44 PM

Glaser Lo

[Late] Working on appmaker (Release 0.3)

For Release 0.3, in order to have a different experience, I chose a bigger project to work on: Appmaker.

Pull request:

* previous pull request for 0.2 (since I forgot to put the link into the 0.2 blog post)

The first problem I encountered was an error about “loginHost for webmaker-auth”:

Warning: PUBLISH_HOST is unset. See for more info.
Warning: ASSET_HOST is unset. See for more info.

 throw new Error('(webmaker-auth): secretKey was not passed into webmaker-a
Error: (webmaker-auth): secretKey was not passed into webmaker-auth
 at new module.exports (/home/rickdom/seneca/osd600/appmaker/node_modules/webmaker-auth/index.js:14:11)
 at Object.<anonymous> (/home/rickdom/seneca/osd600/appmaker/app.js:81:20)
 at Module._compile (module.js:456:26)
 at Object.Module._extensions..js (module.js:474:10)
 at Module.load (module.js:356:32)
 at Function.Module._load (module.js:312:12)
 at Function.Module.runMain (module.js:497:10)
 at startup (node.js:119:16)
 at node.js:906:3

However, it took me a whole week to figure out how to fix this problem… In the end, the reason I got the error was that I forgot to do one step:

cp sample.env .env

Really, it is very important to follow every single step in the instructions, especially for a big project.

In this release, I also found a much easier way to find the code behind elements on the page. When I wanted to find the code that handles the close button on a tab, I could simply search for its title/alt text, “Delete this card”, to locate the file that contains the code.

by gklo at November 26, 2014 12:43 AM

Hosung Hwang

CordovaStabilizer – Chromium Android WebView Issue I

Yesterday, the release build of the Chromium Android WebView shell example didn’t run successfully on Android 4.0.4 (Ice Cream Sandwich). Today, I uninstalled it and installed it again; the result was the same crash. So I analysed the install scripts.

1. src/build/android/

command :
--apk AndroidWebView.apk

AndroidWebView.apk file is here :

This file is a Python install script. It uses fairly big Python libraries in the src/build/android/pylib folder, and it seems unrelated to this Android version issue.

2. adb_run_android_webview_shell

command :

content of this file :
adb shell am start \
-a android.intent.action.VIEW \
-n \
${optional_url:+-d “$optional_url”}

This bash shell script generates a command something like this:
adb shell am start \
-a android.intent.action.VIEW \
-n \

Nothing special.

3. Possible measures :

1) Debug it to see where the crash occurs.
command : $ build/android/adb_gdb_android_webview_shell
To do this, I need to build again with the debug option, and then debug using gdb. OMG.

2) Build release version again
Yesterday, while building, I interrupted the build twice because I had to do other work. However, the build works on 4.4.2, so the chance that this is the cause is low.

3) Look into build script
Build system is ninja :

depot_tools/ninja is a bash shell script that decides what the running OS is; depending on the OS type, it launches ninja-linux32, ninja-linux64, or ninja-mac, which are the actual executables.
more information :

Option 1), debugging, seems to be a faster way than the others.
more information about debugging :

by Hosung at November 26, 2014 12:05 AM

November 25, 2014

Hosung Hwang

CordovaStabilizer – Build Chromium Android WebView

Last week, I built and tested Linux desktop version of Chromium.
It is time for android version chromium.

Basically, the Android WebView is a system framework component used by every WebView-based app. Since Android 4.4 (KitKat), the Android WebView has been implemented on top of Chromium. Therefore, if we could use a newly built WebView in Cordova, it would match the default WebView implementation. Also, even on older versions of Android, rendering should be the same.

For the Chromium Android build, steps 1) download depot_tools and 2) set the path from my last posting (CordovaStabilizer – Build Chromium 1) need to be done first.

1. clone source code
$fetch android

hosung@hosung-Spectre:~/cdot$ time fetch android
Running: gclient config --spec 'solutions = [{u'"'"'managed'"'"': False, u'"'"'name'"'"': u'"'"'src'"'"', u'"'"'url'"'"': u'"'"''"'"', u'"'"'custom_deps'"'"': {}, u'"'"'deps_file'"'"': u'"'"'.DEPS.git'"'"', u'"'"'safesync_url'"'"': u'"'"''"'"'}]
target_os = [u'"'"'android'"'"']'
Running: gclient sync
[0:01:00] Still working on:
[0:01:00] src
[0:18:51] Still working on:
[0:18:51] src/third_party/WebKit
[0:18:51] src/third_party/android_tools
[…]
[0:36:57] Still working on:
[0:36:57] src/third_party/WebKit
Syncing projects: 100% (94/94), done.
________ running '/usr/bin/python src/build/' in '/home/hosung/cdot'
________ running '/usr/bin/python src/build/' in '/home/hosung/cdot'
INFO: --Syncing nacl_x86_glibc to revision 13880--
[…]
Running: git submodule foreach 'git config -f $toplevel/.git/config submodule.$name.ignore all'
Running: git config --add remote.origin.fetch '+refs/tags/*:refs/tags/*'
Running: git config diff.ignoreSubmodules all
real 41m44.678s
user 64m57.987s
sys 6m3.203s

At first I used the same instructions as for desktop Chromium.
I expected a chrome.apk; however, when the build finished after more than 3 hours, there was a desktop binary, which I could run on my desktop, same as last week.
This means the source code is the same in both the ‘fetch chromium’ and ‘fetch android’ cases; only a little configuration differs.
Anyway, for the Android build, the instructions are different.
The good thing is that I can now erase last week’s source code and object files, 25 GB; the SSD in my laptop is only 128 GB.

2. Configure Target

Instruction :

The instructions say that if I want to build from source fetched by ‘fetch chromium’, this step is needed:
$ echo "target_os = ['android']" >> .gclient && gclient sync
This changes the target and pulls the dependencies for Android.

3. Configure GYP

“GYP is the meta-makefile system used in chromium to generate build files for the various platforms”
hosung@hosung-Spectre:~/cdot/ChromiumAndroid$ echo "{ 'GYP_DEFINES': 'OS=android', }" > chromium.gyp_env

4. Update Project

hosung@hosung-Spectre:~/cdot/ChromiumAndroid$ gclient runhooks
This updates the project based on the configuration.

5. Install Java JDK

$sudo apt-get install openjdk-7-jdk

and make sure OpenJDK is selected as the default by typing:

sudo update-alternatives --config javac
sudo update-alternatives --config java
sudo update-alternatives --config javaws
sudo update-alternatives --config javap
sudo update-alternatives --config jar
sudo update-alternatives --config jarsigner

For example, like this :

hosung@hosung-Spectre:~/cdot/ChromiumAndroid$ sudo update-alternatives --config javaws
There are 2 choices for the alternative javaws (providing /usr/bin/javaws).

Selection Path Priority Status
0 /usr/lib/jvm/java-7-openjdk-amd64/jre/bin/javaws 1071 auto mode
1 /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/javaws 1061 manual mode
* 2 /usr/lib/jvm/java-7-openjdk-amd64/jre/bin/javaws 1071 manual mode

Press enter to keep the current choice[*], or type selection number: 2

*add : This command is to upload apk file to device.

6. Install Build Dependencies

hosung@hosung-Spectre:~/cdot/ChromiumAndroid$ src/build/
Running as non-root user.
You might have to enter your password one or more times for ‘sudo’.
Skipping debugging symbols.
Skipping ARM cross toolchain.
Skipping NaCl, NaCl toolchain, NaCl ports dependencies.
Ign stable InRelease
Hit stable Release.gpg
Hit stable Release
Ign trusty-security InRelease[…] complete.

7. Plug in Android Device

I do not understand why the instructions mention checking the device connection at this point, before building.
Anyway, I plugged in the device and did this:

hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src/third_party/android_tools/sdk/platform-tools$ adb wait-for-devices
* daemon not running. starting it now on port 5037 *
* daemon started successfully *

If it returns immediately, the device is successfully connected.
*add : Connecting the device can also be done after building.

8. Build

There are three options of build.

1) Content Shell : Core for rendering
2) Chrome Shell : Core + Chrome Features(Translate, Autofill etc..)
3) WebView Shell : Core + WebView Features

I am not sure whether Content Shell or WebView Shell fits this project better. If Cordova simply uses a WebView activity, we could swap in our WebView with exactly the same interfaces. I hope so.

The documentation says a drawback of the WebView shell is that it runs in software rendering mode only. In my experience, web pages in a WebView were usually slower than in the default browser; this might be why.

This time, I went with the WebView shell, in release mode.

hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ time ninja -C out/Release android_webview_apk
ninja: Entering directory `out/Release'
[…]
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] speak(String,int,HashMap<String,String>) in TextToSpeech has been deprecated
return mTextToSpeech.speak(text, queueMode, params);
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
../content/public/android/java/src/org/chromium/content/browser/accessibility/ warning: [deprecation] addAction(int) in AccessibilityNodeInfo has been deprecated
19 warnings
[3680/4655] ACTION Linting android_webview_java
../../../../../tmp/tmpbPBhw8/0/ Use "Gravity.END" instead of "Gravity.RIGHT" to ensure correct behavior in right-to-left locales: RtlHardcoded [warning]
((FrameLayout.LayoutParams) params).gravity = Gravity.RIGHT;
Lint found 1 new issues.
- For full explanation refer to out/Release/gen/android_webview_java/lint_result.xml
- Wanna suppress these issues?
1. Read comment in build/android/lint/suppressions.xml
2. Run "python build/android/lint/ out/Release/gen/android_webview_java/lint_result.xml"

Lots of deprecated APIs.
And then,

hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ build/android/ --apk AndroidWebView.apk --apk_package --release

*add : --apk is a double dash; WordPress changes it into one long dash, so don’t copy the command blindly.


hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ ./build/android/adb_run_android_webview_shell
Starting: Intent { act=android.intent.action.VIEW dat= }

On the device, the “Android WebView Test Shell” activity appears, and immediately a dialog shows “Unfortunately, AwShellApplication has stopped.”

I plugged in my Galaxy S4 and tried to run it.

hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ ./build/android/adb_run_android_webview_shell
Starting: Intent { act=android.intent.action.VIEW dat= }
Error type 3
Error: Activity class {} does not exist.

The run script seems to just launch an activity that is already on the device.
So I ran this command, which deploys the apk to the device:

hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ build/android/ --apk AndroidWebView.apk --apk_package --release

And then,

hosung@hosung-Spectre:~/cdot/ChromiumAndroid/src$ ./build/android/adb_run_android_webview_shell
Starting: Intent { act=android.intent.action.VIEW dat= }



At least it runs on Android 4.4.2.

I have no idea why it crashes on Android 4.0.4, or on this particular XT885 model.

*add : the many warnings about deprecated APIs during compilation may be part of the problem.

by Hosung at November 25, 2014 12:09 AM

Hunter Jansen

Bowtie Part 2

Bowtie Part 2

Written by Hunter Jansen on November 25, 2014

This post continues on in my exploration of porting Bowtie from last post, so if you haven’t read that yet, you should probably start there.

With that little preface out of the way, let me say that I’ve actually made a bunch of progress - I’ll try to be succinct with what’s happened in the last few days since the previous post, but to be honest - it’s kinda a lot. If I had the time to sit down and post an intermediate entry, that would have been great - but here we are with me trying to remember all the steps I’ve taken to get to where I am today.


After realizing that I actually get to work on Bowtie I figured I should check in on the git repository to see if the problem’s either already been addressed, or if like some other projects I’ve looked at, the source code’s changed drastically. I found the repo here. Aside from there being a minor version update, it didn’t look like anything was all that different. So, I forked and cloned the project and began to investigate to see if I could begin to nail down the problem and maybe even start to think of a solution.

Makefile stuff

I said in the last post that the -m64 flag was causing an issue in the makefile, which was the very first issue I needed to resolve. However, when I pulled the source in from github, a new issue arose; I received an error that told me that bowtie can only install on 64 bit systems.

‘Well that’s odd’, I thought to myself, ‘I’m ON a 64 bit system. Something must be funky.’ After a bit of digging around I found the following:

ifeq (1,$(LINUX))
    ifeq (x86_64, $(shell uname -p))

ifeq (32,$(BITS))
    $(error bowtie2 compilation requires a 64-bit platform )

Huh, so unless I’m on an x86_64 machine (or on Windows or Mac; this is just the check for Linux), it thinks I’m on a 32 bit machine and thus won’t build. Neat. So to resolve this, I added in another ifeq statement checking if the uname is aarch64, and if so setting BITS=64.

ifeq (1,$(LINUX))
    ifeq (x86_64, $(shell uname -p))
        BITS = 64
    endif
    ifeq (aarch64, $(shell uname -p))
        BITS = 64
    endif

I tried looking for an ‘or’ style way to deal with ifeq, but I couldn’t find one. It’s possible that there’s a more elegant solution for the multiple conditions, but for now this is good enough. So NOW when making with this adjustment, I run into the nondescript error from before. I tracked that error down to this part a bit further down:

DEBUG_FLAGS = -O0 -g3 -m64

This is setting the various gcc flags for optimization and the -m64 flag that I mentioned before. So to get around this on aarch64 I did this:

ifeq (aarch64, $(shell uname -p))
    DEBUG_FLAGS = -O0 -g3
    RELEASE_FLAGS = -O3
else
    DEBUG_FLAGS = -O0 -g3 -m64
    RELEASE_FLAGS = -O3 -m64
endif

A simple solution that should solve the m64 flag problem. After I finished that off, that looks to be the end of my makefile adjustments (for now at least).
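As an aside on the ‘or’-style ifeq I couldn’t find: GNU make has no ifeq-or, but its filter function can test membership in a word list, which would collapse the two architecture checks into one. This is just a sketch of that alternative, not the patch I actually made:

```
ifeq (1,$(LINUX))
    # true if `uname -p` matches any word in the list
    ifneq (,$(filter x86_64 aarch64,$(shell uname -p)))
        BITS = 64
    endif
endif
```

The same trick would extend to any future architectures by adding words to the filter list.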


Now that that’s solved, the build progresses further - progress! But at this point I’m receiving errors about exceptions in assembler code in a /third-party/cpuid.h file.

Checking inside that file, I discover a bunch of assembler for x86 dealing with cpuid and returning either a 1 or a 0 depending on some stuff. So, I start investigating… It turns out that this file is actually an older version of the cpuid.h file included in gcc, retained for no reason that anyone can think of; the community has asked for clarification on this choice a couple of times, but upstream seems to have been unresponsive. So this file was the source of my next investigation.

I began by removing the reference to that file in the compile command from last time and trying that on an x86 system and doing a couple checks. Everything seemed gravy from my quick tests, so I moved on to trying it on the arm machine, thinking that maybe gcc’s more updated version would have already addressed this issue.

Nope. A new slew of errors saying that cpuid.h is not defined, and everything breaks.

So after a lot of time searching for a simple, already implemented solution I came up with absolutely nothing regarding cpuid.h for gcc on arm. I reached out to my prof, and discovered that cpuid is an x86 only call. Neat! After further investigation, I found that the CPUid for aarch is kept in a register, but then I began to question what it was actually used for.

Digging into the file in question I eventually found that all the assembler in the file boils down to use in this one function:

static __inline int
__get_cpuid (unsigned int __level, unsigned int *__eax, unsigned int *__ebx, unsigned int *__ecx, unsigned int *__edx)
{
  unsigned int __ext = __level & 0x80000000;

  if (__get_cpuid_max (__ext, 0) < __level)
    return 0;

  __cpuid (__level, *__eax, *__ebx, *__ecx, *__edx);
  return 1;
}

So it returns either 1 or 0, I reckon it’s a flag of some sort, but I haven’t yet the chance to dig into what it’s actually used for. However, I figured that since cpuid isn’t so much used in aarch the same way that it’s needed for x86 (there’s not the backwards compatibility issues that x86 need to take into account) that to try to get everything working and progressing, I’d just always have this function return 1.

This involved commenting out all the assembler code in the rest of the file and changing the file to:

static __inline int
__get_cpuid (unsigned int __level, unsigned int *__eax, unsigned int *__ebx, unsigned int *__ecx, unsigned int *__edx)
{
  unsigned int __ext = __level & 0x80000000;

 // if (__get_cpuid_max (__ext, 0) < __level)
  //  return 0;

 // __cpuid (__level, *__eax, *__ebx, *__ecx, *__edx);
  return 1;
}

Now obviously, should this be part of a working solution, I’ll toss in some ifdef statements to properly handle this, but for now it’s just to see if this can contribute to making everything work on arm.


The above steps got me even FURTHER in the compile steps! But now I’m hit with a new error:

/tmp/ccqEjsKY.s: Assembler messages:
/tmp/ccqEjsKY.s:6723: Error: unknown mnemonic `popcntq' -- `popcntq x3,x10'
/tmp/ccqEjsKY.s:6753: Error: unknown mnemonic `popcntq' -- `popcntq x3,x8'
/tmp/ccqEjsKY.s:18031: Error: unknown mnemonic `popcntq' -- `popcntq x3,x4'
/tmp/ccqEjsKY.s:18074: Error: unknown mnemonic `popcntq' -- `popcntq x3,x9'

Aww jeez, what the heck is this mess o.O? After some more poking around, I find that the only reference to popcntq in all of the source is in the ebwt.h file. So let’s check it out.

In here, I find the reference to popcntq:

        inline static int pop64(uint64_t x) {
            int64_t count;
            asm ("popcntq %[x],%[count]\n": [count] "=&r" (count): [x] "r" (x));
            return count;
        }

Oki doke - after some research I find that popcntq is x86_64 syntax, and the plain form is just popcnt. So I tried changing that, but it still didn’t work appropriately. Investigating further, I discovered that there’s a builtin gcc version of popcount. I’m a fan of compiler builtins over handwritten assembler, because it PROBABLY means the job has been done better and more universally than the handwritten version. So I tried replacing the assembler with the builtin:

        inline static int pop64(uint64_t x) {
            int64_t count;
            count = __builtin_popcount(x);
            //asm ("popcnt %[x],%[count]\n": [count] "=&r" (count): [x] "r" (x));
            return count;
        }

I need to double check that this does essentially the same thing as the inline assembler, but for now we’ll leave it at that. Let’s try our make again.
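That double check is worth doing: __builtin_popcount() takes an unsigned int, so a uint64_t argument is silently truncated to its low 32 bits, while __builtin_popcountll() takes unsigned long long and is the drop-in replacement for the 64-bit popcntq assembler. A minimal sketch of the difference (pop64_truncating/pop64 are illustrative names):

```c
#include <stdint.h>

/* Demonstrates the 32-bit truncation pitfall with __builtin_popcount. */
static inline int pop64_truncating(uint64_t x) {
    /* The uint64_t is converted to unsigned int: high 32 bits are lost. */
    return __builtin_popcount((unsigned int)x);
}

static inline int pop64(uint64_t x) {
    /* 64-bit-safe variant: counts all 64 bits. */
    return __builtin_popcountll(x);
}
```

For a value like 0xFFFFFFFF00000000 (32 bits set, all in the high half), the truncating version returns 0 while the ll variant returns 32.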

. . .
The make takes a lot longer at this point, filling me with hope.
. . .

When it’s finally finished, I try running a basic test using the built in e-coli genome:

./bowtie e_coli reads/e_coli_1000.fq

HOLY MOLY, it works!

But just to be sure, I decided to remove the bowtie binary and run make again. When I do, I get the answer: Nothing to be done for ‘all’.

Hmm, that’s weird; why won’t it make my bowtie?

So I ended up removing all the other bowtie-related programs that the makefile should build (bowtie-build and bowtie-inspect). Following this, I run make again; again, it takes a long time to complete. When it’s finished this time around, I run ls and there’s no ‘bowtie’ binary anywhere. Thinking perhaps there’s a fluke or something I’ve missed, I try making again: Nothing to be done for ‘all’. Clearly something’s up, and bowtie is either failing its build or being put somewhere else. SOooo it’s back to the:

Makefile . . . Again

When I was playing around in the makefile earlier, I took note of where it was setting up the command to compile the bowtie program (Some of my early debugging brought me here). I discovered a weird difference between the git version I’d pulled and the version from the fedpkg code that I’d pulled before. The fedpkg version has:

BIN_LIST = bowtie-build \
           bowtie \


bowtie: ebwt_search.cpp $(SEARCH_CPPS) $(OTHER_CPPS) $(HEADERS) $(SEARCH_FRAGMENTS)
                $(CXX) $(DEFS) $(NOASSERT_FLAGS) -Wall \
                $(INC) \
                -o $@ $< \
                $(OTHER_CPPS) $(SEARCH_CPPS_MAIN) \
                $(LIBS) $(SEARCH_LIBS)

But the github version has neither, which I thought was quite queer; how are you supposed to tie your bows without a bowtie? So for testing’s sake, I added it back into the Makefile. After another make, bowtie was there!

So I ran through the e-coli test again and a couple other steps in the getting started guide and they’ve all worked so far! I was so happy I stood up and breathed a sigh of relief when it all started working. I think everyone else in class was too entrenched in their own mind crushing work to notice, but it was a big victory for me; FINALLY a project I could work on AND results.


So there’s a few things I need to do next. Here they are in no specific order:

  • Reach out to upstream, introduce myself and ask why the build part of the Makefile is gone
  • Fix up the code to include ifdefs instead of the currently commented out and overwritten solutions I have
  • Test - Benchmark on x86 before and after to make sure my changes haven’t negatively affected anything
  • Test - Find as many test scenarios as I can to make sure that they all behave the same on each architecture
  • Create a pull request

I’m pretty excited about the progress I’ve made so far; quite a change from the frustration of the last few weeks.

Until next time

November 25, 2014 12:00 AM

Andrew Li

Compiling Lynx

Lynx is a text-based web browser. The project began in 1992 and is still in active development. Lynx remains relevant since it still provides value in many situations. For example, it is great for visually impaired users: because everything is text-based, it is easy to navigate the web using screen-reading technology.

Some features Lynx supports are: SSL, HTTP cookies, browsing history and page caching. However, it does not support JavaScript or Adobe Flash. The README has sections to help troubleshoot compiling issues on different ports. It lists a bunch of UNIX flavours known to work; Mac OS X is not mentioned, but it works.

Here are the steps to compile Lynx for Mac OS X 10.10.1 (Yosemite):

tar -zxvf ./lynx2-8-8.tar.gz
cd lynx2-8-8
./configure
make
make install

After compiling, you should see something similar to this:

Lynx compiled

Type the program name lynx and provide a website as a parameter


Lynx browser

November 25, 2014 12:00 AM

November 24, 2014

Yoav Gurevich

Appmaker Template Picker Progress Update

The second-last week of this semester has already begun, and between 20 other errands to accomplish in this always merry and relaxed time frame, I have managed to get a hold of Pomax and discuss with him clarifications on how to move forward with my iterative solution for Appmaker Issue #2348. His only qualm with my initial pull request was that the current URL for arriving at the custom Appmaker template gallery list page looks very unreadable and hash-like. What we have agreed to is that Pomax will open a separate Webmaker issue to accommodate this request by creating functionality for the URL address "" and that will be the de-facto address I will use when creating my next pull request. Since the expert on implementing these sorts of HTTP option handlers and custom route logic is away, I will obviously still be using the presently functional old URL to test my logic up until the final commit.

If all of the stars align, and my google-fu research proves immediately effective in helping me accomplish my initial task for this last milestone (getting an iframe preview to fetch the resource on a user's mouseclick event), I might even be able to hop on this newly created issue and contribute to its completion.

In all likelihood, my last update for this semester will be the pull request announcement. Stay tuned, as the cliffhanger is quickly nearing its conclusion.

by Yoav Gurevich ( at November 24, 2014 08:39 PM

Edwin Lum

Onwards with the project!

A little recap of the progress on my project for SPO600: as of now, the package builds and passes most of the tests. The next step is to reach out to the upstream community, touch base, and let them know that their package builds and passes most tests on AArch64.

There are several alternatives for reaching out to this community. Firstly, right on their site they have a forum; however, it seems more meant for getting help.

They also have an IRC channel on freenode, and this is the approach I am taking, as it appears that in addition to users there are some developers on the channel.

The third way, and probably the most proper for discussing development topics, is joining their mailing list. Personally, though, I am more comfortable trying their IRC channel first before subscribing to the mailing list, although there is a good chance I will be pointed there after chatting up the dudes on IRC.

The next post will hopefully have some response from upstream, and maybe even close this project off as finished!

by pyourk at November 24, 2014 05:55 PM

Andor Salga (asalga)

Gomba 0.2


I’ve been busy nursing my cat back to health, so I missed blogging last Saturday :( He’s doing a bit better, so I’m trying to stay hopeful.

Today I did manage to find some time to catch up on my blogging, so here are the major changes on Gomba:

  • Fixed a major physics issue (running too quick & jumping was broken)
  • Added coinbox
  • Fixed kicking a sprite from a brick
  • Added render layers

Rendering Layers

The most significant change I added was rendering layers. This allows me to specify a layer for each gameobject. Clouds and background objects must exist on lower layers, then things like coins should be a bit higher, then the goombas, Mario and other sprites even higher. You can think of each layer as one of the transparent sheets high school teachers use for overhead projectors. Do they have digital projectors yet?? I can also change a gameobject’s layer at runtime, so when a goomba is ‘kicked’ I can move it to the very top layer (closest to the user) so that it appears as if the sprite is being removed from the world. Rendering them under the bricks would just look strange.

I used a binary tree to internally manage the rendering of the layers. This was probably overkill; I could have gotten away with an array, dynamically resizing it as needed if a layer index was too high. Ah well. I plan to abstract the structure further so the implementation is unknown to the scene. I also need to fix tunnelling issues and x-collision issues too… Maybe for next month.
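As a sketch of the array alternative (illustrative C, not Gomba's actual Processing code), the idea boils down to keeping a layer index on each object and ordering by it before drawing, so "kicking" a sprite to the front is just a field assignment:

```c
#include <stdlib.h>

/* Illustrative layer-ordering sketch; names are hypothetical. */
typedef struct {
    const char *name;
    int layer;   /* 0 = furthest back; higher = closer to the viewer */
} GameObject;

/* qsort comparator: back-most layers come first in draw order. */
static int by_layer(const void *a, const void *b) {
    return ((const GameObject *)a)->layer - ((const GameObject *)b)->layer;
}

/* Sort the scene so a renderer can draw objects back-to-front. */
static void sort_for_render(GameObject *objs, size_t n) {
    qsort(objs, n, sizeof objs[0], by_layer);
}
```

A kicked goomba would get scene[i].layer = TOP_LAYER and naturally render above the bricks on the next sort.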

Filed under: Game Development, Open Source, Processing, Processing.js

by Andor Salga at November 24, 2014 05:08 PM

November 22, 2014

Frank Panico

Finally got Webmaker-app running on windows 8.1!!

After so much grueling time spent removing, updating, and downloading software and dependencies I’ve finally got it running.

All kinds of different errors would pop up just from npm install, mainly to do with the Maria SQL dependencies, or possibly some node-gyp dependencies as well. Either way, keeping track of these was becoming time-consuming and costly to other work and school as well.

So I snooped into the dependencies for bower and decided that those would be important, and I installed Visual Studio 2013 in order to get the VC dependencies I would need, as well as the Microsoft .NET Framework 2.0, because the errors I was getting indicated it was necessary.

npm install still showed a few errors after running the command a couple of times, but my “gulp dev” eventually brought up the app, so it’s safe to assume I’ve finally got something up and running.

Special thanks to Ryan Dang and Kate Hudson for their great input!

Off to work..

by fpanico04 at November 22, 2014 09:10 PM

Shuming Lin

Release 0.3 – Still Work On webmaker-app

I am still working on webmaker-app. I tried a few issues last week, and it was hard for me to fix them because I don’t have much experience with JS or this project, but this time went better than release 0.2. During release 0.2 I learned what this project does and came to understand some of its files, which helped me in release 0.3, for example by making it easier to locate an issue. For this release, the hardest part was asking for a bug. The maintainers look very busy at the moment; I didn’t get any response when I asked for a bug. A few classmates had the same problem when they asked for a bug or asked for help.

I took issue #503, “Change Templates page wording/UI”. This was an easy issue once I found the solution. I spent most of my time understanding how the title structure works and finding the relevant files.

I know more about JS functions and JSON after release 0.3. Here is my pull request for the webmaker-app issue.


by Kevin at November 22, 2014 04:55 AM

Omid Djahanpour

PSmisc System Administration Tools

PSmisc is a collection of system administration tools as briefly described in one of my prior posts.

The upstream website for PSmisc is available here. The source is also available on the project page.

Unlike my last post where we looked at Siege, I will be getting the source for PSmisc using the fedpkg command. You can read the man page for fedpkg here.

[odjahanpour@australia pkgs]$ fedpkg clone psmisc -a
Cloning into 'psmisc'...
remote: Counting objects: 475, done.
remote: Compressing objects: 100% (293/293), done.
remote: Total 475 (delta 200), reused 376 (delta 157)
Receiving objects: 100% (475/475), 75.56 KiB | 0 bytes/s, done.
Resolving deltas: 100% (200/200), done.
Checking connectivity... done.

[odjahanpour@australia psmisc]$ pwd; ll
total 24
-rw-rw-r--. 1 odjahanpour odjahanpour 600 Nov 21 19:59 psmisc-22.13-fuser-silent.patch
-rw-rw-r--. 1 odjahanpour odjahanpour 13375 Nov 21 19:59 psmisc.spec
-rw-rw-r--. 1 odjahanpour odjahanpour 54 Nov 21 19:59 sources

You can see that there is no source archive containing the source code for the package. That’s not a problem though as the next step will download the source and apply any patches to it as seen below:

[odjahanpour@australia psmisc]$ fedpkg prep

Downloading psmisc-22.21.tar.gz
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 441k 100 441k 0 0 340k 0 0:00:01 0:00:01 --:--:-- 341k
Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.JPiD66
+ umask 022
+ cd /home/odjahanpour/pkgs/psmisc
+ cd /home/odjahanpour/pkgs/psmisc
+ rm -rf psmisc-22.21
+ /usr/bin/gzip -dc /home/odjahanpour/pkgs/psmisc/psmisc-22.21.tar.gz
+ /usr/bin/tar -xf -
+ '[' 0 -ne 0 ']'
+ cd psmisc-22.21
+ /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w .
+ exit 0

Now if we list the same directory, you will see a folder containing the source code as well as the source in archive format:

[odjahanpour@australia psmisc]$ ll
total 472
-rw-rw-r--. 1 odjahanpour odjahanpour 600 Nov 21 19:59 psmisc-22.13-fuser-silent.patch
drwxr-xr-x. 9 odjahanpour odjahanpour 4096 Oct 9 2013 psmisc-22.21
-rw-rw-r--. 1 odjahanpour odjahanpour 451833 Oct 10 2013 psmisc-22.21.tar.gz
-rw-rw-r--. 1 odjahanpour odjahanpour 13375 Nov 21 19:59 psmisc.spec
-rw-rw-r--. 1 odjahanpour odjahanpour 54 Nov 21 19:59 sources

Now we will compile PSmisc similar to the way we compiled Siege. The only argument being passed to configure is the prefix so that it installs in a folder inside my home directory:

[odjahanpour@australia psmisc-22.21]$ time ./configure --prefix=$HOME/psmisc

You’ll notice that I used the time command as I want to measure how long each step takes for compiling the package. I will post the times for each step later on in this blog as I want to compare the build times from the two machines I am using.

The first machine is an x86_64 machine, and the second one is an ARMv8 AArch64 machine.

Now I will search the source code for any inline assembly using egrep:

[odjahanpour@australia psmisc-22.21]$ egrep "__asm__|asm.*\(" -R *
src/lists.h:# define asm __asm__
src/lists.h: * echo '#include <asm-i386/processor.h>\nint main () { prefetch(); return 0; }' | \
src/lists.h: asm volatile ("prefetcht0 %0" :: "m" (*(unsigned long *)x))
src/lists.h: asm volatile ("lfetch [%0]" :: "r" (x))
src/lists.h: asm volatile ("dcbt 0,%0" :: "r" (x))
src/lists.h: asm volatile ("661:\n\t"

The search returned a few instances of inline assembly occurring in the file lists.h. Let’s take a closer look at lists.h to see what the assembly is doing:

43 # ifndef asm
44 # define asm __asm__
45 # endif

The first occurrence simply defines asm as a shorthand for __asm__.

54 /*
55 * This is lent from the kernel by e.g. using
56 *
57 * echo '#include <asm-i386/processor.h>\nint main () { prefetch(); return 0; }' | \
58 * gcc -I/usr/src/linux/include -D__KERNEL__ -x c -E -P - | \
59 * sed -rn '/void[[:blank:]]+prefetch[[:blank:]]*\(/,/^}/p'
60 *
61 * on the appropriate architecture (here on i686 for i586).
62 */
63 extern inline void attribute((used,__gnu_inline__,always_inline,__artificial__)) prefetch(const void *restrict x)
64 {
65 #if defined(__x86_64__)
66 asm volatile ("prefetcht0 %0" :: "m" (*(unsigned long *)x))
67 #elif defined(__ia64__)
68 asm volatile ("lfetch [%0]" :: "r" (x))
69 #elif defined(__powerpc64__)
70 asm volatile ("dcbt 0,%0" :: "r" (x))
71 #elif !defined(__CYGWIN__) && !defined(__PIC__) && defined(__i386__)
72 asm volatile ("661:\n\t"
73 ".byte 0x8d,0x74,0x26,0x00\n"
74 "\n662:\n"
75 ".section .altinstructions,\"a\"\n"
76 " .align 4\n"
77 " .long 661b\n"
78 " .long 663f\n"
79 " .byte %c0\n"
80 " .byte 662b-661b\n"
81 " .byte 664f-663f\n"
82 ".previous\n"
83 ".section .altinstr_replacement,\"ax\"\n"
84 " 663:\n\t"
85 " prefetchnta (%1)"
86 " \n664:\n"
87 ".previous"
88 :: "i" ((0*32+25)), "r" (x))
89 #else
90 __builtin_prefetch ((x), 0, 1);
91 #endif
92 ;
93 }

There is quite a bit of assembly code here, each variant specific to a particular platform. From what I understood, the task relates to prefetching. There is no assembly code defined for the ARM platform, but there is a fallback in place, as seen on lines 89 to 91.

With some further investigation, I will attempt to include some assembly for the ARM platform. As I mentioned in my prior post on Siege, this program is also very small, and it doesn’t seem like there is much benefit to including assembly for the ARM platform. I will not be able to conclude this, though, without first trying it and comparing results. This will be my goal for the future as, again, I have not done enough research on how to implement this at the moment.
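One low-risk option is to skip ARM assembler entirely and lean on the portable __builtin_prefetch fallback already present on lines 89-91; GCC lowers that builtin to the platform's prefetch instruction (PRFM on AArch64). A sketch with illustrative names (my_prefetch and sum_array are not PSmisc code):

```c
#include <stddef.h>

/* Portable prefetch wrapper; the compiler emits the platform's
 * prefetch instruction (or nothing if there isn't one). */
static inline void my_prefetch(const void *x) {
    __builtin_prefetch(x, 0 /* read */, 1 /* low temporal locality */);
}

/* Example use: prefetch a few elements ahead while summing an array. */
static long sum_array(const long *a, size_t n) {
    long s = 0;
    for (size_t i = 0; i < n; i++) {
        if (i + 8 < n)
            my_prefetch(&a[i + 8]);
        s += a[i];
    }
    return s;
}
```

Whether this actually beats the hardware prefetcher on a linear walk like this would have to be measured; the point is only that no per-architecture assembler is needed.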

I mentioned above that I wanted to do a comparison between the build time of PSmisc on the x86_64 machine and the ARMv8-aarch64 machine.

Below are the results for the build times on the two machines:


[odjahanpour@australia psmisc-22.21]$ time ./configure --prefix=$HOME/psmisc

real 0m5.924s
user 0m2.900s
sys 0m2.098s

[odjahanpour@australia psmisc-22.21]$ time make

real 0m2.007s
user 0m1.750s
sys 0m0.171s

[odjahanpour@australia psmisc-22.21]$ time make install

real 0m0.528s
user 0m0.292s
sys 0m0.144s


[odjahanpour@red psmisc-22.21]$ time ./configure --prefix=$HOME/psmisc

real 0m9.674s
user 0m4.730s
sys 0m0.880s

[odjahanpour@red psmisc-22.21]$ time make

real 0m4.913s
user 0m4.640s
sys 0m0.070s

[odjahanpour@red psmisc-22.21]$ time make install

real 0m0.685s
user 0m0.070s
sys 0m0.040s

The results indicate that it takes almost twice as long to configure and build PSmisc on the ARMv8 AArch64 machine. To me this suggests there may be optimization flags that could be passed to the compiler to make PSmisc compile faster.

I will either update this post in the future once I have done my research and further testing, or I may just create a new post and link it to this one. In either case, stay tuned for more results!

by Omid Djahanpour at November 22, 2014 02:16 AM

Hosung Hwang

CordovaStabilizer – Build Chromium 1

I cloned the WebKit source code and looked at the build instructions. There were build instructions for Xcode on Mac OS, but none for Linux, probably because Apple has led the WebKit project. I tried to build using make in the root directory, but it failed. Without clear instructions, it takes time to figure out. I have a MacBook, but I don’t think that is a solution.

So, I looked at the Chrome source code. The reason I started with the WebKit source is that it is smaller than the Chrome source.

This link says “Chromium supports building on Windows, Mac and Linux.
You can also build the mobile versions, iOS (from Mac) and Android (from Linux).”
It clearly has instructions. Let’s start with an easy one.

This page is the starting point.
depot_tools is a set of git extensions supporting development of Chromium, Blink, etc.

1) Download Depot_tools
$ git clone

hosung@hosung-Spectre:~/cdot$ git clone
Cloning into ‘depot_tools’…
remote: Sending approximately 13.97 MiB …
remote: Total 11530 (delta 7155), reused 11530 (delta 7155)
Receiving objects: 100% (11530/11530), 13.97 MiB | 7.80 MiB/s, done.
Resolving deltas: 100% (7156/7156), done.
Checking connectivity… done.

2) Set path

$ export PATH=$PATH:/path/to/depot_tools
– I added the path to ~/.bashrc

3) Fetch chromium
$ fetch chromium

hosung@hosung-Spectre:~/cdot$ fetch chromium
Running: gclient config --spec 'solutions = [{u'"'"'managed'"'"': False, u'"'"'name'"'"': u'"'"'src'"'"', u'"'"'url'"'"': u'"'"''"'"', u'"'"'custom_deps'"'"': {}, u'"'"'deps_file'"'"': u'"'"'.DEPS.git'"'"', u'"'"'safesync_url'"'"': u'"'"''"'"'}]'
Your copy of depot_tools is configured to fetch from an obsolete URL: to update it to ? [Y/n] y
Remote URL updated.
Running: gclient sync
[0:01:00] Still working on:
[0:01:00]   src
[0:01:10] Still working on:
[0:01:10]   src
[…]
[0:09:30] Still working on:
[0:09:30]   src
Syncing projects:   0% ( 0/ 2)
[0:09:35] Still working on:
[0:09:35]   src
Syncing projects:  96% (77/80) src/chrome/tools/test/reference_build/chrome_linu
Syncing projects:  97% (78/80) src/tools/gyp
Syncing projects:  98% (79/80) src/v8
[0:13:58] Still working on:
[0:13:58] src/third_party/WebKit[…]

[0:34:03] Still working on:
[0:34:03] src/third_party/WebKit
Syncing projects: 100% (80/80), done.

________ running ‘/usr/bin/python src/build/’ in ‘/home/hosung/cdot’

________ running ‘/usr/bin/python src/build/’ in ‘/home/hosung/cdot’
INFO: –Syncing nacl_x86_glibc to revision 13880–
INFO: Downloading package archive: gdb_i686_linux.tgz (1/2)
INFO: Downloading package archive: toolchain.tar.bz2 (2/2)
INFO: –Syncing nacl_x86_newlib to revision 13917–
INFO: Downloading package archive: naclsdk.tgz (1/1)
INFO: –Syncing pnacl_newlib to revision 14083–
INFO: Downloading package archive: binutils_pnacl_x86_64_linux.tgz (1/48)


INFO: Downloading package archive: unsandboxed_irt_x86_32_linux.tgz (48/48)
INFO: –Syncing pnacl_translator to revision 14083–
INFO: Downloading package archive: pnacl-translator.tgz (1/1)
Hook ‘/usr/bin/python src/build/’ took 105.35 secs

________ running '/usr/bin/python src/chrome/installer/linux/sysroot_scripts/ --linux-only --arch=amd64' in '/home/hosung/cdot'

________ running '/usr/bin/python src/chrome/installer/linux/sysroot_scripts/ --linux-only --arch=i386' in '/home/hosung/cdot'

________ running '/usr/bin/python src/chrome/installer/linux/sysroot_scripts/ --linux-only --arch=arm' in '/home/hosung/cdot'

________ running '/usr/bin/python src/build/ update' in '/home/hosung/cdot'

________ running '/usr/bin/python src/tools/clang/scripts/ --if-needed' in '/home/hosung/cdot'
Trying to download prebuilt clang
--2014-11-21 13:13:53--
Resolving (…,,, …
Connecting to (||:443… connected.
HTTP request sent, awaiting response… 200 OK
Length: 24499557 (23M) [application/x-tar]
Saving to: ‘/tmp/clang_download.HBMa7j/clang-218707.tgz’

100%[======================================>] 24,499,557 21.5MB/s in 1.1s

2014-11-21 13:13:54 (21.5 MB/s) – ‘/tmp/clang_download.HBMa7j/clang-218707.tgz’ saved [24499557/24499557]

clang 218707 unpacked


4) Since this is the first time fetching the code, the document says I need to do this:



ttf-mscorefonts-installer: downloading
ttf-mscorefonts-installer: downloading
ttf-mscorefonts-installer: downloading
ttf-mscorefonts-installer: downloading
Creating config file /etc/php5/mods-available/readline.ini with new version
php5_invoke: Enable module readline for cgi SAPI
php5_invoke: Enable module readline for cli SAPI
php5_invoke: Enable module readline for apache2 SAPI
Setting up g++-4.6 (4.6.4-6ubuntu2) …
Setting up libmirclient-dev:amd64 (0.1.8+14.04.20140411-0ubuntu1) …
Setting up libegl1-mesa-dev (10.1.3-0ubuntu0.2) …
Setting up libgles2-mesa-dev (10.1.3-0ubuntu0.2) …
Setting up ruby (1: …
Setting up ruby1.9.1 ( …
Setting up libruby1.9.1 ( …
Setting up php5-cgi (5.5.9+dfsg-1ubuntu4.5) …
update-alternatives: using /usr/bin/php5-cgi to provide /usr/bin/php-cgi (php-cgi) in auto mode
update-alternatives: using /usr/lib/cgi-bin/php5 to provide /usr/lib/cgi-bin/php (php-cgi-bin) in auto mode

Creating config file /etc/php5/cgi/php.ini with new version
php5_invoke pdo: already enabled for cgi SAPI
php5_invoke opcache: already enabled for cgi SAPI
php5_invoke json: already enabled for cgi SAPI
php5_invoke readline: already enabled for cgi SAPI
Setting up language-pack-da (1:14.04+20140410) …


Processing triggers for bamfdaemon (0.5.1+14.04.20140409-0ubuntu1) …
Rebuilding /usr/share/applications/bamf-2.index…
Installing Chrome OS fonts.
Installing Chrome OS fonts to /usr/local/share/fonts/chromeos.
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 10.1M 100 10.1M 0 0 918k 0 0:00:11 0:00:11 --:--:-- 2746k
Installing symbolic links for NaCl.

5) pulling all dependencies : gclient sync

hosung@hosung-Spectre:~/cdot/chromium$ gclient sync
Syncing projects: 100% (80/80), done.
________ running '/usr/bin/python src/build/' in '/home/hosung/cdot/chromium'
[…]
________ running '/usr/bin/python src/build/gyp_chromium' in '/home/hosung/cdot/chromium'
Updating projects from gyp files…
Hook '/usr/bin/python src/build/gyp_chromium' took 38.06 secs
________ running '/usr/bin/python src/tools/ --running-as-hook' in '/home/hosung/cdot/chromium'
________ running '/usr/bin/python src/tools/ src/tools' in '/home/hosung/cdot/chromium'

Up to here, it took 1.5 hours to download and pull all dependencies.
I used the wired network in CDOT, which is about 3 times faster than wireless (Senenet Extream).
The source code size is 13GB.

6) Build Chromium

instruction :

hosung@hosung-Spectre:~/cdot/chromium/src$ ./build/gyp_chromium
Updating projects from gyp files…

hosung@hosung-Spectre:~/cdot/chromium/src$ ninja -C out/Debug chrome
ninja: Entering directory `out/Debug'
[1313/17220] CXX obj/third_party/webrtc/syst…wrappers/source/system_wrappers.rtp_to_nt
[1314/17220] CXX obj/third_party/webrtc/syst…wrappers/source/system_wrappers.trace_imp
[17220/17220] LINK chrome

I started the build using the ninja command, which is included in depot_tools, according to the instructions.

Screenshot from 2014-11-21 16:42:38

One thing I noticed is that the compiler used is clang and clang++, and the build uses all 4 cores of my laptop.
Spec of my laptop:

OS : Ubuntu 14.04 LTS
Processor : Intel Core i7-3517U CPU @ 1.90GHz x 4
OS type 64-bit
Memory 7.7GiB
Disk : 117.6GB SSD

I set up a clean OS 5 days ago and installed only the minimum programs necessary for development.

Build time :

compile 16:39 ~ 20:02 (3h 21m)
link 20:02:56 ~ 20:04:46 (2m)

7) Run Chromium

hosung@hosung-Spectre:~/cdot/chromium/src/out/Debug$ ./chrome
[5215:5215:1121/] Running without the SUID sandbox! See for more information on developing with the sandbox on.
#0 0x7f3a91c4f58e base::debug::StackTrace::StackTrace()
#1 0x7f3a91cc83a5 logging::LogMessage::~LogMessage()
#2 0x7f3a966c1db3 content::(anonymous namespace)::SetupSandbox()
#3 0x7f3a966c1608 content::BrowserMainLoop::EarlyInitialization()
#4 0x7f3a966cd916 content::BrowserMainRunnerImpl::Initialize()
#5 0x7f3a966c073f content::BrowserMain()
#6 0x7f3a91bf42ef content::RunNamedProcessTypeMain()
#7 0x7f3a91bf6d32 content::ContentMainRunnerImpl::Run()
#8 0x7f3a91bf37e5 content::ContentMain()
#9 0x7f3a90a0e835 ChromeMain
#10 0x7f3a90a0e7e2 main
#11 0x7f3a878a9ec5 __libc_start_main
#12 0x7f3a90a0e6c4
Aborted (core dumped)

The output says that it needs the SUID sandbox.

8) Copy Sandbox (about the Sandbox)

If you get “Running without the SUID sandbox!”, follow the instructions at this link.
I copied chrome_sandbox from the test/reference_build folder.

hosung@hosung-Spectre:~/cdot/chromium/src$ sudo cp chrome/tools/test/reference_build/chrome_linux/chrome_sandbox /usr/local/sbin/chrome-devel-sandbox
hosung@hosung-Spectre:~/cdot/chromium/src$ sudo chown root:root /usr/local/sbin/chrome-devel-sandbox
hosung@hosung-Spectre:~/cdot/chromium/src$ sudo chmod 4755 /usr/local/sbin/chrome-devel-sandbox

Put this in ~/.bashrc:
export CHROME_DEVEL_SANDBOX=/usr/local/sbin/chrome-devel-sandbox

9) Run Chromium Debug build

hosung@hosung-Spectre:~/cdot/chromium/src/out/Debug$ ./chrome
ATTENTION: default value of option force_s3tc_enable overridden by environment.
[5535:5535:1121/] SPDY proxy OFF at startup
[5535:5564:1121/] Failed to GetPathForExtension: boadgeojelhgndaghljhdicfkmllpafd
[5535:5564:1121/] Failed to map: chrome-extension://boadgeojelhgndaghljhdicfkmllpafd/cast_sender.js
[5535:5564:1121/] Failed to GetPathForExtension: dliochdbjfkdbacpmhlcpmleaejidimm[…]

Screenshot from 2014-11-21 20:32:14


By the way, this is the Linux version. For the Android version of Chromium, I would have to download and build the Android checkout. The document says:

$ fetch chromium  # Basic checkout for desktop Chromium
$ fetch blink     # Chromium code with Blink checked out to tip-of-tree
$ fetch android   # Chromium checkout for Android platform
$ fetch ios       # Chromium checkout for iOS platform

by Hosung at November 22, 2014 01:51 AM

Omid Djahanpour

Siege HTTP Benchmarking Utility

Siege is an HTTP benchmarking utility as briefly described in one of my prior posts.

The upstream website for Siege is available here. The source is also available in the downloads section, or just click here to download the latest available version of Siege.

Let’s jump right into compiling Siege and using the time command to see how long each step takes.

[odjahanpour@australia siege-3.0.8]$ time ./configure --prefix=$HOME/siege

real 0m11.752s
user 0m4.098s
sys 0m4.275s

[odjahanpour@australia siege-3.0.8]$ time make

real 0m6.492s
user 0m4.904s
sys 0m1.100s

[odjahanpour@australia siege-3.0.8]$ time make install

real 0m0.646s
user 0m0.320s
sys 0m0.185s

We can see that in less than 30 seconds, Siege compiled successfully and is ready to use.

Moving on to the source code, I ran egrep to search for any assembler that may be included. The search was successful; the only thing it returned was a bit of inline assembler in one file called md5.h.

[odjahanpour@australia siege-3.0.8]$ egrep "__asm__|asm.*\(" -R *
src/md5.h: __asm__("roll %%cl,%0"

Let’s take a closer look at md5.h and see what the inline assembler is used for:

/* The following is from gnupg-1.0.2's cipher/bithelp.h. */
/* Rotate a 32 bit integer by n bytes */
#if defined(__GNUC__) && defined(__i386__)
static inline md5_uint32
rol(md5_uint32 x, int n)
{
  __asm__("roll %%cl,%0"
          :"=r" (x)
          :"0" (x),"c" (n));
  return x;
}
#else
# define rol(x,n) ( ((x) << (n)) | ((x) >> (32-(n))) )
#endif

We can see that the assembler only applies if the CPU architecture is i386, which is quite rare these days. One important thing to note, though, is the C fallback for all other architectures.

Using a C fallback is nice as it does not limit the code to specific platforms; however, it can sometimes be beneficial to include inline assembler for a specific platform, as the performance increase can be substantial. It is important, though, to determine how often the code in question runs. If it is called several thousand times, it might be worth writing inline assembler for that platform; if it is not called that often, it might not make any difference, and the C fallback should be sufficient.
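For reference, the C fallback above can be written as a standalone function. Modern compilers generally recognize this rotate idiom and emit a single rotate instruction on architectures that have one, which is part of why the i386 assembler buys little today (rol32 is an illustrative name, not Siege's code):

```c
#include <stdint.h>

/* Rotate a 32-bit value left by n bits (n must be 1..31 for this form);
 * equivalent to md5.h's C fallback macro. */
static inline uint32_t rol32(uint32_t x, int n) {
    return (x << n) | (x >> (32 - n));
}
```

For example, rotating 0x80000000 left by 1 wraps the top bit around to give 0x00000001.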

Unfortunately, I haven’t done enough research to implement inline assembly for the ARM platform to perform this task, but I will update this post once I have figured out how to do this properly.

If anyone would like to contribute on how to perform the same task in assembler for the ARM platform, please don’t hesitate!

Sadly, I have yet to contact upstream for this package, but I also feel there may not be a need to in this particular case, as the program is fairly small and lightweight.

My personal experience with this program is that it isn’t really intensive on the system, and it performs its task quickly. It is often limited by the system itself: one of the common errors I’ve encountered was having too many files open, and to alleviate the issue I had to raise the ulimit for the maximum number of open files. This is only necessary when testing with huge numbers.

With some further reading, implementing inline assembly for ARM, and some testing to see if there is a performance difference, I will be ready to mark this package off on the Linaro Performance site.

by Omid Djahanpour at November 22, 2014 12:46 AM

November 21, 2014

Omid Djahanpour


Let’s take a look at the two packages I have chosen to work with for the Linaro Performance Challenge.


Siege is an http load testing and benchmarking utility. It was designed to let web developers measure their code under duress, to see how it will stand up to load on the internet. Siege supports basic authentication, cookies, HTTP and HTTPS protocols. It lets its user hit a web server with a configurable number of simulated web browsers. Those browsers place the server “under siege.”

I became interested with Siege when I first had to use it for one of my classes this semester (INT525), to benchmark Apache 2.2, Apache 2.4 and Nginx. I thought it would be nice to look at the source to see what’s happening and how it’s actually benchmarking the web server.


PSmisc – Small utilities that use the /proc filesystem

PSmisc was a package I had never heard of or encountered before. While browsing through the package list, I saw it, and once I read about it I thought it would be interesting to take a look at. I plan on specializing as a Network Engineer someday, but I enjoy doing system administration tasks as well, and thought PSmisc would eventually be of use to me.

Bundled in this package are four programs that do specific tasks.

  • fuser – identifies which processes are using files.
  • killall – kills processes by name, similar to pkill found on some other Unices.
  • pstree – shows currently running processes in a tree format.
  • peekfd – peeks at the file descriptors of running processes.
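
Typical invocations look roughly like this (guarded, since the tools may not be installed; the file, process name, and PID below are made-up examples):

```shell
# Check which of the four PSmisc programs exist on this machine.
FOUND=0; MISSING=0
for tool in fuser killall pstree peekfd; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: installed"; FOUND=$((FOUND + 1))
  else
    echo "$tool: not installed"; MISSING=$((MISSING + 1))
  fi
done

# Example invocations (commented out; the targets are hypothetical):
#   fuser -v /var/log/syslog   # who has this file open
#   killall -TERM myserver     # SIGTERM every process named "myserver"
#   pstree -p                  # process tree with PIDs
#   peekfd 1234 0 1            # watch fds 0 and 1 of PID 1234
```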

Now that we have a general idea of what I’ll be working with, I will discuss the packages individually in more detail, along with the tasks I will be performing to optimize them for ARM. Both of these packages compiled on the ARM system we use with no problems; it is now just a matter of seeing whether they can be optimized to perform better.

This is a small edit I’m making: I had completely forgotten that I initially wasn’t able to compile Siege on x86_64 or ARM because I was missing some dependencies. After fixing that, I found there was still a problem with the source; at least when using fedpkg, I was unable to compile the package because of a missing file dependency.

I was able to fix this problem by simply downloading the source manually.

by Omid Djahanpour at November 21, 2014 09:47 PM

MIA — What’s to Come

I haven’t updated my blog in a long time as I have been busy with a lot of other work. I had planned on getting things done and blogging about it since last week, however I became quite ill and due to that, I wasn’t able to.

Moving on, in my upcoming blog posts, I will be discussing two open source packages that I will be working on in terms of optimizing for the ARM architecture.

I will be discussing the packages in terms of what they are, and posting some code snippets from the source.

I hope to have a few more posts in by tonight. I will be regularly updating my blog from now on so stay tuned.

by Omid Djahanpour at November 21, 2014 09:30 PM

Linpei Fan

Grunt vs. Gulp

Grunt and Gulp are popular task runners for Node.js projects. In release 0.1, I used Grunt in the Filer project, while the Webmaker project uses Gulp instead. I did some research on their similarities and differences.

Both Grunt and Gulp are used to build processes on a Node.js base, but they have different mechanisms. Grunt has the longer history; Gulp is newer.

Both Grunt and Gulp rely on plug-ins to build tasks, and both have a large plug-in base, so developers rarely need to write their own build tasks. Grunt has better community support than Gulp thanks to its longer history. Grunt plug-ins often perform multiple tasks; Gulp plug-ins are designed to do one thing only.

Mechanism in building process:
Grunt generates intermediary files (.tmp/) on disk during the build process. It uses a declarative approach to build tasks, so when the build flow is large it can be hard to figure out the task execution order, and the development team needs to write the maintenance code.
Gulp is a streaming build system. Streams are the most important concept in Gulp: you feed your files into a pipe at one end and the output files come out the other end, without any interruption in the middle of the process. This makes your task definitions a bit easier to read, and it avoids the disk I/O of intermediary files since nothing is written to disk mid-build. However, Gulp requires developers to know Node well enough to deal with streams, pipes, and asynchronous code.

Gruntfile.js vs. Gulpfile.js
Grunt uses JSON-like configuration data, while Gulp uses JavaScript code. The examples I compared came from the Filer project and the Webmaker project.


Both have advantages and disadvantages, and developers should weigh them to choose the most suitable one. As a beginner in the open source community, it is good for me to learn both, to help understand the systems I am working on.


by Lily Fan ( at November 21, 2014 04:14 PM

Release 0.3

I chose to stay in the Webmaker project for my release 0.3 since I have worked on it before and am interested in it. I picked up issue #498 and worked on it for several days. I noticed that this project uses page.js to handle routing. It took me some time to understand how page.js works, but I finally found that page.js does not have functionality to refresh the page, so I used location.reload() to do so. After doing release 0.3, I got to know Webmaker better. My pull request is here

by Lily Fan ( at November 21, 2014 06:44 AM

Hosung Hwang

CordovaStabilizer – Browser Engine Research

In this project, the biggest challenge is substituting for the WebView of each mobile OS (Android, iOS, BlackBerry, Windows Mobile). To do that, understanding browser (rendering) engines is important.

List of web browsers:

Comparison of web browser engines:

Blink:

WebKit:

According to these links, most mobile browsers are based on WebKit: Safari, the iOS browser, Chrome, and the Android browser. The BlackBerry browser is also based on WebKit (source).

However, Blink, the rendering engine of Chrome and the Android browser, is a fork of WebCore, which is part of WebKit.

What we need to figure out is which engine, or which part of a browser, we should use among WebKit, Blink, and Chrome.

First, I tried to download the WebKit source code from

hosung@hosung-Spectre:~/cdot$ git clone git:// WebKit
Cloning into ‘WebKit’…
remote: Counting objects: 2644198, done.
remote: Compressing objects: 100% (462438/462438), done.
remote: Total 2644198 (delta 2106864), reused 2633860 (delta 2097718)
Receiving objects: 100% (2644198/2644198), 5.11 GiB | 15.47 MiB/s, done.
Resolving deltas: 100% (2106864/2106864), done.
Checking connectivity… done.
Checking out files: 100% (171371/171371), done.

Cloning from git took around 1 hour; the checkout is 6.6 GB.
The next step is building.

by Hosung at November 21, 2014 06:07 AM

CordovaStabilizer – Build Cordova Android 4

Yesterday, deploying the Cordova test app to my phone using cordova-3.7.0-dev.jar didn’t work.
Today, I tested it again.

Android 4.4.2 emulator – success
Android 4.0.4 Motorola phone – success
Android 4.4.2 Galaxy S4 – success

For the Galaxy S4, I uninstalled yesterday’s app, and then the new build launched successfully.
The earlier crash was probably caused by that app containing a jar file built for Android Wear.

For this test, I used the Android Developer Tools from
I created a new Android project from existing code and selected the test project folder.

Then, in the Properties of this project, I added the cordova-3.7.0-dev.jar file in the “Java Build Path” tab.

by Hosung at November 21, 2014 12:03 AM

Hunter Jansen



Written by Hunter Jansen on November 21, 2014

Hi there! Today’s entry is about my work on Bowtie, an ultrafast, memory-efficient short-read aligner, for the Linaro Performance challenge, as I continue trying to find a package to port from x86 to arm64.


Last time out, I covered zeromq and how the work to port it from x86 to arm64 was already done, making five packages in a row in which the porting work was already done or rendered unnecessary by codebase changes. So I decided to move on to a package randomly chosen for its name: Bowtie (because bowties are cool).

Bowtie also appears to be pretty popular in its space, but it actually receives pretty sparse updates on its GitHub page (every other week or so). I was hopeful that this would be the project I’d get to port over to arm64.

Alas, it looks like it’s already done as well :(

Getting Started

So the first step I’ve taken (since I’ve had to go through so many packages by this point) is to simply try installing via yum.

sudo yum install bowtie

And as I tried this on the ARM machine, I saw an aarch64 package roll in. The sixth time is indeed not the charm, it seems. But I’m determined to do my due diligence and make sure everything works fine, especially considering the sparse history of commits. So first I pulled the package in with fedpkg and prepped it. From there I ran the makefile (interestingly enough, no configure step needed).

fedpkg clone -a bowtie
cd bowtie
fedpkg prep
cd bowtie-1.0.1

And the make doesn’t work on the arm64 system. Why not? After some poking around, it turns out I was receiving a nondescript error during the g++ phase of the make script. There’s no error message and nothing indicating why it fails, so I had to look at the compile call and see what was happening.

The compile command looks like:

    g++ -O3 -m64 -DCOMPILER_OPTIONS="\"-O3 -m64 -Wl,--hash-style=both -DPOPCNT_CAPABILITY  \""  -Wl,--hash-style=both -DPOPCNT_CAPABILITY    \
    -I SeqAn-1.1 -I third_party -I third_party\
    -o bowtie-build ebwt_build.cpp \
    ccnt_lut.cpp ref_read.cpp alphabet.cpp shmem.cpp edit.cpp ebwt.cpp tinythread.cpp  bowtie_build_main.cpp \

There’s a bunch of stuff happening there, but after some investigation, the first clue in the puzzle is obvious: the -m64 flag is x86-only in gcc, so it would definitely cause problems on arm64. However, after removing that flag from the makefile, things still aren’t working. FURTHER INVESTIGATION REQUIRED!
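
The removal itself is a one-line sed. Here it is demonstrated against a stand-in file, since the real target is Bowtie’s Makefile (the /tmp path and the file’s contents are invented for the demo):

```shell
# Stand-in for the Makefile line that passes -m64 to g++.
printf 'EXTRA_FLAGS = -O3 -m64 -Wl,--hash-style=both\n' > /tmp/demo.mk

# Strip every -m64 occurrence so the compile line is legal for aarch64 gcc.
sed -i 's/ -m64//g' /tmp/demo.mk
cat /tmp/demo.mk
```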


This post ended up being so much more eventful than I expected and it looks like I might actually have a project to work on. Next time, I’ll continue investigating!

Tune in next time!

November 21, 2014 12:00 AM

November 20, 2014

Ava Dacayo

Release 0.3 – Translating Mozilla Appmaker to Filipino

Since I paused work on the double media player issue, I decided to work on something else in the meantime. My release 0.2 was about localization for the sign-up page of webmaker-app, and to test that it does change languages, I switched from English to French, which already has its own translations. My first instinct, though, was to test it in my own language, which was not available. So for release 0.3 I got the idea of translating instead.

I thought it would be easy, but I ended up googling some words, either because there was no direct translation or because I couldn’t think of the best word to describe something. At first I started translating into tl_PH (Tagalog, Philippines) but decided to switch to Filipino, because I kept some words mixed with English, which is how people normally speak. Most would say the two languages are exactly the same, even native speakers, but technically Filipino has more borrowed words. Anyway, that is an entirely separate discussion that is not part of this blog; I did, however, translate in a more understandable way.


100 strings, 725 words

I was surprised I managed to translate 725 words within those 100 strings! Is the counter working properly? lol. But just to make sure I was on the right track, I looked at Google’s page in Filipino and saw that it was mixed with English as well:

google filipino “Mag-sign in”

transifex version that I translated

The string I translated in Transifex

Note: I am not 100% sure whether translations need to be approved before they show up. I’ll probably check in a few days.

by eyvadac at November 20, 2014 10:40 PM

Lukas Blakk (lsblakk)

Artisanal Contributors

Part 1: Start In Person

Ascend had very few ‘rules’, but there was one which was non-negotiable: it’s an in-person program. We didn’t do distance learning, online coursework, or video-based classes. We did bring in a couple of speakers virtually to speak to the room of 20 participants, but the opposite was never true.

This was super important to how we were going to build a strong cohort. Don’t get me wrong: I’m a fan of remote work and global contribution, and of people working from wherever they are. But this was a 6-week intensive program, and in order to build the inter-dependent cohort I was hoping for1, it had to be in person at first, through those crucial early stages when someone is more likely to ‘disappear’ if things are hard or confusing, or if they can’t get someone’s attention to ask a question.

It’s been over 5 years since I graduated from my software development program and over 8 years since I started lurking in IRC channels2 and getting to know Mozillians in digital space first. I wouldn’t have stuck with it, or gotten so deeply involved, without my coursework with Dave Humphrey, though. That was a once-a-week class, but it meant the world to be in the same room as other people who were learning and struggling with the same or similar problems. It was an all-important thread connecting what I was trying to do in my self-directed time with actual people who could care about me and my ability to participate.

Even as an experienced open source contributor, I can jump into IRC channels for projects I’m trying to work on – most recently dd-wrt for my home server setup – and when I ask a question (with lots of evidence of what I’ve already tried and an awareness of what the manual has to say), I get no response, aka crickets. There are a host of reasons for this, and I know more than a beginner might about what they could be: timezones, family commitments, no one with the expertise currently in the channel, and more. None of that matters when you’re new to this type of environment. Silence is interpreted as a big “GO AWAY, YOU DON’T BELONG HERE” despite the best intentions of any community.

In-person learning is the best way to counter that. Being able to turn to a colleague or a mentor and say what’s happening gives you reassurance that it’s not you, as well as someone who can help you get unstuck on what to do next: while you wait for a response, check out this other topic we’re studying, or perhaps try another method of communication, like a bug or an email.

Over the course of our first pilot I also discovered that removing myself from the primary workroom the Ascend participants were in helped the cohort rapidly build up strength in helping each other first3. The workflow looked more like: have a question/problem; ask a cohort member (or several); if you still can’t figure it out, ask on IRC; and if you’re still stuck, find your course leader. This put me at the end of the escalation path4 and meant that people were learning to rely on in-person communication as well as IRC, but more importantly were building up the muscle of “don’t stop asking for help until you get it”, which is really where open source becomes such a great space to work in.

Back to my recent dd-wrt experience: I didn’t hear anything back in IRC, and I felt I had exhausted the forums & wikis their community provided. I started asking in other IRC channels where tech-minded people hang out (thanks, womenwhohack!) and then tried yet another search with slightly different terms. In the end I found what I needed in a YouTube tutorial. I hope that sufficiently demonstrates that a combination of tactics is what culminates in the ability to be persistent when learning in open source projects.

Never underestimate the importance of removing isolation for new contributors to a project. In person help, even just at first, can be huge.

  1. Because the ultimate goal of Ascend was to give people skills for long-term contribution and participation, and a local cohort of support and fellow learners seemed like a good bet for making that possible once the barrier-removing help of the 6-week intensive was no longer in place. 
  2. By the way, I’m such a huge fan of IRC that I wrote the tutorial for it at Mozilla in order to help get more non-engineering folks using it, in my perfect world everyone is in IRC all the time with scrollback options and logging. 
  3. Only after the first three weeks, when we moved to the more independent, working-on-bugs stage. 
  4. Which is awesome, because I was always struggling to keep up with course creation as we were running it; I didn’t realize that teaching 9-5 was asking for disaster, and next time we’ll do 10-4 for the participants, to give the mentors pre- and post-prep time. 

by Lukas at November 20, 2014 10:06 PM

Ava Dacayo

Release 0.3 – Bug search – catching up on blogging part 2

This issue is about the media player in Appmaker showing the video twice. After playing around a bit, I found that this does not happen if you simply use the default URL from when the player is created; it shows two videos when you change the URL and save it. When it loads, it looks like the screenshot below. It also happens when you change the URL and then duplicate the player.

Screenshot (100)

I didn’t look at the issue for a few days, and when I pulled the newest code… actually, I don’t remember exactly what I did, but I couldn’t run it, so I posted my current status on the issue page. The last time I was able to run it, I was using Chrome’s Developer Tools with some breakpoints, and from what I observed (not 100% sure, this is just what I remember), it is creating the media player twice. I’ll look into this again soon and try to figure out what’s happening.

by eyvadac at November 20, 2014 10:00 PM

Lukas Blakk (lsblakk)

Release Management Tooling: Past, Present, and Future

Release Management Tooling: Past, Present, and Future

As I was interviewing a potential intern for the summer of 2015, I realized I had outlined all our major tools and what the next enhancement for each could be, but that this wasn’t yet well documented anywhere else.

Coming to Release Management from my beginnings as a Release Engineer, I’ve seen our overall release automation improve across the whole spectrum of what it takes to put out packaged software for multiple platforms. We’ve come a long way, so this post is intended to capture how the main tools we use got to their current state, as well as to share where they are heading.

Ship-It
Past: the Release Manager on point for a release sent an email to the release-drivers mailing list with an hg changeset, a version, and a build number; this was the “go” to build, for Release Engineering to take over and execute a combination of automated and manual steps (there was even a time when it was only said in IRC; email became the constant when joduinn pushed for consistency and a traceable trail of events). Release Engineers would update config files and locale changes, get them attached to a bug, approved, and uplifted, then go reconfigure the build machines so they could kick off the release build automation.

Present: Ship-It is an app developed by Release Engineering (bhearsum) that allows a Release Manager to input the needed configuration (changeset, version, build number, partials to be created, l10n changesets) all in one place; on submit, the build automation picks up the change from a db, reconfigures the build machines, and triggers builds. When all goes well, there are zero human hands between the “go” and the availability of builds to QA.

Future: In two parts:
1. To have a simple app that can take a list of bug numbers and check that they have landed on {branch} (where branch is Beta, Release, or ESR); once all the listed bugs have landed, check Treeherder for green status on that last changeset, and submit to Ship-It if builds are successful. Benefits: hands-off even sooner, knowing that all the important fixes are on the branch in question and that the tree is totally green prior to building (sometimes we “go” without all the results because of human timing needs).
2. A complete end-to-end release checklist, dynamically updated to show what stage a release job is at and who has the ball in their court. This should track from the buglist being added (for the final landings an RM is waiting on) all the way until the release notes are live and QA signs off on updates for the general release being in the wild.

Nucleus (aka Release Note App)

Past: oh dear, you probably don’t even want to know how our release notes used to be made. It’s worse than sausage. There was a sqlite db file and a script that pulled from that db and generated HTML based on templates; the Release Manager then had to manually re-order the HTML to get the desired appearance on the final pages, and all of this was committed to SVN, which brings with it the power to completely break our web properties. Fun stuff. Really. Also, once Release Management was more than just one person, we shared this sqlite db over Dropbox, which had some fun quirks, like clobbering your changes if two people had the file open at the same time. Nowhere to go but up from here!

Present: thanks to the web production team (jgmize, hoosteeno, craigcook, jbertsch), we got a new Django app in place that gives us a proper database that’s redundant, production quality, and not in our hands. We add in release notes as well as releases, and we can publish notes to both staging and production without any more commits to SVN. There’s also an API that can be scripted against.

Future: The future’s so bright in this area, let me get my shades. We have a flag in Bugzilla for relnote-firefox where it can get set to ? when something is nominated and then when we decide to take on that bug as a release note we can set it to {versionNum}+. With a little tweaking on the Bugzilla side of things we could either have a dedicated field for “release-note text” or we could parse it out of a syntax in a comment (though that’s gonna be more prone to user error, so I prefer the former) and then automatically grab all the release notes for a version, create the release in Nucleus, add the notes, publish to staging, and email the link around for feedback without any manual interference. This also means we can dynamically adjust release notes using Bugzilla (and yes, this will need to be really cautiously done), and it makes sure that our recent convention of having every release note connect to a bug persist and become the standard.

Release Dash

Past: our only way to visualize the work we were doing was a spreadsheet, and graphs generated from it, of how many crasher bugs were tracked for a version, how many bugs were tracked/fixed over the course of a version’s 18 weeks, and not much else. We also pay attention to the crash rate at ship time and to whether we had to do a dot release or chemspill, but any other release-version-specific issues are sort of lost in the fray once we’re a couple of weeks out from a release. This means we don’t have a great sense of our own history, of what we’re doing that works in generating a more stable/successful release, or of whether a release is in fact ready to go out the door. It’s a gamble, and we take it every 6 weeks.

Present: we have a dashboard in place that is supposed to let us view current crash data, selected Talos (performance) data, and custom bug queries, and to compare a release coming down the pipe to previous releases. We do not use this dashboard yet because it’s been a side project for the past year and a half, primarily created and improved upon by fabulous – yet short-term – interns at Mozilla. The dashboard relies on Elastic Search for Bugzilla data, and the cluster it points to is not always up. The dash is written in PHP, which is no one’s strong suit on our current team; our last intern did his work by creating a Python Flask app that would work into the current dash. The present situation is basically: we need to work on this.

Future: In the future, this dashboard will be robust, reliable, production-quality (and supported), and it will be able to go up on Mozilla office screens in the dashboard rotation where it will make clear to any viewer:
* Where we are in the current release cycle
* What blockers remain for release
* How our stability is (over/under acceptable rates)
* If we’re meeting performance expectations
And hopefully more. We have to find more ways to get visibility into issues a release might hit once it’s with the larger population. I’d love to see us get more of our Beta users’ feedback by asking for it on specific features/fixes, get a broader Beta audience that is more reflective of our overall release population (by hardware, location, language, user types), and then grow their ability to report issues well. Then we can find ways to get that front and center too, including to developers, because they are great at confirming whether something unusual is happening.

What Else?

Well, we used to have an automated script that reminded teams of their open & tracked bugs on Beta/Aurora/Nightly in order to provide a priority order that was visible to devs & their managers. It’s a finicky script that breaks often, and I’d like to see it replaced with something that’s not just a cronjob on my personal VPS. We’re also this close to not needing to update product-details (still in SVN) on every release. The fact that the Release Management team has the ability to accidentally take down all web properties when a mistake is made submitting svn propedits is not desirable or necessary. We should get the heck away from that asap.

We’ll have more discussions of this in Portland, especially with the teams we work closely with, and Sylvestre and I will be talking up our process & future goals at FOSDEM in 2015, following that with a work week in Paris where we can put our heads down and code. Next summer we get an intern again, so we’ll have another set of skilled hands to put on tooling & web service improvements.

Always improving. Always automating. These are the things that make me excited for the next year of Release Management.

by Lukas at November 20, 2014 07:53 PM

Hosung Hwang

CordovaStabilizer – Build Cordova Android 3

I tried to build and run the test project inside cordova-android using the jar file.
I installed the “Android 4.4W (API 20)” SDK through the Android SDK Manager.
It takes time… a long time.

And I added the Cordova jar file to the test project.

When I built it and tried to launch it on my phone, it crashed.

So I made an emulator for this “Android 4.4W”.
It crashed again.
The problem was that “android-20 4.4W” is for Android Wear (watches, etc.).
Two or three years ago, when I was developing Android apps, the versions were 2.2 to 3.x; I had never seen this “W” variant.
In any case, this project, and Cordova in general, does not seem to work on Android Wear.

I have no idea why only the Android Wear SDK is installed with the ADT bundle.
Maybe to promote development for Android Wear; whatever.

Anyway, I installed API 19 (Android 4.4.2), which is the same version as my Galaxy S4 phone.

Now let’s build the Cordova jar file for API 19.

hosung@hosung-Spectre:~/cdot/cordova-android/framework$ android update project -p . -t android-19
build.xml: Found version-tag: custom. File will not be updated.
Updated file ./proguard-project.txt
It seems that there are sub-projects. If you want to update them
please use the --subprojects parameter.

Something is wrong, but there is a very kind message.

hosung@hosung-Spectre:~/cdot/cordova-android/framework$ android update project -p . -t android-19 --subprojects
build.xml: Found version-tag: custom. File will not be updated.
Updated file ./proguard-project.txt
Updated and renamed to
No project name specified, using project folder name ‘bin’.
If you wish to change it, edit the first line of build.xml.
Added file ./bin/build.xml
Added file ./bin/proguard-project.txt

OK. Let’s build

/cordova-android/framework$ ant jar
Buildfile: /home/hosung/cdot/cordova-android/framework/build.xml

[javac] Compiling 1 source file to /home/hosung/cdot/cordova-android/framework/bin/classes
[echo] Creating library output jar file…
[jar] Building jar: /home/hosung/cdot/cordova-android/framework/cordova-3.7.0-dev.jar



Using this jar file, targeting 4.4.2, the build succeeded. However, the app still crashes while running on the device.
Something was wrong.

by Hosung at November 20, 2014 12:13 AM

November 19, 2014

Hosung Hwang

CordovaStabilizer – Build Cordova Android 2

1. Set the Android tools path.

In my case, I added the following to .bashrc:

export ANDROID_HOME=/home/hosung/development/adt-bundle-linux-x86_64-20140702/sdk
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export PATH=$PATH:~/sh:$ANDROID_HOME/tools:$ANDROID_HOME/platform-tools:$JAVA_HOME
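
A quick way to confirm the tools resolve in a new shell (guarded with `command -v`, since the SDK path above is specific to my machine):

```shell
# Check that the SDK's command-line tools are reachable on PATH.
CHECKED=0
for cmd in android adb ant; do
  CHECKED=$((CHECKED + 1))
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd -> $(command -v "$cmd")"
  else
    echo "$cmd: not found on PATH"
  fi
done
```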

2. Try to build. The documentation says:
To create your `cordova.jar` file, run in the framework directory:
android update project -p . -t android-19
ant jar

(However, on my machine:)

/cordova-android/framework$ android update project -p . -t android-19
Error: Target id ‘android-19’ is not valid. Use ‘android list targets’ to get the target ids.

(So, I checked target ids)

/cordova-android/framework$ android list targets
Available Android targets:
id: 1 or “android-20”
Name: Android 4.4W
Type: Platform
API level: 20
Revision: 1
Skins: WVGA854, WVGA800 (default), WXGA800-7in, WXGA800, HVGA, WQVGA432, WXGA720, WQVGA400, WSVGA, QVGA
Tag/ABIs : no ABIs.

(And tried this )

/cordova-android/framework$ android update project -p . -t android-20
build.xml: Found version-tag: custom. File will not be updated.
Added file ./proguard-project.txt

(Great. Let’s build.)

/cordova-android/framework$ ant jar
Buildfile: /home/hosung/cdot/cordova-android/framework/build.xml


[checkenv] Android SDK Tools Revision 23.0.2
[checkenv] Installed at /home/hosung/development/adt-bundle-linux-x86_64-20140702/sdk

[echo] Project Name: Cordova
[gettype] Project Type: Android Library

[getbuildtools] Using latest Build Tools: 20.0.0
[echo] Resolving Build Target for Cordova…
[gettarget] Project Target: Android 4.4W
[gettarget] API level: 20

[mergemanifest] Merging AndroidManifest files into one.
[mergemanifest] Manifest merger disabled. Using project manifest only.

/home/hosung/development/adt-bundle-linux-x86_64-20140702/sdk/tools/ant/build.xml:653: The following error occurred while executing this line:
/home/hosung/development/adt-bundle-linux-x86_64-20140702/sdk/tools/ant/build.xml:698: Execute failed: Cannot run program “/home/hosung/development/adt-bundle-linux-x86_64-20140702/sdk/build-tools/android-4.4W/aapt”: error=2, No such file or directory

3. Solve problem

The problem is that my OS is 64-bit Ubuntu 14.04 LTS, and the build tools need some 32-bit libraries.

The solution is:
sudo apt-get update
sudo apt-get install gcc-multilib lib32z1 lib32stdc++6

source :

4. build again (full build output)

/cordova-android/framework$ ant jar
Buildfile: /home/hosung/cdot/cordova-android/framework/build.xml


[checkenv] Android SDK Tools Revision 23.0.2
[checkenv] Installed at /home/hosung/development/adt-bundle-linux-x86_64-20140702/sdk

[echo] Project Name: Cordova
[gettype] Project Type: Android Library

[getbuildtools] Using latest Build Tools: 20.0.0
[echo] Resolving Build Target for Cordova…
[gettarget] Project Target: Android 4.4W
[gettarget] API level: 20
[echo] ———-
[echo] Creating output directories if needed…
[mkdir] Created dir: /home/hosung/cdot/cordova-android/framework/bin/rsObj
[mkdir] Created dir: /home/hosung/cdot/cordova-android/framework/bin/rsLibs
[echo] ———-
[echo] Resolving Dependencies for Cordova…
[dependency] Library dependencies:
[dependency] No Libraries
[dependency] ——————
[echo] ———-
[echo] Building Libraries with ‘${}’…
[subant] No sub-builds to iterate on

[mergemanifest] No changes in the AndroidManifest files.
[echo] Handling aidl files…
[aidl] No AIDL files to compile.
[echo] ———-
[echo] Handling RenderScript files…
[echo] ———-
[echo] Handling Resources…
[aapt] Generating resource IDs…
[echo] ———-
[echo] Handling BuildConfig class…
[buildconfig] Generating BuildConfig class.


[javac] Compiling 94 source files to /home/hosung/cdot/cordova-android/framework/bin/classes
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[echo] Creating library output jar file…
[jar] Building jar: /home/hosung/cdot/cordova-android/framework/bin/classes.jar

[jar] Building jar: /home/hosung/cdot/cordova-android/framework/cordova-3.7.0-dev.jar


5. Check output files


by Hosung at November 19, 2014 09:55 PM

Cordova VS PhoneGap

The history of Cordova and PhoneGap, according to this blog:
1) Nitobi made PhoneGap and donated it to the ASF (Apache Software Foundation).
2) Adobe bought PhoneGap from Nitobi.
3) Apache renamed the open source project to Cordova.

Cordova is the core engine.
PhoneGap is Adobe’s distribution of Cordova.

If you are using Adobe’s services, use PhoneGap.
Otherwise, use Cordova.

by Hosung at November 19, 2014 08:03 PM

Linpei Fan

Strategies benefiting from open source communities

In this Monday's OSD600 class, we had a discussion on how to build a software production pipeline that can strategically benefit from a vibrant open source community, from many perspectives: big corporations, small shops, students and so on.

We had a lot of ideas, which can be categorized as follows:
  • Face-to-face gatherings: since open source developers come from all over the world, it's important and exciting for them to have chances to gather and communicate face to face, at conferences, workshops and so on. These increase participants' enthusiasm for the open source community.
  • Standards: good standards make it easy for participants to understand and follow, and keep the workflow organized in open source communities.
  • Technology: using programming languages that are newly developed.
  • Onboarding: making the project easier to understand for non-developers.
  • Time management: setting milestones.
  • Project planning: doing market research to find users and localization staff; using feedback to improve the project.
  • Funding: big companies can sponsor events held by the open source community, or hire contributors.
  • Partners: encouraging partnerships with businesses and educational organizations, to attract more target users and participants for the project.
  • Building an open community: encouraging social media; using a permissive license; putting the code on a version control system such as GitHub or Subversion; recognizing contributors; mentoring the community.
  • Documentation: creating good guides and documentation for new developers, and making video/audio guides for different kinds of learners.

by Lily Fan ( at November 19, 2014 07:48 PM

Gary Deng

Jquery UI Sortable Placeholder Issue in Release 0.3

I am stuck on a bug in the Mozilla Appmaker project. When the user drags and drops a brick, line 12 inserts a highlighted placeholder to show the target position where the brick would be dropped. The following function implements the drag-and-drop feature:

          function createSortable() {
            var draggedOff = false;

            // `cardElement` and `that` come from the surrounding component
            // code (elided in this excerpt); the names are illustrative.
            $(cardElement).sortable({
              distance : 10,
              handle: '.handle',
              appendTo : "body",
              helper : 'clone',
              connectWith : ".ceci-card",
              placeholder: {
                element: function(clone, ui) {
                  return $("<div class='sortable-placeholder'></div>");
                },
                update: function() {}
              },
              start : function() {
                that.async(function() {
                  window.dispatchEvent(new CustomEvent('CeciElementSortStarted', {bubbles: true}));
                });
              },
              stop: function(ev, ui) {
                that.async(function() {
                  document.dispatchEvent(new CustomEvent('CeciElementsSorted', {bubbles: true, detail: ui.item[0]}));
                });
              }
            });
          }

It works fine within a single card. However, if you have three or more cards and you want to drag a brick from card A (page A) to card B (page B), the placeholder isn't attached to card B as expected before you drop the brick; it seems to be attached randomly to one of the available cards (A, B, or C). That's a big problem, since the user can't control the drop position. If you really want to drop the brick on card B, you have to hover over card B's tab, then hover over card B's phone container; the placeholder then shows up on card B, and after that you can drop the brick. I am still working on this issue, and I hope the bug is not caused by the jQuery UI library itself. Otherwise, it will be really time consuming.

by garybbb at November 19, 2014 02:12 PM

Hosung Hwang

SSH connection without entering password

1. Create private/public key pair in your machine

If there is already a key pair under ~/.ssh/, go to step 2. Otherwise:
(type) $ ssh-keygen
(press Enter 3 times)
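As an aside, the three Enters can be skipped when scripting this step: ssh-keygen accepts the passphrase and output path on the command line. The temp path below is just for illustration; for real logins use the default path under ~/.ssh.

```shell
# Generate a key pair without prompts: -N "" sets an empty passphrase,
# -f picks the output file (an unused temp name here, for illustration)
key=$(mktemp -u)
ssh-keygen -t rsa -N "" -f "$key" -q

# both the private and public key should now exist
ls "$key" "$key.pub"
```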

2. Copy public key to remote machine

local> scp .ssh/

3. Connect remote machine

local> ssh
(type password)

4. Add public key to Authorized key list

remote> cat >> .ssh/authorized_keys
remote> rm
remote> exit

If adding to authorized_keys doesn't work, your account probably does not have the right permissions.
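The usual culprit is file modes: sshd (with its default StrictModes setting) refuses to trust authorized_keys when ~/.ssh or the file itself is group- or world-writable. A minimal fix to run on the remote machine:

```shell
# tighten the modes sshd expects before it will trust the key list
mkdir -p ~/.ssh
chmod 700 ~/.ssh
touch ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```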

5. Try connection

local> ssh

by Hosung at November 19, 2014 05:27 AM

CordovaStabilizer – Build Cordova Android 1

Apache Cordova is a platform for building native mobile apps using HTML, CSS and JavaScript. It is the basis of Adobe PhoneGap.

1. Install Cordova in Ubuntu 14.04LTS

source :

Install NPM
$ sudo apt-get install npm

Install Git
$ sudo apt-get install git

Install Apache Cordova
$ sudo npm install -g cordova

check the version
$ cordova -v
> 4.1.2

In my case “/usr/bin/env: node: No such file or directory” was produced. The solution is:
$ sudo apt-get install nodejs-legacy
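Background on that error, as I understand it: on Debian/Ubuntu the interpreter binary is installed as nodejs (because of a package name clash), while scripts begin with #!/usr/bin/env node, so the lookup fails. The nodejs-legacy package simply provides node as an alias for nodejs. A throwaway sketch of the same idea, using a fake interpreter in a scratch directory:

```shell
# simulate the broken setup in a scratch dir: only "nodejs" exists
bindir=$(mktemp -d)
printf '#!/bin/sh\necho fake-nodejs\n' > "$bindir/nodejs"
chmod +x "$bindir/nodejs"

# what nodejs-legacy effectively adds: "node" as an alias for "nodejs"
ln -s "$bindir/nodejs" "$bindir/node"

# the env lookup now resolves instead of erroring out
PATH="$bindir:$PATH" /usr/bin/env node   # prints "fake-nodejs"
```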

2. Clone Cordova-android

Cordova-android is an Android library for building Cordova-based applications.

Git Repository :
$ git clone

or you can fork it to your own repository by clicking “Fork” and clone that to your local machine. In my case:
$ git clone

by Hosung at November 19, 2014 03:37 AM

November 18, 2014

Marcus Saad

Front in Maceió 2014

The event

Taking place at the gorgeous city of Maceió, I’ll go ahead and say that I’ve never been to such a wonderful place. The event was organized by caravanaweb, Juarez Filho and Ítalo Waxman. I have nothing but kind words for all the people who helped us with transportation, accommodations and food, you guys were great.

The main idea behind the event was to bring professionals in several topics, ranging from Web, Dev, Entrepreneurship and Innovation. If you consider what our presenters brought to the event, there was enough content and knowledge being disseminated so that people could evolve their ideas and start their own business.


We arrived on Wednesday the 12th somewhere between 02:00 and 03:00 AM – that being due to the one-hour difference between São Paulo and Alagoas [there is no daylight saving time there] – and went straight to our hostel. Already tired from the trip and the few buses each of us had to ride to get to the airport, we crashed on our beds and slept a couple of hours before waking up. When the clock ticked 7:30 AM, we were already up and running for our first day of the hackathon. The starting time was set to 9 AM and it would take place at IFAL (Federal Institute of Alagoas). According to the event organizers, we were expecting around 400 people to join us and get our hands dirty on Firefox OS apps.

Somehow, something went terribly wrong (I'll give my insight at the end) and, to be honest, we ended up with a room of no more than 15 people. Rafael, who is also a Mozillian and joined us at the event, went to another building, where he had no more than 10 interested developers, aged 15 to 18.

Rafael at SESI/SENAI; Melissa and I at IFAL

Bad estimations apart, the hackathon was set to last two days, starting at 9 AM and ending at 5 PM with a one-hour lunch interval. At first, Mel and I started getting to know the people we were working with, talking with them to understand their reality and motivations. After that, we moved to a more technical conversation to check what each of them was capable of doing. All the planning took around 60 minutes, starting with getting to know each other and ending with “what will your app do / what problem are you trying to fix”.

 We ended up with two different apps at IFAL

App 1
  • Concept
    • They are college students whose classes suffer from frequent room changes; often they don't have classes, or need to share a message with everyone in the same class. Think of it as a Post-it blackboard: you log in to your class and share a message with your classmates.
  • Implementation
    • The app consists of three resources: Users, Classes, Publications. A user can be part of several classes, and publications belong to a single class.
    • Front end made with jQuery and a few other libs
    • Back end developed by me using Django and Django REST Framework to provide an API to which they could consume data, thus speeding up the development time.
  • Links
    • Soon to come, they are still on a private repo on github.
App 2
  • Concept
    • Everybody wants to know what is happening in the city. This app aggregates information about events happening in the same city, providing a way to subscribe to them, find more info and contact organizers.
  • Implementation
    • The app consists of a single resource: Event.
    • Front end made with Angular.js
    • Back end developed by me using Django and Django REST Framework to provide an API to which they could consume data, thus speeding up the development time.
  • Links
    • Soon to come, they are still on a private repo on github.

I know we had two more apps, but I can’t go any further deep about them since I haven’t got to see them in detail.

Hackathon winners

Front in, the event

On Saturday the 15th, the most intense part of it happened. We were gathered at a huge area properly built for events (despite the lack of internet and air conditioning outside the main areas [much appreciated when you're dealing with 32°C]). Concomitantly, an event called Nerd Legends was taking place at the same location, which brought a lot of young people to it. It was nice to see all types of people at a tech/gamer event.

The event was planned to start its first talk at 9 AM, and the last presenter was set to go on stage at 7 PM. It was a hell of a marathon for people watching and participating in the event, but a well worth one. Divided into two tracks, learning and coding, attendees had the choice to pick which talk fitted their needs the most.

We from Mozilla had both talks running at the same time, side by side in a large theater-like room. You may think that it was a bad idea to have them happening at the same time, and believe me, we thought so too. However, it all came out well, mainly because we targeted two different crowds. In the learning track, Panaggio and Melissa presented an engagement talk, explaining what it is like to be a Mozillian and how people could join us. They did an amazing job that can already be seen in our community list.

Panaggio and Melissa – Join us!

In the other room, I was talking about Firefox OS and how people could develop for this new platform that Mozilla has put its efforts into over the last couple of years. I was mesmerized, to be honest. My room was packed with people, everyone was laughing at my terrible jokes and my honesty about some of our devices' weaknesses. I felt that I had the crowd following me and enjoying the time we had there together.

I'm kinda sad that it ended so quickly (they only gave me 40 min to talk), mainly because I'm used to having at least one hour. It's a very dense, rich and codey presentation [yet very fun].

Unfortunately nobody bothered taking pics of me, so I won’t have pics to prove it.


Thanks to everyone who attended my talk; I feel special. I'm just a simple guy who does what he loves, and you people make it happen. I'm happy to have met each and every one of you, to exchange a few words and to get to know your reality. Mozilla is all about sharing, making and loving what you do.

Post-Event / Games Night

After all presentations were over, we kept our promise to host a Games Night. Our idea was to get everyone together, cheering for their team, and if they succeeded, they would win a nice prize.

Panaggio and Mel had already organized this activity in other events, so they had pretty much everything already planned and ready to go!

Our first game was a Key-Value Bingo. We would read a concept, and if they had the name of that concept on their board, they would check it. The first team to fill their board scores a point.

The second game was a memory game with a whole bunch of stickers we had. People had to pick two numbers and match the stickers. The team with the most matches scored.

Third game was a Mimic/Draw for your team. We would randomly select a word (Firefox, Browser, Marketplace, Gecko, etc..) and they had to make their team guess what the word was. This was by far the most exciting one.

The last game, to settle the winner of the night, was a copy of a TV show we used to have in Brazil, whose presenter I can impersonate quite accurately. People laughed a lot and had fun, oh they had.

Games Night



Oh dear.. I wish I could remember everyone with whom I talked and had a good time, but I'll do my best.

Igo Lapa – Dude, you saved our lives. Thanks for every ride, for taking us to amazing places and for being such a good host. We loved meeting you and your family. We hope to see you again.

Juarez, Gustavo – Thanks for arranging things up for us, for making us laugh and having a good time.

Mel, Panaggio, Rafael, Konstantina, Guilermo and everyone at Moz – Without you guys, this event would have been way different. Thanks for organizing things on our end [mozilla]. I know how hard and time consuming it is.

People who watched our talks and joined the hackathon – You rock!


See you next time, somewhere in this country. ~~~~~~~<3


Member Acknowledgement of the month

I would like to congratulate Eliezer Bernart (@elizerb) for his events down in the South (with which I helped through a swag request). He was able to put together extremely fun and creative material to talk about cross-platform apps based on Firefox OS. It has ice cream, Power Rangers, monsters and much more. If you want to follow up on his endeavors, you can find blog posts about them here and here. Warning: they're in pt-BR.


by msaad at November 18, 2014 07:05 PM

Kieran Sedgwick

[SPO600] Investigating LAME

Since my first attempt at porting a package culminated in a dead end, I picked up the slack this weekend by investigating the LAME audio codec. This is just a brief overview of my findings so far.


The LAME community seems sporadically active. The official site lists the last release as happening in 2011, but investigating the release notes from the CVS tree shows releases as recent as late 2012. Likewise, the SourceForge issue tracker shows bugs filed as recently as earlier this year, with patches having been submitted earlier this summer.

Having said this, I didn’t find evidence that the core contributors had given LAME much attention for some time. There was an exception to this, so it seems the project does have some life. I’ll be reaching out to the main contributors in the next few days after completing my preliminary research.

ASM code in the Repo

This library relies on assembly in a number of different ways. Where Intel chips are concerned, it takes advantage of NASM where possible to speed up performance. Otherwise, it communicates with the FPU through Intel-specific assembly code in its shared utility file. I’m still investigating, but among other things it disables floating point exceptions.

Intel Compilation

Compiling on Australia was straightforward, as was compiling on the MacBook Pro I use. I plan to profile the library in Tuesday's class.

ARM64 Compilation

As with my previous package, I had to update the configure scripts in order to run them on Red. After running them, the library built normally. I plan to profile it in Tuesday’s class.


Though this package seems to be used mostly “as-is” and to have little need for maintenance, I get the sense that a patch for the ARM64 architecture might be well received. Updates to follow!

by ksedgwick at November 18, 2014 05:20 AM

Ava Dacayo

Release 0.3 – Bug search – catching up on blogging

I attempted to start early on my third release, but I did not expect to get stuck. I wanted to look at other available projects, but I started looking again at webmaker-app. As I was trying to run it, though, I got an Esprima error and wasn't quite sure how to go about it (another classmate blogged about it later on and filed an issue on GitHub). I fetched the newest code from upstream just to make sure my current version wasn't simply out of date, but it still failed. I had cloned from the original repo thinking that version should work, since it's up there on GitHub and I didn't see anyone posting about the same error. Bad mistake.

So I waited, thinking it would get fixed soon. A week passed, still the same, so I decided to switch to Appmaker. I followed the Getting Started section, managed to set up my environment, and then looked for bugs.

Media Player brick always shows double video caught my eye, but I decided to look for another bug first. I couldn't find one, so I took another look at it and managed to reproduce the issue. I had already spent a few hours setting up, looking at this, and trying to understand why it happens, and the due date for release 0.3 was coming, so I thought: what the heck, might as well work on it, right?

Part 2 of bug search will be posted tomorrow…

by eyvadac at November 18, 2014 05:17 AM

November 17, 2014

Andrew Smith

I’m ashamed I wrote this


for L in `cat lang.txt | cut -f 2,3,4,5 -d' ' | sed 's/^.//' | sed 's/.$//' | sort`; do echo -n "$L "; done

More disgusting:

cat lang.txt | sort | awk '{ a=substr($2$3, 2); sub(")$", "", a); print "    \""$1"\", \""a"\", \"The <a href=\x27\x27>OSTD</a>\"," ; }'

It reminds me of when I had to learn perl and all of it looked like this.

I did need the code, and it was throw-away, a one-time (actually two-time) fix for a problem I’ll never encounter again. So why am I ashamed I wrote it? Because I sort of enjoyed it. I never liked people who traded readability for quickness and a low number of lines-of-code, and I hope I won’t become one :)

by Andrew Smith at November 17, 2014 05:36 AM

Hunter Jansen



Written by Hunter Jansen on November 17, 2014

Today's post is a super quick update on the ZeroMQ progress I made for the Linaro Performance challenge.


As I ended off my last post, I mentioned that I'd be working on ZeroMQ next, hoping to end my streak of inadvertently choosing projects that don't actually need porting from x86 to arm64. It turns out that ZeroMQ is pretty widely used, and a very active project with a lot of . . . subprojects as well. Their main site is here, and their github repo is here. Of the projects I've come across, this is probably the one with the most legible docs and the most organized.

Too bad it’s already got an arm64 package.

Getting Started

So the first step I tried (since I've had to go through so many packages by this point) was simply to install via yum.

sudo yum install zeromq

Not TOO surprisingly (though a tad frustratingly), an aarch64-specific package got pulled in on the arm64 machine. Welp, job's done! Right? I decided to go through and double-check that functionality was the same on each system, as far as I could tell. So first I pulled the package in with fedpkg and prepped it. From there I ran the configure and make steps.

fedpkg clone -a zeromq
cd zeromq
fedpkg prep
cd zeromq-4.0.5

All these steps work on both systems. So next was to see if I could try some tests.


Surprisingly to me, there don't seem to be any unit tests for ZeroMQ. Their site does mention performance testing, and there's a folder in the source for tests, but nothing on how to execute a test suite.

Since ZeroMQ is a messaging kernel, in order to write any performance tests I'd have to have it and its testing tools set up on two separate boxes. As a result, I've decided to move on from ZeroMQ to the next package.

It's also probably worth noting that in my travels I found a bug report about ZeroMQ not working on arm64 Fedora from early 2013. That issue has since been closed (as it should be).


SOOOO, next time I'll be looking at Bowtie. Unlike most of the other packages before it on the Linaro list, I chose this one randomly, due to its fun name and not because it sounded interesting. Will Bowtie be the package that breaks the streak of packages already ported or not needing work?

Tune in next time!

November 17, 2014 12:00 AM

Andrew Li

Release 0.3

For release 0.3 I worked on fixing an Appmaker bug in the “Drawing Canvas” component brick, which was not working in the latest Firefox and Chromium browsers. Upon clicking on the brick we should expect to see a line or dot drawn, but what actually happens is that nothing gets drawn.

Before even beginning the debugging process, I had already run into problems with my development environment. After updating my local repo with the latest upstream from Appmaker, I found that I could no longer successfully run node app to get the Webmaker server started. To get it working again I had to remove my local repo, re-clone from my forked origin of Appmaker, and reinstall the node packages with npm install.

My approach to fixing this bug was to figure out what was causing it to break, so I walked through the code using a mix of Firefox's and Chrome's web developer debug tools. The developer tools were something I was always aware of but never actually took advantage of. After seeing them in action during one of our DPS909 lectures, they were demystified for me; it turns out they are easy to use and an indispensable part of the debugging process. The debugger provides a wealth of detailed information, which is especially helpful for someone looking at a new code base who wants to figure out what the code is doing. For the first two releases I was walking through the code in Sublime and writing alert statements to figure out what was happening. The debugger definitely saves time: hovering over variables while stepping through lines of code lets you see their values. From that moment on I said goodbye to my old alert ways.

While playing around with the “drawing canvas” brick in debug mode, I found that if the Read Only checkbox on the brick was toggled on and off, the brick would start working again. Putting a breakpoint on that line, I was able to check the logic using Chromium's debug tool and see the value of the readonly variable (which tracks whether the canvas allows editing): it was set to the empty string ''. The drawing canvas component sets readonly to false by default, which is what we expect. On initial load of the Webmaker app, before the user even adds the drawing canvas, readonly is indeed initialized to false, as observed in the debugger. What we don't expect is that readonly gets replaced with the empty string "" after the drawing canvas is added by the user, also as observed in the debugger. Because readonly is reset to "", the else branch gets triggered, and that branch sets the canvas to read-only mode. In the end, the fix I went with was to ensure the else branch only runs when readonly is strictly true, which stops it from being triggered when the variable is empty. That made the drawing canvas do what we expect.

After testing in Firefox and Chromium, I went over Appmaker’s submitting a pull request readme and followed their procedures for preparing the patch.

Here is the pull request #2359.

November 17, 2014 12:00 AM

November 16, 2014

Edwin Lum

More on porting a package, actually more like a rant, and a wall of text

As it turns out, the continuation of this adventure in porting and optimizing open source software packages is quite eventful and surprising at the same time. That was even more apparent after Tuesday's SPO600 class this week, in which everyone gave a short presentation on the progress of the package(s) they are working on. One thing is for sure: the range of problems run into, the state of the ports, and the sheer number of packages that some people have managed to mark as complete were drastically different. It was astonishing, yet it made a lot of sense considering the kind of diversity open source can bring to a project. Some of us have already marked a few packages as complete, but most of us are still working on our first.

Of the projects currently in progress in our class, we have examples where there was a lot of platform-specific inlined assembly for atomic instructions, for which one of my colleagues was able to find GCC equivalents (making them no longer platform specific), convert them, test them, and have them all work perfectly. On the other hand, there was an example where a package had changed the repository for its upstream, and half its code base, without any announcement in the now out-of-date repository that still receives contributions; and there are upstreams that are non-responsive altogether.

Another core component of porting software over is to benchmark, profile, and run enough tests to prove that the new solution is viable. Sometimes this gets quite tricky. Some packages, if we are lucky, come with a test suite; we run the tests and everything passes. Some have test suites where not all the tests are expected to pass, and some don't have a test suite at all! Creativity comes in when solving these problems. For example, Chris wasn't going to write an entire garbage collector to test a package that provides tools for writing garbage collectors, but he could find an existing garbage collector that uses the library as a test case.

As for how my package, eigen3, stacks up compared to my colleagues', I ran into a few interesting issues. At first there was quite a bit of learning involved, as I was not at all familiar with cmake, and although I knew gcc compiles both C and C++, I did not know that I had to set the CXX variable to gcc-c++ (g++) for it to work properly.

Moreover, the inline assembly present in my package did not have C fallbacks; it did things like change how data is exchanged depending on the cache size. As such, turning it off and comparing against a fallback was not an option.

With some luck, a test suite is included in the package. After I got eigen3 to build on aarch64, the goal seemed a lot closer, especially with a test suite in sight. However, as life is full of fun, it turns out this is one of those suites where not all the tests were expected to pass. That is still fine right? We can always compare to x86….. not quite T_T.

In my test suite, some tests that failed on x86 would pass on aarch64, and vice versa; it was really somewhat mind-boggling. Luckily, the passing tests were in the 98% area of around 650 tests. Looking at some of the failures, I decided to make some judgment calls: whether those tests seemed important, whether the faults looked like they were due to the different architectures (some were segmentation faults), and whether I knew enough in the first place to investigate further.

Of the failed tests, one seemed really important, and it failed on aarch64: it was called dynalloc. Oh no, yeahhh, dynamic allocation of memory seems pretty important, right?

Upon narrowing down to one test to look at, the offending line suggested the alignment was off. Chris Tyler mentioned he had encountered something similar in his project, and it boiled down to setting the alignment value lower, because it is used as a modulus for the most part: due to how the memory was aligned, if 16-byte alignment failed, 8 would work on aarch64, and surely 1 would work as a last resort.

To my dismay, both values of 8 and 1 failed. I was pretty stuck, with so many mysteries unsolved (most tests passing, and some not; tests failing on x86 passing on aarch64, fixes that should work and not), it was time to seek help.

After my status report during class, it was such a sigh of relief when Chris noticed a comment a few lines further down the page that said “check with valgrind”. I had no idea what that was, but upon investigation Chris told me this was probably the cause of my problems: Valgrind, which is used to test for and find memory bugs, did not have aarch64 support. As such, there was no way the test could have been checking dynamic allocation of memory properly, and it is no wonder that it failed for me.

Such a series of events surely felt quite unique, and it shows one of the frustrations of not knowing enough to get the job done. Don't I wish I knew what valgrind was? Why can't I just know everything :) But I guess there are always going to be things that only come with experience, and the only way to get that is to keep learning.

One thing I have experienced and enjoyed a lot so far, working with these open source projects in SPO600, is that it is by far my most exciting course. Most other courses do not change with each offering, running the same tried-and-true labs and exercises in the curriculum. In this one we get so much diversity, and problems that have not necessarily been solved before. We are not expected to come up with a solution to every problem, but in the process of looking for one we learn many new things. This is one of the most frequently recurring themes I see in this course, and in open source, and I am starting to really appreciate it.

by pyourk at November 16, 2014 08:14 PM

November 15, 2014

Yasmin Benatti

Release 0.3 Translation for Mozilla’s Projects

My third release is kind of a continuation of the second one. I apologize for the previous post being so short; I'll explain more in this one and show some screenshots to illustrate what I'm saying.

I'm doing translation for Mozilla, using a platform called Transifex. It is a web site where you can join projects and teams, and post or suggest translations for sentences. The picture below gives an overall view of how it works. I was working on the Appmaker project, translating from English to Brazilian Portuguese. This last month I worked on the Mozilla 2014 EOY Campaign project. Something interesting is that I can use the web site in the language I feel most comfortable with, in this case Portuguese (it also helps me keep consistency with the way Mozilla translates its web sites).

App Maker


Now I'm able not only to translate but also to review translations. It is a bit of work, because I'm trying to make it look as if it was written by one person, and like other Brazilian web sites. Things I see a lot are literal translations, or translations that look like Portuguese from Portugal. You can see this in the picture below. Some of the words need to be not just translated, but adjusted to the context of the country.

Adjusting Translation

Other examples of this are “First Name” and “Last Name”, or verb conjugations (they can get kind of complex in Portuguese, since there are lots of rules and lots of exceptions to the rules).

On the platform there is a dashboard where you can follow the progress of the projects over a period of time. The first picture below shows how much translation was done on the Mozilla-2014-EOY-Campaign project for each language. The second picture shows how much translation I did for the same project.

Mozilla EOY Campaign All Languages


Mozilla EOY Campaign My Translations


That was basically it for this release. I also went back to other projects to review what was missing. This is the kind of work that I won't stop doing just because I reached the due date for the class. I'm still looking every week for changes that were made, and I'm trying to keep up to date with the new projects that are posted (like the EOY Campaign).


by yasminbenatti at November 15, 2014 05:11 PM

Hunter Jansen

JS Coverage

JS Coverage

Written by Hunter Jansen on November 15, 2014

Last time I discussed the difficulties I was facing in finding a project to actually work on for the Linaro Performance challenge. Every package I’d decided to work on was either already done in one manner or another, or had drastically changed from its existing version. This post covers my next investigation: JSCoverage, a code coverage tool for JavaScript.


When I ended my previous entry, I had only just barely started my investigation into JSCoverage. Following that, I decided to look for the project’s repository so that I could see what would be needed to bring it to Fedora on both x86 and arm64. I actually had a really difficult time finding any form of repo for JSCoverage and ended up asking my prof, Chris Tyler, for a bit of help.

Without too much difficulty he found this website (In hindsight, I’m not so sure why I was having issues finding it). In the notification banner at the top, it’s stated that any future work will be done in the JSCover project here. From there I went to their github repo (here) and the first thing I noticed was that it said the project was built with Intellij IDEA, which is a popular Java IDE.

I immediately checked with GitHub’s language statistics tool and discovered that the program was 61% Java and 37% JavaScript. So it looks like I’ve picked another dud, since Java is supposed to be a write-once, run-anywhere language :(. However, I decided to continue on and ensure that the project works on both architectures. I figured I’d post the steps I took to get this done.

Getting Started

The first step is to clone the repo to the machines. Note: I used exactly the same commands on both machines.

git clone

Following that, I changed to the JSCover directory and did an ls; there were a bunch of xml files and a few references to maven, which typically means an ant build. Having an ant build is pretty common for a java project and since the repo tells me to run an ant command, I first needed to install ant on the systems:

sudo yum install ant

And, since the ant build runs junit tests, it’s also required to install ant-junit:

sudo yum install ant-junit

Following that, all the dependencies should be installed, so we run the project’s tests via:

ant pre-commit

This then executes all the tests to make sure you haven’t broken anything before committing. The x86 system completes in about 2:27 and the arm system takes about 3:27. Normally the discrepancy between the two times would concern me a bit, however the rest of the class has found similar results with their projects where our arm machine runs slower than the x86 machine. Because of this, I’m not too concerned as the difference should be due to the machine differences.


I’m checking with my prof about how exactly to go about dealing with this on the Linaro site, as it has a different project name and is arguably an entirely different project than the one it started off as.

The next project I’m investigating is called ZeroMQ, it’s a lightweight messaging kernel with optional asm for timer access. I’ve yet to actually start investigating, but as soon as I get going on it, I’ll be sure to post about it here.

Until Next time

November 15, 2014 12:00 AM

November 14, 2014

Shuming Lin

Open Source: {less}

On Monday, one of our classmates introduced an open source project, {less}, in OSD class. After his presentation, I was pretty interested in Less.

So what is “Less”?

Less is a CSS pre-processor, meaning that it extends the CSS language, adding features such as variables, mixins, functions and many other techniques that let you write CSS that is more maintainable, themable and extendable. Less runs inside Node. Many JavaScript projects use Less for UI design, such as Brackets, Webmaker and other open source projects. It is also pretty easy to use: Less can be used on the command line via npm, downloaded as a script file for the browser, or used in a wide variety of third-party tools. Creating a Less file is the same as creating a CSS file (the file extension is .less).

Less was designed by Alexis Sellier and developed by Alexis Sellier and Dmitry Fadeyev. Less is open source, licensed under the Apache License 2.0.

Less has some very good features, such as “Nested Rules”, “Variables”, “Operations” and others.

1. Nested Rules

Less gives you the ability to use nesting instead of, or in combination with cascading.



#header { color: black; }

#header .navigation { font-size: 12px; }

#header .logo { width: 300px; }

In Less, the same rules can be written with nesting:

#header {
  color: black;
  .navigation { font-size: 12px; }
  .logo { width: 300px; }
}


Less handles nesting much better than plain CSS.

2. Variables

These are pretty self-explanatory:

@nice-blue: #5B83AD;
@light-blue: @nice-blue + #111;

#header {
  color: @light-blue;
}

This compiles to:

#header {
  color: #6c94be;
}

Note that variables are actually “constants” in that they can only be defined once.

3. Operations

Any number, color or variable can be operated on. Here are a couple of examples:

@base: 5%;
@filler: @base * 2;
@other: @base + @filler;

color: #888 / 4;
background-color: @base-color + #111;
height: 100% / 2 + @filler;

The output is pretty much what you expect—Less understands the difference between colors and units. If a unit is used in an operation, like in:

@var: 1px + 5;

Less will use that unit for the final output—6px in this case.
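Mixins, one of the features listed earlier but not shown above, are worth a quick sketch too (the class names here are hypothetical):

```less
.bordered {
  border: 1px solid black;
  border-radius: 4px;
}

#menu a {
  color: #111;
  .bordered();  // pulls every property of .bordered into this rule
}
```

This lets you define a bundle of properties once and reuse it anywhere, the same way a function is reused in other languages.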

Less is a great little tool that extends CSS with the addition of variables, operations, nested rules, etc., which means you can write leaner code very quickly. Less is not a bad choice to use. Want to know more about Less? Click here.



by Kevin at November 14, 2014 07:58 PM

Yoav Gurevich

0.3 Milestone Completed for the Price of Potentially Digging My Own Grave

Link to the PR:

As stated earlier, landing on this issue was a bit of a fluke. Well, since most of the bugs are kind of flukey in nature, this might be more of a fluke of a fluke. As mentioned previously, Mozillian Scott Downe, who I had hoped would assist me with a bug of his that I initially wanted to work on, was away this week, so within my time constraints I needed to quickly switch over to something reasonable, which ended up being this issue.

If you take a peek at the conversation about this, you can see my iterative approach to finding an elegant solution to this problem. Currently, I have added a menu item to the top-right initialization context bar that, on a click event, opens a new window directing the user to the app template gallery to remix an app.

Implementing the logic for this was fairly simple and streamlined, since I had the original 'New App' menu item logic to follow, and could parrot its data flow to a great extent. The Appmaker convention here seems to follow an approach that involves an event listener attached to a jQuery-connected ID field in the HTML, with the logic embedded in the same file. The parent init function exposes the scope of an object in a file, both called 'application', that exports a whole bunch of functionality - most important of which was the functionality of the menu item I was trying to emulate:

// application.js, line 36:
    newApp: function(){
      var app = document.querySelector("ceci-app");
      var parent = app.parentNode;
      // TODO
      // we have to decouple appidChanged and initFirebase
      app.setAttribute("appid", "ceci-app-"+uuid());
      history.pushState({}, "", location.origin);

I didn't need all this fancy functionality yet either; so far, the intent was just to give the user a link to the template gallery. So a simple handler on an href should (and did) do the trick:

  templateApp: function(){
    // Open the link in a new tab so as to not interrupt the current instance's workflow

The module.export already takes this entire function list wholesale as part of the application scope, so simply adding this function to the list was enough to be able to invoke it on my event listener back in the html file.

The first and last part was essentially just adding the element to the DOM, which for the most part meant duplicating the code used for the initial menu item. Funnily enough, the most convoluted part of this was adding the text label to my new menu item. Appmaker uses what I believe to be some sort of Angular.js localization directive - l10n - which I didn't know worked in a key-value-pair fashion. This is where parroting code unfortunately stopped being helpful ;)

The directive is appended to your element like so: {{'your label here' | l10n}}

If your 'label' pair isn't listed in the msg.json file in the language locale directory, it doesn't simply slap the string onto the element, it actually doesn't do anything with it, and therefore your label doesn't show up at all. Documentation and examples on the web were hard to come by for me, so some last minute pointing in the right direction from David Humphrey (current professor, former project lead) cleared this issue up in a hurry.

The title for this post alludes to the hope that I don't bury myself under the weight of my own ambition for this bugfix for the final course's milestone. The stage is officially set for my last hurrah and the opportunity to call it my Appmaker's "Magnum Opus" addition.

by Yoav Gurevich ( at November 14, 2014 06:43 PM

Kieran Sedgwick

[OSD600] Release 0.3

Like my 0.2 release, my 0.3 release is a number of separate patches, also related to fixing Filerjs bugs. This includes a few new bugs, but also the follow-up work to the bugs I tackled in 0.2.

FileSystemShell constructor (Issue #245)

This went back and forth, but was largely untouched until it was merged a few days ago. linky

Bundle filer-test.js with gh-pages only (Issue #312)

This issue continued to evolve, mainly because it solved a secondary problem which had to be factored out of this patch and into another one (below). After revising it, it landed early this week.

See it here: linky

Prevent grunt publish from destroying gh-pages (Issue #208)

This bug came out of my work on the build system. I had (in confusion) set up the grunt publish task to force an overwrite of the gh-pages branch with the contents of develop. This was problematic!

It was important to leave the gh-pages branch alone, except for keeping the browserified test suite up to date. This presented a challenge: How to generate a file on a branch where the file isn’t tracked, and then commit that file on another branch where it is?

The answer was leveraging git’s stash functionality. A second problem I encountered was ensuring that the publish process finished successfully, even if the tests hadn’t changed. The new build process relies on a new commit being created on gh-pages for every release, based on a new test file. I decided it was easier to arbitrarily modify the file to force a difference rather than to try and detect if the tests had actually changed.

I found a grunt plugin that allowed me to add a banner to the file, ensuring it would be unique and allow the publish process to complete itself.

See the PR here: linky

As a part of this, I had to add an entire feature to the grunt-git module, similar to the last release. See that PR here.

Unlike last time, the owners of the repo haven’t merged the changes as of the time of this posting, so I filed a follow-up bug to point filer’s package.json at their repo when the merge does happen. Linky

Update references to fs.Shell in the docs (Issue #323)

In the back and forth of the FileSystemConstructor bug, the documentation got left behind. This patch corrected this. Linky

by ksedgwick at November 14, 2014 05:45 PM

Ryan Dang

0.3 Release

So a week ago, I decided to start working on my 0.3 release by choosing one of the existing bugs on the Webmaker app repo. Since I hadn’t had time to work on any new issues for a while, I needed to pull down the latest updates and run npm install on my machine. After everything was up to date, I tried to run the app on my local machine and ran into some weird errors relating to esprima — or so I thought initially. The error messages were not really helpful; I spent hours trying to figure out if there was anything wrong on my end and also tried to debug it, but with no luck. I tried to run it on both my laptop, which runs Windows 8.1, and my PC, which runs Windows 7. Feeling like I was heading into a dead end, I decided to ask about the error on the #appmaker IRC chat. After I discussed it with Pomax and k88hudson, they figured out that there was an issue with gulp-browserify. The issue is filed on their repo: Gulp dev throws errors related to esprima. Apparently, the issue was caused by gulp-browserify using outdated versions of browserify and browserify-shim. Pomax filed an issue on the gulp-browserify repo asking them to update their dependencies: please update browserify. You’re requiring 3.x, the current version is 6.x. It turned out the gulp-browserify project is discontinued and is blacklisted by many open source projects. k88hudson later fixed the issue by replacing gulp-browserify with plain browserify. However, after the issue was fixed, I was left with little time to really work on anything, as I work two days a week and have school for the rest of it.

Luckily, I had a spare pull request that I had done a while back: [#266] Fix issue with templates on Firefox. I found this issue after playing with the system and noticing a problem with displaying the templates on Firefox. I filed the issue, Templates don’t display properly on Firefox, on their repo and spent some time playing around with the CSS to get the templates to display correctly.

by byebyebyezzz at November 14, 2014 02:47 AM

November 12, 2014

Gabriel Castro

Counting to 30 on AArch64 with Assembly

Counting to 30 on AArch64 with Assembly

Today I will be comparing a simple loop written in AArch64 assembly to one written in x86_64 assembly.

In a previous post I wrote about making a loop work in x86_64 asm, the final source is on GitHub.

This post will focus more on the differences between the two architectures than on writing the program again, because the overall structure of the program remains the same.

The full source of the program is

.globl _start

start = 0 /* starting value for the loop index */
max = 31 /* loop exits when the index hits
this number (loop condition is i<max) */

_start:
mov x20,start /* loop index */

loop:
# skip 0 to avoid div 0
cmp x20,0
beq print

# put x20/10 => x21 && x20%10 => x22
mov x10,10
udiv x21,x20,x10
msub x22,x10,x21,x20

# add '0' to both characters
add x21,x21,'0'
add x22,x22,'0'

cmp x21,'0'
bne not_suppress_zero
mov x21,' '
not_suppress_zero:

# move the characters into the string
adr x23,msg
strb w21,[x23,6]
strb w22,[x23,7]

print:
# call print
mov x0, 1
adr x1, msg
mov x2, len

mov x8, 64 /* write is syscall 64 */
svc 0

add x20,x20,1 /* increment index */
cmp x20,max /* see if we're done */
bne loop /* loop if we're not */

mov x0,0 /* exit status */
mov x8,93 /* syscall sys_exit */
svc 0

.section .data
msg: .ascii "Loop: 0\n"
len = . - msg

and on GitHub

Some Key Differences from x86_64.

  • Register names are consistent x0 to x30 in AArch64
  • syscall is replaced with svc 0

Other differences are just as important.


udiv and msub instructions

AArch64 has an instruction called udiv for unsigned integer division. Unlike x86’s div, it does not require the use of specific registers for input or output; it just takes arguments for all three. It also does not automatically calculate the remainder. To compensate for that, we use the msub instruction afterwards to calculate the remainder.
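Concretely, msub x22,x10,x21,x20 computes x22 = x20 - (x10 × x21), which is exactly the remainder once x21 holds the quotient. A quick Python sketch of the same arithmetic (variable names stand in for the registers):

```python
n, d = 27, 10   # x20 (the loop index) and x10 (the divisor)

q = n // d      # udiv x21,x20,x10   -> quotient into x21
r = n - d * q   # msub x22,x10,x21,x20 -> x22 = x20 - x10*x21

# r is exactly the remainder that x86's div would have left in rdx
print(q, r)     # -> 2 7
```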


add instruction

Just like division, addition does not require specific registers, and it has the advantage of letting us specify the two input registers and the output register. Also note that AArch64 doesn’t have an increment instruction; we just use add instead.

adr instruction

This instruction is different from x86; it is used to load an address into a register.

strb instruction

To modify the string data in x86 we had to load the address of the string plus the offset for the character, then store the first digit, increment the address, and store the second digit. On AArch64 it’s a bit simpler, because the store-byte instruction strb lets you give it both an address and an offset from it.

by Gabriel Castro ( at November 12, 2014 08:23 PM

Counting from 0 to 30 in x86_64 assembly

Counting from 0 to 30 in x86_64 assembly

This post will be a tutorial on how to count from 0 to 30 using x86_64 assembly; the counterpart post on how to do it for AArch64 is coming soon.

Start off with a simple program that runs the loop but does nothing:

.globl _start

# where to start the loop
start = 0
max = 31

_start:
# put loop index in r15
mov $start,%r15

loop:
# empty loop body

# increment index
inc %r15
# compare
cmp $max,%r15
# go back to the top if it's not done
jne loop

# exit 0
mov $0,%rdi
mov $60,%rax
syscall

Result (on Github)

This code takes the value of start, puts it in r15, then increments it, checks if it’s equal to max, and if not jumps to loop: to run again.

NOTE: The first iteration of the loop and one increment always run, so if start == max the loop will run until the index overflows and comes all the way around.

Let’s fill the loop

First off add a data section to our code that has the string we want to print by adding the following to the bottom of the file

.section .data
msg: .ascii "Loop: 0\n"
len = . - msg
dig = msg + 6

Result (on Github)

msg: contains the address of the string we want to print

len contains the length of the string, and dig is the address of the tens digit inside it (msg + 6)

Now the loop body needs to print the string, which is syscall 0x01 (write): it takes a file descriptor, the address of the string, and the length. That means we need the following in the body of the loop:

print:
# fd 1 = stdout
mov $1,%rdi
# address of the string
mov $msg,%rsi
# length of the string
mov $len,%rdx
# write is syscall 1
mov $1,%rax
syscall


Changing What it Says

To change what the program prints, we will modify the memory at .data’s msg: directly. For most programs written in a higher-level language, modifying .data directly is a very bad idea and shouldn’t be done — the string should be copied to the heap and modified there — but we’re already writing in assembler, so we can get away with unsafe code.

Here is the rest of the loop body

    # zero is a special case, we can't divide by zero
# but the string data already contains "Loop 0\n"
# so skip right to printing it
cmp $0,%r15
je print

# put n/10 in r14 && n%10 into r13
# integer division requires
# rdx = 0
# rax = dividend
# <any> = divisor, we'll use r12
mov $0,%rdx
mov %r15,%rax
mov $10,%r12
div %r12

# div gives us the result in rax and the modulus in rdx
# this is perfect for us because we actually need both of them
mov %rax,%r14
mov %rdx,%r13

# add "0" to both r14 and r13
# this converts single digit integer numbers to
# their acsii character values
add $'0',%r13
add $'0',%r14

# if the first digit is a '0' we'll replace it with a space
cmp $'0',%r14
jne not_zero
mov $' ',%r14
not_zero:

# move the address for the first character into r12
mov $dig,%r12
# store the first character at the address in r12
mov %r14b,(%r12)
# move the address to the next character
inc %r12
# store the second character
mov %r13b,(%r12)

Result (on Github)
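The loop body above can be mirrored in a short Python sketch (a hypothetical model, not part of the original program) showing the divide-by-ten, ASCII conversion, and leading-zero suppression steps; note that replacing a leading '0' with a space gives single digits a blank tens column:

```python
def render(n: int) -> str:
    """Model the loop body: two-digit ASCII conversion with zero suppression."""
    if n == 0:
        # zero skips the division entirely (we can't divide by zero)
        # and prints the string as initialized in .data
        return "Loop: 0"
    tens, ones = divmod(n, 10)      # div leaves quotient in rax, remainder in rdx
    tens_ch = chr(ord('0') + tens)  # add '0' to turn a digit into its ASCII char
    ones_ch = chr(ord('0') + ones)
    if tens_ch == '0':              # replace a leading '0' with a space
        tens_ch = ' '
    return "Loop: " + tens_ch + ones_ch

for i in range(31):
    print(render(i))
```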

The output, once we compile the program, is as expected:

Loop:  0
Loop: 1
Loop: 2
Loop: 3
Loop: 4
Loop: 5
Loop: 6
Loop: 7
Loop: 8
Loop: 9
Loop: 10
Loop: 11
Loop: 12
Loop: 13
Loop: 14
Loop: 15
Loop: 16
Loop: 17
Loop: 18
Loop: 19
Loop: 20
Loop: 21
Loop: 22
Loop: 23
Loop: 24
Loop: 25
Loop: 26
Loop: 27
Loop: 28
Loop: 29
Loop: 30

by Gabriel Castro ( at November 12, 2014 07:40 PM

Ava Dacayo

I apparently can’t spell github

As I was about to look for issues yesterday for the nth time, I misspelled the address and this showed up on my screen. The internet never fails to amuse me! :)

(I’ll post my bug hunting adventure next time. – Will be working on issue #2353 Media Brick always shows double video)


by eyvadac at November 12, 2014 03:27 PM

November 11, 2014

Rick Eyre

Lessons on Best Practices from Mozilla

One of the ways we facilitate improvement in development processes and the like at EventMobi is by having lunch and learns where someone will present on something interesting. Sometimes that's a cool technology they've used or a good development practice they've discovered or have had experience with. To that end I gave a presentation on some of the lessons I've learnt on best practices while working in the Mozilla open-source community.

Although many of these best practices may seem like no-brainers to seasoned developers, I still hear way too many horror stories through the grapevine about software being built under seriously bad conditions. So, without further ado.

Code Ownership

One of the things I think Mozilla does really well is the idea of code ownership. This essentially means identifying those who have a level of knowledge about a particular area or module of code and thrusting upon them the responsibility to oversee it. From a practical point of view this means answering questions about the module that others have and also reviewing all of the changes that are being made to the module in order to ensure that they are sane and fit into the larger architecture of the module as a whole.

Mozilla does this really well by having clear definitions of what code ownership means, who code owners are, and who, in the owner’s absence, can make decisions about that module.

The key part of this setup, in my opinion, is that it makes clear what the requirements to become a code owner are and what the responsibilities of a code owner are. Too often I feel like, as with other things, if these aren't formalized they become subject to, well, subjectivity — and the details of code ownership and responsibility get lost in translation and hence, not enacted.

Bottom line: formalizing your code ownership policies and processes is a foundation for success. Without that it becomes unclear even who to ask to review code. Is it the person in the blame? Possibly. Maybe not. Maybe that person didn't make the relevant changes, or made changes under the guidance of someone else. Maybe the code you're changing has never even had a true 'owner'. No one knows the big picture for that piece of code, and so no one knows enough about it to make informed decisions. That's a problem. If a code owner had been designated when that code was written, it never would have been an issue.


Testing

We all know testing is a must for any sane development process. What I've learned through Mozilla is that having an insane amount of tests is okay. And honestly, it's preferable to just enough, as just enough is hard to gauge. The more tests I have in general, the more confident I feel in my code. Having a gauntlet of tests for your code makes it that much stronger.

Not only that, but it's important to stay on top of tests. As a rule, don't accept code before it has a test, and add regression tests for bugs that are fixed. In exceptional cases code can be merged without tests, but it should be tracked so that tests for it will be added later. Ten minutes spent writing a test now could save hours of developer time tracking down bugs in the future.

Saying No

This is one of my personal favourites, and I think it's especially relevant in companies, which are, in my experience, more driven to say yes—in order to hit that extra profit margin, to please that extra customer, to do whatever—as opposed to open-source projects, which are able to say no because most of the time they have no deadline. They're driven by the desire to build that next cool thing, to build it well, and to do it in a sane way.

From my experience working in Mozilla's and other open-source communities, I've found it's important to say no when a feature isn't ready, when it's not good enough yet, when it needs an extra test, when it's not necessary, or when you just ain't got no time for that. I do think, however, that it's sometimes hard to say no while working under the constraints of a profit-driven process. There is a healthy balance one can achieve, though. We have to strive to achieve this zen-like state.

Managing Technical Debt

One of the main ways I've seen this done is by creating tickets for everything. See something off about the code? Log a ticket. See something in need of refactoring? Log a ticket. See something that needs a test? Log a ticket. Any piece of work that you think needs to get done, any open question that needs to be answered about the state of the code: log a ticket for it.

Logging a ticket for a problem gives it visibility and documents it somewhere. This enables a few things. It enables communication about the state of your code base across your team and makes that information easily accessible, as it's documented in your tracking system. It also puts the problems that are not necessarily bugs—your stinky, ugly, untested code, or otherwise—in your team's face all the time. They're not just getting swept under the rug and not being paid attention to. It forces your team to deal with them and be aware of them.

The key part of this strategy then becomes managing all these tickets and attempting to understand what they are telling you. One of the ways you can do this is by doing regular triages. This means going through the open tickets and getting an idea of what state your code is in and prioritizing how to go about fixing it. This is key as it turns the information that your team has been generating into something actionable and something that you can learn from.

by Rick Eyre - ( at November 11, 2014 03:30 PM

November 10, 2014

Jordan Theriault

{less} Case Study: Why Less is Releasing CSS From its Cage

Authoring in traditional CSS3 feels like being a lion trapped in a cage. Among the many extensive and useful programming languages available for web development, styling the web has consistently been a difficult, if not often frustrating, task. CSS is devoid of many popular, useful, and frankly logical code paradigms that we are already used to. Throughout three iterations, CSS has remained largely a game of playing with inheritance and mind-numbing repetition.

Enter less.js, now a staple of many popular web projects. Less is a CSS pre-processor that extends the CSS language by adding many common programming paradigms. Essentially, Less transforms CSS into the full-fledged language it should be.


Client vs. Server

Less can be run on both the client and server side. This is an incredible ability to have, and a fairly new concept in the programming world. However, client-side Less should be used with great care, despite being the most common way to learn it. Putting the extra processing burden on the client is not recommended; that method is best reserved for development. Ensure that native CSS files, compiled from your .less files, are used in the production copy of a project. To make this easy for production builds, you can include a Less step in your task automation. Using a tool like Grunt to automate the processing is easy, and is now commonplace for developers.

However, for small projects or development you can include less.js within your header, along with your .less files, to unleash its potential quickly and easily.

If you’re not using {less}, you’re creating more work for yourself.

Added features include:

  • Variables
  • Mixins
  • Nested rules
  • Media query bubbling / Nested media queries
  • Operations
  • Functions
  • Namespaces and accessors
  • Scope
  • Comments
  • Importing

Simple Usage

To use Less, you simply need the less.js file and a JavaScript-enabled environment. Less files have the extension .less rather than .css, and are referenced in the header.

To begin, include these lines in the head of your html file

<link rel="stylesheet/less" type="text/css" href="style.less" />
<script src="less.js"></script>

Now in your aptly-named style.less file, you can start to program using less.js

Variables are as simple as

 @variablename: value;

For example, within style.less to create a variable for the colour of my text within the element “textbox”:

@crazy-yellow: #FFFF66;
#textbox {
      color: @crazy-yellow;
}

When my page loads, it will look identical to writing:

#textbox {
      color: #FFFF66;
}

Of Less’s many features, variables alone are extremely important. They can make an entire layout themable (changing the colours of the layout easily via variable overrides), can make authoring and tracking palettes simple, and in short will greatly reduce the number of times you have to hunt for syntax mistakes.
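The theming idea can be sketched like this (file name hypothetical): import the base sheet, then override its variables — when compiling, less.js uses the last definition of a variable it sees.

```less
// theme-dark.less (hypothetical file): re-theme without touching style.less
@import "style.less";

@crazy-yellow: #3366FF; // overrides the base palette; every rule that
                        // references @crazy-yellow picks up the new value
```

One small override file restyles the whole layout.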

This is just one implication of using Less, and you can already see the benefit it has for both beginner designers and large projects. It is used in many popular projects like Webmaker and Atom.

Less is open source and uses the Apache 2 license, which means you are able to include it in personal, company (both internal and external), and commercial products. Yes, you can even include it in your own package. The only caveat? You must attribute (and rightfully so). So what’s stopping you?

Do yourself a favour, give Less a try.

Find out more about Less, including the docs.

Fork / Star / Contribute on Github.

Download the powerpoint to go along with this case study.

by JordanTheriault at November 10, 2014 06:25 PM

Yoav Gurevich

Unintentional, yet Necessary Bug-Hopping Ensues

Shortly after FSOSS and the end of reading week, I initially requested to work on an Appmaker issue created by senior Mozillian Scott Downe, AKA 'TheCount' (Link to #2338). My mistake from the get-go was not immediately using the comment thread to clarify my lack of context about how to find the code necessary to fix the bug; instead, I left it for a while as life happened for both parties, intending to ask Sir Downe at a more convenient time down the road. As fortune often shines its everloving light on me, the gentleman is now off on some sort of vacation for a week while my deadline for this milestone looms at the end of this week. Instinct dictated that I immediately find a suitable alternative, and I believe I have found a manageable little number that might even serve as incremental growth toward a rather impressive final 0.4 effort (no promises yet, though) - Issue #2348

As usual, updates will soon follow the initial PR at the end of this week.

by Yoav Gurevich ( at November 10, 2014 06:24 PM

Linpei Fan

Less.js – a CSS preprocessor

Today, Jordan Theriault introduced Less.js in the OSD600 class. Less.js has gotten popular recently. It is a CSS pre-processor that can run with JavaScript on the client side. I started to learn and use it when I worked on the Mobile Webmaker project.

Less.js has some advantages:
  • Improved productivity. The purpose of Less.js is to speed up the development of CSS, so using it speeds up your work in ways that can’t be done in normal CSS.
  • Significantly smaller style sheets through good grouping. Using Less.js, .css files can be well grouped and separated into smaller, more meaningful .less files.
  • Ease of use. Less.js has the following features:

  • The ability to define variables
  • Operations and functions (mathematical and operational)
  • Mixins, embedding all the properties of one class into another
  • Nesting
  • Joining of multiple files

Those features make it easy to group things and generate CSS files. Moreover, they make .less files easy for developers to modify, by reducing the redundancy of the code.
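As a quick sketch of several of these features working together (the selector and variable names here are hypothetical):

```less
@base-color: #428bca;            // a variable

.rounded(@radius: 4px) {         // a mixin with a default parameter
  border-radius: @radius;
}

.panel {
  background: @base-color;
  .header {                      // nesting instead of repeating .panel .header
    background: darken(@base-color, 10%);  // a built-in colour function
    .rounded(2px);
  }
}
```

Compiling this produces plain CSS with the variable, mixin and nesting all expanded.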

In the future, I will dig deeper into Less.js as I continue working on the Mobile Webmaker project.


by Lily Fan ( at November 10, 2014 06:02 PM

Ryan Dang

How to submit media file using Angularjs

So previously, I mentioned that I was having a lot of issues with submitting a media file along with the form data. The reason is that I don’t want to use the form’s submit method; I want to use $http to submit all the form data and the media file.

So after looking in to angular-file-upload I was able to incorporate it to our project to accomplish what I want. So I would like to share with you how to get use angular file upload.

Step 1: install angular file upload

You can install angular-file-upload via npm or download it from GitHub.

Step 2: HTML code

<div class="medium-12 columns">
  <input type="file" uploader="uploader" nv-file-select="" multiple>

  <input type="text" ng-model="testdata" />

  <input type="submit" ng-click="submit()" />

  <!-- Preview image -->

  <div ng-repeat="item in uploader.queue">
    <div ng-show="uploader.isHTML5" data-ng-thumb="{ file: item._file, height: 200 }"></div>
  </div>
</div>

Step 3: JS code

Create a new instance of FileUploader:

$scope.uploader = new FileUploader({
    url: "serverside url",
    formData: []
});


Only keep one item in the queue: if there is already an item in the queue, remove it before adding the new one.

This is so we display only one preview image.

$scope.uploader.onAfterAddingFile = function() {
    if ($scope.uploader.queue.length > 1) {
        // drop the older item so only the newly added file remains
        $scope.uploader.removeFromQueue(0);
    }
};


// insert data to send along with the file upload. Make sure the callback takes the item as a parameter

$scope.uploader.onBeforeUploadItem = function(item) {
    item.formData = [{ test: $scope.someVar }];
};

Handle callbacks:

$scope.uploader.onErrorItem = function() {
    // Handle error here
};

$scope.uploader.onSuccessItem = function() {
    // Handle successful upload here
};

Handle submitting. We have to submit the form and the media file separately, because angular-file-upload’s upload method will not work if no media file is selected. For our project, we want the ability to submit the form with or without the media file, so we have a service that handles the form upload and returns a promise.

$scope.submit = function() {
    // "formService" stands in for the form-upload service mentioned above;
    // the actual service name in our project may differ.
    formService.upload($scope.testdata).then(function onSuccess() {
        // we will upload the media file in here
        $scope.uploader.uploadAll();
    });
};


by byebyebyezzz at November 10, 2014 05:59 PM

James Laverty

A change in environment

Hey everyone,

Just a quick update as to my progress/happenings. I just bought a new computer, a nice sleek Lenovo Yoga 2 Pro, pretty exciting! I'm setting it up as a dual boot running Windows 8.1 and Linux; I've decided to do this because hacking is easier in NOT-Windows.

I'm also fixing a little bug in App-Maker, it's a bit of a challenge, but hopefully working with Linux will make it all a bit easier.

As an aside, I've switched from Eclipse to Android Studio and so far, I couldn't be happier. I'm working on an Android app and we put it up on GitHub; the integration was so easy I think I'm in love.

I'll give another update later this week!


James Laverty

by James L ( at November 10, 2014 04:54 PM

Kieran Sedgwick

[SPO600] Investigating a Package: Unrar-Free, and When Open and Closed Collide

I chose two packages to port and optimize in the Linaro challenge, and I’ve finished my investigation of the first one.


Unrar-free is a Free (as in speech) equivalent to the proprietary command-line tool unrar. The man who invented the RAR format, Eugene Roshal, licensed the decompression algorithm under the GPL, and a decompression library (unrar-lib) was built around it by Christian Scheurer and Johannes Winkelmann. With an eye to providing a backend for GNOME GUI applications wishing to decompress RAR files, unrar-free was created to leverage this decompression library.

Officially, the underlying library’s support dried up in 2007. This meant that Free support for decompressing files using RAR v3.0+ algorithms wasn’t, and still isn’t, available. This also severely limited the usefulness of unrar-free, which, in light of more popular compression algorithms and RAR’s progression by two or three major versions, isn’t necessarily the end of the world. Despite that, RAR v2.0 and earlier archives are still supported by unrar-free and unrar-lib, making this package a target for porting to AArch64 architectures.

Preliminary Investigations

The main GNU site showed that the library unrar-lib (the piece of unrar-free that was a candidate for porting) hasn’t had a patch since 2002, with the last recorded activity on the website being late 2007. Support for the project seems non-existent, and I’ve reached out to the community to see if someone can figure out what the state of support is.

In the meantime, I attempted to start profiling the code. Both the command-line tool and the library were written in C, without any dedicated assembly files. According to the code comments, the decompression library itself used inline ASM for the most time-consuming functions. Building the library appeared to be straightforward, with fallback code for every ASM block.

Interestingly, the ASM blocks tended to mirror the structure as well as the function of the fallback code:

register unsigned int N;
 __asm {

    xor eax, eax
    mov eax, BitField                       // N=BitField & 0xFFFE;
    and eax, 0xFFFFFFFE
    mov [N], eax
    mov edx, [Deco]                         // EAX=N, EDX=Deco

          cmp  eax, dword ptr[edx + 8*4 + 4]// if (N<Dec->DecodeLen[8])
          jae  else_G

             cmp  eax, dword ptr[edx + 4*4 + 4]// if (N<Dec->DecodeLen[4])
             jae  else_F
// ...
// Versus

register unsigned int N;
  N=BitField & 0xFFFE;
  if (N<Deco->DecodeLen[8])  {
    if (N<Deco->DecodeLen[4]) {
// ...

I was also impressed with the code comments and their helpfulness at putting the ASM into context. I tried building unrar-free on x86 without any trouble. On aarch64, I had to replace the outdated configure scripts that generated the makefiles so that the new architecture would be recognized in the configuration process. This was the only speedbump, and after that I had the software built and running on both platforms.

This was where I ran into trouble.

Run-time issues

RAR as a format hasn’t gained much traction since the mid-2000s, and that lack of enthusiasm shows here in my struggle to get a RAR file extracted. Notice the lovely error message:

Huh. No kidding!

Some digging showed that I wasn’t the only one having this problem over the last few years. With Chris Tyler’s help I used strace to watch the program’s execution before it failed. I noticed it writing to a debug file, one I hadn’t found mentioned in the documentation, which contained:

I'm like a detective!

This clue could lead me to answers, such as a good place to debug the program and a likely point of failure (the cyclic redundancy check included with the RAR file). But was it worth it?

Considering that I hadn’t heard back from any of the contributors I reached out to as of the time of this post, the widespread difficulty using this software over the years, and the availability of freeware equivalents, I decided to end my journey here.


This was not a waste of time, since the experience of digging through an open-source project’s history with a goal was very valuable. I also learned about some cool tools (the configure script and strace come to mind) which I’ll definitely need to use again in the future.

If my investigation into my secondary package, the LAME audio codec, ends prematurely, I’ll come back to this and continue the porting process.

by ksedgwick at November 10, 2014 05:59 AM

Shuming Lin

The JavaScript Garden Open Source

JavaScript Garden is “a growing collection of documentation about the most quirky parts of the JavaScript programming language. It gives advice to avoid common mistakes and subtle bugs, as well as performance issues and bad practices, that non-expert JavaScript programmers may encounter on their endeavors into the depths of the language”.

JavaScript Garden isn’t aimed at teaching JS noobs the ropes; rather, it’s supposed to refine the understanding of the language for current JavaScript programmers. Its creators, a handful of JavaScript experts, dole out well-organized advice on how devs can “avoid common mistakes, subtle bugs, as well as performance issues and bad practices.”

JavaScript Garden was created by Ivo Wetzel (writing) and Zhang Yi Jiang (design). It’s currently maintained by Tim Ruffles.

Like most open-source projects, JavaScript Garden is released under the MIT license and hosted on GitHub.

Some of the quirky parts that are covered include:

Objects: Object Usage and Properties, The Prototype, “hasOwnProperty”, The “for in” Loop.


// Poisoning
Object.prototype.bar = 1;

var foo = {moo: 2};
for(var i in foo) {
    console.log(i); // prints both bar and moo
}
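The Garden’s recommended defense against a poisoned prototype is to filter the loop with hasOwnProperty. A sketch of that pattern (this snippet is mine, not quoted verbatim from the Garden):

```javascript
// Simulate a poisoned Object.prototype, as in the example above
Object.prototype.bar = 1;

var foo = {moo: 2};
var own = [];
for (var i in foo) {
    // hasOwnProperty skips inherited properties such as "bar"
    if (foo.hasOwnProperty(i)) {
        own.push(i);
    }
}
console.log(own); // only "moo" survives the filter

// Clean up the poisoned prototype
delete Object.prototype.bar;
```

Without the hasOwnProperty check, the loop would also visit the inherited bar property.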

Functions: Function Declarations and Expressions, How “this” Works, Closures and References, The “arguments” Object, Constructors, Scopes and Namespaces.

foo(); // Works because foo was created before this code runs
function foo() {}

Arrays: Array Iteration and Properties, The “Array” Constructor

var list = [1, 2, 3, 4, 5, ...... 100000000];
for(var i = 0, l = list.length; i < l; i++) {
    console.log(list[i]);
}

Types: Equality and comparisons, The “typeof”  operator, The “instanceof” operator and Type casting.

function Foo() {}
function Bar() {}
Bar.prototype = new Foo();

new Bar() instanceof Bar; // true
new Bar() instanceof Foo; // true

// This just sets Bar.prototype to the function object Foo,
// but not to an actual instance of Foo
Bar.prototype = Foo;
new Bar() instanceof Foo; // false
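The equality and typeof quirks listed above can be sketched in a few lines (these particular snippets are mine, not quoted verbatim from the Garden):

```javascript
// == performs type coercion before comparing; === does not
console.log(1 == "1");  // true
console.log(1 === "1"); // false

// typeof quirks the Garden warns about
console.log(typeof null);      // "object" (a long-standing language quirk)
console.log(typeof undefined); // "undefined"

// instanceof does not work on primitive values
console.log("foo" instanceof String); // false, "foo" is a primitive string
```

This is why the Garden recommends always using === and being suspicious of typeof and instanceof results.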

Core: Why not to use eval, “undefined” and “null”, Automatic semicolon insertion


The Garden illustrates automatic semicolon insertion with code that relies on missing semicolons being inserted where the author expects, which they aren’t:

(function(window, undefined) {
    function test(options) {
        log('testing!')

        (options.list || []).forEach(function(i) {

        })

        options.value.test(
            'long string to pass here',
            'and another long string to pass'
        )

        return
        {
            foo: function() {}
        }
    }
    window.test = test

})(window)

(function(window) {
    window.someLibrary = {}

})(window)






by Kevin at November 10, 2014 05:08 AM

Hunter Jansen



Written by Hunter Jansen on November 10, 2014

It’s been about a week since my last post, and I was hoping to have some progress on one of my porting projects to fill this post with, along with future plans. However, things haven’t quite gone that way - so in this post, I’ll explain why that is, what the plan was, and what actually happened.


Last time I talked about how I was planning to work on JSL and Snort to port them from x86 architectures to aarch64. I focused on a couple of spots in JSL that needed looking at and should be updated to work on aarch64 - there wasn’t much, but there was something to do. Since then, I’ve run into some difficulties with the package. After investigating the files, I decided to check out the upstream and discovered that the source was entirely different.

Where the package I was working with was mostly C with some Perl and a Python file or two, the current source was almost entirely in Python. I decided to reach out to the upstream authors to introduce myself and see what the status of the project is, but after more than a week without a response, it was definitely time to move on.


So, my second package I’d decided to work on was snort. It’s not available via fedpkg and it’s not included in rpm due to some complex licensing that the software has. Snort’s a very widely used network intrusion detection software worked on by Cisco, so I was initially surprised that it was even on the list of software that needed updating.

After going through the relatively lengthy process of getting snort and its dependencies up on my x86 system and building it successfully, I looked at the embedded asm in the source: there was a lot. Sifting through it, I saw a bunch of references to iarch64 in the preprocessor defines. So, for fun I decided to go through the same process of installing Snort and its dependencies on aarch64.

After everything finished making and installing, I ran a few snort commands and a super simple ruleset. Lo and behold, as far as I can tell, it works just fine without any changes required on my part. I couldn’t find any test suite to run, but all the basic functionality and a super simple ruleset worked just as expected. Writing a test of my own would be pretty tough - I’d have to set up a network, set up snort on it, and try to intrude upon it from a different machine.
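For context, a "super simple ruleset" can be as little as one line in a local rules file; something along these lines (this rule is illustrative, not necessarily the one used in this test):

```
# Alert on any ICMP traffic (e.g. a ping) crossing the monitored interface
alert icmp any any -> any any (msg:"ICMP packet detected"; sid:1000001; rev:1;)
```

Pointing snort at a rules file containing a rule like this and pinging the machine is enough to confirm that packet capture, rule matching, and alerting all work end to end.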

Neat, that’s a project done on the linaro performance challenge, I guess!


Luckily, I’d chosen fio as my backup project. I’d looked at its source before in class, and there wasn’t too terribly much to update. There was definitely some asm to take a look at, but it looked manageable.

However, since I’d been ‘burned’ so to speak with my previous two projects, I decided to go ahead and for the heck of it try installing fio on my aarch system using yum. Lo and behold, it installed just fine.

So, I decided to check it out and test some fio rules; everything looked like it was working to me, and just as performant as on my x86 system. I ended up testing about four or five rules before deciding it was safe.
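A smoke test of that sort can be driven by a minimal fio job file (the job name, size, and directory here are illustrative, not the exact jobs I ran):

```ini
; run with: fio simple-read.fio
[global]
directory=/tmp
size=4m

[simple-read]
rw=read
ioengine=sync
```

If fio completes the job and reports sane bandwidth and latency numbers on both x86 and aarch64, the basic I/O paths are working.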

I guess that’s another project to say is done on Linaro.


I’m just starting to look at a package on Ubuntu called jsCoverage, which acts as a code coverage tool for JavaScript, much like PHP coverage tools or any other code coverage tool. As of this point, I haven’t yet had much success; I haven’t even found the upstream repository for it, only the Ubuntu repos. Hopefully I’ll make more progress and actually find some code to investigate and hopefully update!

Until Next time

November 10, 2014 12:00 AM