
Mass import products to Magento multi-store setup


So, you are about to import a large number of products into Magento. Well, there are a couple of different approaches available. First, you could use the ancient Data Flow import based on specific profiles. Then, you could use the improved Data Import version, which is still (very) resource-intensive. Lastly, there is magmi, a highly efficient tool to mass import data into Magento and the preferred way to go in almost all cases when dealing with larger sets of product data. This post shows how to mass import products to Magento multi-store setups using magmi.

Export products

First, you need to export existing products as a CSV file. This can be achieved (again) using Data Flow, Data Import or a more elaborate database dump. Let’s assume at this point that you have a CSV file ready (very convenient, I know). In case you don’t, simply run the built-in Data Export function from within the admin backend (specify only those columns you definitely need, to keep the processing to a minimum) or use the default Export All Products Data Flow profile. Edit this CSV to your needs so that you have a working set of products to be imported afterwards. Make sure to retain the required columns so that you don’t run into problems later. Which columns are required depends on the scenario at hand, but rest assured you need at least sku, store and any additional attribute such as name or description.
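To catch a missing required column before you even fire up magmi, a quick sanity check on the CSV helps. Here is a minimal Python sketch; the file name products.csv and the sample rows are just illustrations taken from this post, not part of magmi itself:

```python
import csv

# Sample rows from this post: one line per (sku, store view) combination.
rows = [
    ["sku", "description", "short_description", "name", "store"],
    ["sample-1", "My shiny product for English store", "Shiny product", "Product A EN", "en"],
    ["sample-1", "My shiny product for German store", "Shiny product", "Product A DE", "de"],
]

# Write a UTF-8, comma-separated, fully quoted CSV, matching the sample below.
with open("products.csv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f, quoting=csv.QUOTE_ALL).writerows(rows)

# Sanity check: the header must retain the required columns.
REQUIRED = {"sku", "store"}

def missing_columns(path):
    """Return the set of required columns absent from the CSV header."""
    with open(path, newline="", encoding="utf-8") as f:
        return REQUIRED - set(next(csv.reader(f)))

print(missing_columns("products.csv"))  # empty set means the file is import-ready
```

Run this against your own export before importing; any column name it prints needs to be added back to the file.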

Export data manipulation tools

Make sure to edit your CSV file with a tool that is capable of handling UTF-8 correctly and also supports setting proper delimiters. LibreOffice is a very handy solution for this. You can easily use the Save as functionality to set proper delimiters for magmi to import your edited CSV afterwards. Use commas (,) as column separators and double quotes (") as text delimiters, which is the default setting. Here is a working sample CSV extract with the respective required columns in the first row:

"sku","description","short_description","name","store"
"sample-1","My shiny product for English store","Shiny product","Product A EN","en"
"sample-1","My shiny product for German store","Shiny product","Product A DE","de"

CSV import schema

In order to mass import products for particular store views in a multi-store setup, make sure that the store column is set correctly with the store view code. Have a look at the core_store table or at the admin backend under System / Manage Stores to determine the correct store view code.
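If you have database access, the store view codes can also be read straight from the database. A query along these lines should work against a stock Magento 1 schema (assuming no table prefix is configured):

```sql
SELECT store_id, code, name FROM core_store ORDER BY store_id;
```

The code column is exactly what goes into the store column of your CSV.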

Note: The regular Data Import interface provided by Magento will activate the “Use default” setting on various attributes of the imported/updated products, thus potentially doing something you do not intend! Instead, use magmi and the store column to explicitly set which products to import into which store view.

Note that you can also specify a comma-separated list of store codes in the store column. The fastest and (imho) safest way to mass import products to Magento is magmi, the Magento Mass Importer. Next, we are going to set up magmi and create a profile to mass import products.
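For example, to apply one row to several store views at once, the store column can hold a list of codes; the en/de codes below are just the placeholders used throughout this post:

```csv
"sku","description","short_description","name","store"
"sample-2","Same text for both stores","Shiny product","Product B","en,de"
```

Since the list is wrapped in the text delimiter, the comma inside it does not break the CSV parsing.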

Setup Magmi

First, download and install magmi and protect it from outside access (e.g. via .htaccess). Note that by default, the web interface will be accessible by everyone who knows the corresponding URL! Open the web interface: http://www.your-domain/magmi/web/magmi.php and

  1. Edit the global settings
  2. Add and edit the import profile

Edit global configuration

Set your connection details and Magento version at hand: MAGMI Global configuration

Create an import profile

Next, set up the import profile based on the Default profile and specify CSV as data source: MAGMI Profile configuration

Make sure to enable the Magmi Optimizer as this will speed up the process significantly, especially when dealing with thousands of entries.

Run import

Finally, choose your CSV import file and set the import mode. You can choose between three different modes:

  1. Update existing items only, skip new ones
  2. Create new items, update existing ones
  3. Create new items, skip existing ones

Again, for multi-store setups make sure that the store column contains the correct store view code. The import process should run pretty fast using magmi.

Reindex data

Optionally, you can choose to kick off the reindexing process using magmi (enable the option Magento Magmi Reindexer), run the indexer with the shell script provided by Magento or use the corresponding admin backend option.
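If you go the shell script route, Magento 1 ships an indexer script under shell/ in the Magento root. Run from within a working installation, the calls look roughly like this (the installation path is a placeholder):

```
cd /path/to/magento
php shell/indexer.php info        # list the available indexer codes
php shell/indexer.php reindexall  # rebuild all indexes
```

Reindexing after a large import can take a while, so running it from the shell rather than the admin backend avoids web server timeouts.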


Generating optimized JavaScript Source Code using Dojo’s Build System

The Dojo Toolkit represents one of the most powerful and feature-rich JavaScript libraries to date. Apart from its core functionality such as DOM manipulation (i.e. the Dojo namespace), it provides developers with a vast range of widgets (Dijits, i.e. Dojo widgets), lots of incubator code (DojoX) and numerous utility classes. (More info can be found here: E-Learning Standards – Critical and Practical Perspectives.)

When writing more complex client-side applications you will soon realize that the overall loading time increases to a point that significantly reduces usability and responsiveness. Thus, it would be nice to have a build script that compresses the relevant JavaScript source files into optimized code that you can then include in your frontend HTML files. Additionally, you will most definitely want to be able to create different releases of your code in a simple yet consistent way. That’s where Dojo’s build system comes in handy!

Dojo Build Layer System

Before explaining the bits and pieces needed for this build script you need to understand Dojo’s build layer system. Each layer represents a list of files that should be put together to produce a compressed and optimized version, much like a C include header file. Below you find an example layer file for a Login module:

dojo.provide("custom.layerLogin");

dojo.require("dojo.i18n");
dojo.require("custom.Login");

The first line declares the layer and must correspond to the actual file name. The dojo.require statements that follow list all modules included in this layer. For instance, dojo.i18n represents Dojo’s internationalization (i18n) functionality and is required by this layer. Furthermore, custom.Login represents a file named Login.js that resides in the custom namespace and contains the Login module’s core code:

/js/custom/Login.js

The source of Login.js:

dojo.require('dijit.form.Form');
dojo.require('dijit.form.Button');
dojo.require('dijit.form.TextBox');

dojo.provide('custom.Login');
dojo.declare('custom.Login', null, {

    /**
     * Default constructor
     */
    constructor: function() {
        this.init();
    },

    /**
     * Initializes Login module.
     */
    init : function() {
        dijit.byId('loginUser').focus();
    }
});

As you can see, Login represents a Dojo “class”. Although you can use plain JavaScript code in any of the files included in a layer, Dojo classes provide you with the power of encapsulating module-specific code (as you are used to when writing object-oriented code). You can find more information on Dojo’s classes here: Classy JavaScript with dojo.declare

Build Layer Profile

As you might have guessed, we can have multiple layer files in a Dojo-based application. That’s why it is good to think of layer files as modules. Consequently, the next step is to combine all modules into a configuration profile that Dojo can then use to compile each layer into one optimized JavaScript file. Let’s have a look at an example profile file:

dependencies = {
    layers: [
    {
        // one of the stock layers. It builds a "roll up" for
        // dijit.dijit which includes most of the infrastructure needed to
        // build widgets in a single file. We explicitly ignore the string
        // stuff via the previous exclude layer.
        name: "../dijit/dijit.js",
        // what the module's name will be, i.e., what gets generated
        // for dojo.provide(<name here>);
        resourceName: "dijit.dijit",
        // modules *not* to include code for
        layerDependencies: ["string.discard"],
        // modules to use as the "source" for this layer
        dependencies: ["dijit.dijit"]
    },
    {
        name: "../custom/layerLogin.js",
        dependencies: ["custom.layerLogin"],
        copyrightFile: "../../../../COPYRIGHT"
    },
    {
        name: "../custom/layerMain.js",
        dependencies: ["custom.layerMain"],
        copyrightFile: "../../../../COPYRIGHT"
    }],
    prefixes: [
    ["dijit", "../dijit"],
    ["dojox", "../dojox"],
    // include COPYRIGHT
    ["custom", "../custom", "../../../../COPYRIGHT"]
    ]
}

The dependencies variable at the beginning is a JSON object holding all required layers of this profile. Additionally, it defines the core files (stock layers) needed to run Dojo afterwards.

The Build Script

Now that you have an overview of the structure of a Dojo build profile, let’s have a look at the actual build script. Below you find a bash script that incorporates all required steps for generating optimized JavaScript source code from a custom build profile using Dojo’s build system:

#!/bin/sh
#
# Script for deploying the "release" version of the currently set JS layers via the profile file.
# It will create (or overwrite) the release folder in the JS_LIB/custom/ directory,
# i.e. JS_LIB/custom/release/RELEASE_VERSION
#
# @author Matthias Kerstner <matthias@kerstner.at>
#
 
#profile file should reside inside current directory
PROFILE_FILE=custom.profile.js
 
CWD="$(cygpath -aw $(pwd))"
#UNIX: $(pwd)
 
#(absolute) path to PHP, or simply 'php' if it is in the PATH
PHP_PATH="/cygdrive/c/xampp/php/php.exe"
#UNIX: "/opt/lampp/bin/php"
 
#release number should be read from config/base.php
PHP_REQUIRE="require('$(cygpath -aw $(pwd)/../src/config/base.php)')"
#UNIX: "require('$(pwd)/../src/config/base.php');"
 
#get release version from config file
RELEASE_VERSION=`$PHP_PATH -r "$PHP_REQUIRE; echo MY_VERSION;"`
 
#determine release path
RELEASE_PATH="$(cygpath -aw $(pwd)/../src/web/js/release/)"
#UNIX: $(pwd)/../src/web/js/release/
 
# remove any previous releases
if [ -d "$RELEASE_PATH" ]; then
  echo "removing existing releases..."
  rm -r "$RELEASE_PATH"
fi
 
# create release directory again
mkdir "$RELEASE_PATH"
 
# path to dojo buildscript
DOJO_BUILDSCRIPT_PATH=$(pwd)/../src/web/js/util/buildscripts/
 
echo switching to buildscript path $DOJO_BUILDSCRIPT_PATH
 
#change to buildscripts folder as required by dojo's build.sh
cd $DOJO_BUILDSCRIPT_PATH
 
#call dojo's build script with profile specified above
#UNIX: build.sh
#WINDOWS: build.bat
./build.bat profileFile=../../profile/"$PROFILE_FILE" \
action=release cssOptimize=comments optimize=shrinkSafe releaseName="$RELEASE_VERSION"
 
#change to lib/js folder
cd "$(cygpath -aw $(pwd)/../../)"
#UNIX: $(pwd)/../../

This script basically does the following:

  1. Defines which profile file to use (line 11)
  2. Determines the current working directory (line 13)
  3. Sets path to PHP executable to read release version from base.php (lines 17, 21 and 25)
  4. Sets release path based on release version (line 28)
  5. Removes any previous releases and creates release folder (line 32)
  6. Sets path to the build script (line 41)
  7. Executes build script from within its folder (line 51)
  8. Changes back to lib/js folder (line 55)

In order for this script to run your project needs to have a certain folder structure. I have created a reference setup which you can download from GitHub: Dojo Build Script.

Note: The script should be executed from within the scripts folder. Don’t panic if the script runs for a while, there are plenty of files that need to be processed during the build process. After all, you get a highly compressed layer file afterwards, right?

Feel free to customize the script to your needs 🙂