Archive for the 'Google' Category

Export Twitter Followers and Friends using a Google Spreadsheet

Recently I’ve noticed a growing number of people arrive at this blog having searched for ‘export twitter followers’. Rather than them leaving disappointed here’s a Google Spreadsheet I threw together which allows you to grab a copy of your friends/followers:

*** Google Spreadsheet to Export Twitter Friends and Followers ***

Update: added the ability to download other people's friends/followers

Benefits of using Google Spreadsheet

  • Control – you register for your own API key with Twitter, so you have full control of the account
  • Automatic whitelisting – Twitter currently whitelists applications built in Google Spreadsheets automatically; instead of 150 API requests per hour you get 20,000, meaning you can export a lot
  • Playing with the data – as you are importing straight into a spreadsheet you can do all your own data manipulation, like sorting, filtering and creating your own formulas for things like follow/follower ratios
  • Backup – Google Spreadsheets lets you download copies of spreadsheets in different formats
  • Share – you can make your lists of friends/followers easily viewable

Where’s this all going?

Having already done other things with the Twitter API and Google Spreadsheets (see Populating a Twitter List via Google Spreadsheet … Automatically!, Collect/backup tweets in a Google Spreadsheet and Google Apps Script, Spreadsheets, Twitter and Gadgets), the back of the Twitter/Google Spreadsheet problem is well and truly broken. You'll probably see fewer posts in this area with new stuff; instead I'll probably start properly documenting the little code snippets I use (but if you have any interesting ideas you want help with, get in touch).

This doesn’t mean I’ll be walking away from Google Spreadsheets. As recent posts like Turning Google Spreadsheets into a personal or group bookmarking service show, there is huge scope for using Spreadsheets as a very flexible rapid development platform.

Below are some bits of the code used in my new spreadsheet (all the code is viewable via the Script Editor in the Spreadsheet):

function tw_request(method, api_request){
  // general purpose function to interact with twitter API
  // for method and api_request doc see http://dev.twitter.com/doc/
  // returns an object
  var oauthConfig = UrlFetchApp.addOAuthService("twitter");
  oauthConfig.setAccessTokenUrl(
      "https://api.twitter.com/oauth/access_token");
  oauthConfig.setRequestTokenUrl(
      "https://api.twitter.com/oauth/request_token");
  oauthConfig.setAuthorizationUrl(
      "https://api.twitter.com/oauth/authorize");
  oauthConfig.setConsumerKey(getConsumerKey());
  oauthConfig.setConsumerSecret(getConsumerSecret());
  var requestData = {
        "method": method,
        "oAuthServiceName": "twitter",
        "oAuthUseToken": "always"
      };
   try {
      var result = UrlFetchApp.fetch(
          "https://api.twitter.com/1/"+api_request,
          requestData);
      var o  = Utilities.jsonParse(result.getContentText());
    } catch (e) {
      Logger.log(e);
    }
   return o;
}
function getFriendAndFo(sheetName){
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheetByName(sheetName);
  sheet.getRange(2, 1, sheet.getLastRow(), sheet.getMaxColumns()).clear({contentsOnly:true}); //clear sheet
  var cursor = "-1";
  while(cursor != "none"){ // while twitter returns data loop
    try {
      var o = tw_request("GET", "statuses/"+sheetName+".json?cursor="+cursor); // note using sheetname to build api request
      var data = o.users;
      for (i in data){ // extracting some subobjects to top level (makes it easier to setRowsData)
        if (data[i].status){
          for (j in data[i].status){
            data[i]["status_"+j] = data[i].status[j];
          }
        }
        if (data[i].screen_name){ // also build url to jump to profile page
          data[i]["profile_link"] = "http://twitter.com/"+data[i].screen_name;
        }
      }
      var headRange = sheet.getRange(1, 1, 1, sheet.getMaxColumns());
      var rowIndex = sheet.getLastRow()+1;
      setRowsData(sheet, data, headRange, rowIndex); // dump data for this loop to sheet
      if (o.next_cursor!="0"){
        cursor = o.next_cursor; // get next cursor
      } else {
        cursor = "none"; // break
      }
    }  catch (e) {
      Logger.log(e);
    }
  }
}
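The flattening loop in getFriendAndFo can be tried in isolation. Here is a minimal sketch in plain JavaScript (the sample user object is made up) of lifting the nested status sub-object to top-level status_* keys:

```javascript
// Lift nested status fields to top-level "status_" keys and add a
// profile link, mirroring the loop in getFriendAndFo (sample data is made up).
function flattenUser(user) {
  if (user.status) {
    for (var j in user.status) {
      user["status_" + j] = user.status[j];
    }
  }
  if (user.screen_name) {
    user["profile_link"] = "http://twitter.com/" + user.screen_name;
  }
  return user;
}

var u = flattenUser({
  screen_name: "example",
  status: { text: "hello", id: 123 }
});
// u.status_text is "hello", u.profile_link is "http://twitter.com/example"
```

Flattening like this makes every value a top-level property, which is what setRowsData needs to match values against column headings.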

Turning Google Spreadsheets into a personal or group bookmarking service

Recently Yahoo announced that it was going to discontinue/sell its Delicious bookmarking service. This led to a flurry of messages in my networks from people looking for alternatives and strategies for getting their stuff out. 

This isn’t a post about how you can get your Delicious bookmarks into something different (most of the other bookmarky sites like Diigo and Google Bookmarks have these options); instead I’m using this post to look at two ways you can capture bookmarks into Google Spreadsheets.

Why Google Spreadsheets? Other than continuing my personal journey with Google Spreadsheets you could also argue this solution gives you more flexibility and control over the data.

At the core of this idea is the use of ‘bookmarklets’. Most bookmarking services provide users with a bookmark that when clicked on launches a form for confirming and collecting details about the link you want to save. The background process is fairly basic, the bookmark runs a script which opens a popup browser window, passing some details about the page you were viewing to the form.

Method 1 – using a vanilla Google Form

So if all the Delicious bookmarklet is doing is directing you to a form, why not redirect it to your own form – a Google Form! Below is the JavaScript Delicious uses (expanded to make it more readable):

javascript: (function () {
    f = 'http://delicious.com/save?url=' + encodeURIComponent(window.location.href) + '&title=' + encodeURIComponent(document.title) + '&v=5&';
    a = function () {
        if (!window.open(f + 'noui=1&jump=doclose', 'deliciousuiv5', 'location=yes,links=no,scrollbars=no,toolbar=no,width=550,height=550')) location.href = f + 'jump=yes'
    };
    if (/Firefox/.test(navigator.userAgent)) {
        setTimeout(a, 0)
    } else {
        a()
    }
})()

The interesting bit is the second line, which builds the URL for the Delicious save form, including your current window location (i.e. the link you want to save) and the corresponding page title. To use a Google Form instead you can just enter your form URL, pre-filling fields by adding &entry_1=, &entry_2=, etc. For example, if you created a form with a title field and a URL field (1st and 2nd), your URL would look something like this:

f = 'http://spreadsheet.google.com/viewform?formkey=randomformidentifierkjflsj44&entry_2=' + encodeURIComponent(window.location.href) + '&entry_1=' + encodeURIComponent(document.title);

If you don’t want to mess with bits of code, I’ve made this template spreadsheet for you to copy, which has a ready-made form and also generates a custom bookmarklet for you to use.
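Pulling the pieces together, the URL-building step can be sketched as a small function (the form key here is a placeholder, and entry_2/entry_1 assume the URL and title are your second and first form fields):

```javascript
// Build a pre-filled Google Form URL for a page.
// "yourformkeyhere" is a placeholder; swap in your own form key.
function buildFormUrl(formKey, pageUrl, pageTitle) {
  return "http://spreadsheet.google.com/viewform?formkey=" + formKey +
      "&entry_2=" + encodeURIComponent(pageUrl) +
      "&entry_1=" + encodeURIComponent(pageTitle);
}

var f = buildFormUrl("yourformkeyhere",
    "http://example.com/a page", "My Page & Title");
// spaces and ampersands in the title/URL are percent-encoded in f
```

The encodeURIComponent calls matter: without them a page title containing & or = would break the query string.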

Method 2 – creating a custom Google Spreadsheet form interface

Method 1 was about piecing together bits I already knew, method 2, which uses the Google Apps Script User Interface (UI) services, was about using something new. If you don’t want to read about the ‘journey’ you might want to just jump straight to grabbing a copy of the spreadsheet below. Before you disappear some things to think about:

  • You can share your bookmarklet with other people, turning it into a group bookmarking service
  • I’ve just pulled the name and location of the bookmarked page; you could scrape additional page content

*** Googlicious – Google Spreadsheet Bookmarker ***

How-to Method 2

Previously in Collecting any data in a Google Spreadsheet using GET or POST by publishing as a service I highlighted how you can publish a spreadsheet as a service which allows you a different way to put data into a Spreadsheet. As part of this you can use the UI Services to build a custom interface around this. Using this method means I can programmatically create my own form interface like the one shown below:

 

Unfortunately I haven’t found a tool with a graphical interface to help with designing custom interfaces (although as the UI Service is based on the Google Web Toolkit, there might be something I can use there). So generating the interface above required all of this code:

    var app = UiApp.createApplication().setTitle(
      "Save bookmark to Google Spreadsheet");
    app.setStyleAttribute("padding", "10px");
    var titleLabel = app.createLabel("Save Bookmark");
    titleLabel.setStyleAttribute("text-align", "left");
    titleLabel.setStyleAttribute("font-size", "135%");
    titleLabel.setStyleAttribute("font-weight", "bold");
    var noteWin =  app.createLabel("Bookmark saved. Please close this window.");
    noteWin.setId("noteWin");
    noteWin.setStyleAttribute("text-align", "center");
    noteWin.setStyleAttribute("font-weight", "bold");
    noteWin.setStyleAttribute("background", "#FFFF00");
    noteWin.setStyleAttribute("color", "#FF0000");
    noteWin.setVisible(false);
    var pageTitleLabel = app.createLabel(
      "Title");
    var pageTitleText = app.createTextBox();
    pageTitleText.setName("title");
    pageTitleText.setWidth("100%");
    pageTitleText.setText(e.parameter.title);
    var urlLabel = app.createLabel(
      "URL");
    var urlText = app.createTextBox();
    urlText.setName("url");
    urlText.setWidth("100%");
    urlText.setText(e.parameter.url);
    var tagsLabel = app.createLabel(
      "Tags");
    var tagsText = app.createTextBox();
    tagsText.setName("tags");
    tagsText.setWidth("100%");
    var notesLabel = app.createLabel(
      "Notes");
    var notesText = app.createTextArea();
    notesText.setName("notes");
    notesText.setWidth("100%");
    var saveHandler = app.createServerClickHandler("saveBookmark");
    var saveButton = app.createButton("Save Bookmark", saveHandler).setId("saveBut");
    var listPanel = app.createGrid(4, 2);
    listPanel.setStyleAttribute("margin-top", "10px")
    listPanel.setWidth("100%");
    listPanel.setWidget(0, 0, pageTitleLabel);
    listPanel.setWidget(0, 1, pageTitleText);
    listPanel.setWidget(1, 0, urlLabel);
    listPanel.setWidget(1, 1, urlText);
    listPanel.setWidget(2, 0, tagsLabel);
    listPanel.setWidget(2, 1, tagsText);
    listPanel.setWidget(3, 0, notesLabel);
    listPanel.setWidget(3, 1, notesText);
    // Ensure that all form fields get sent along to the handler
    saveHandler.addCallbackElement(listPanel);
    var dialogPanel = app.createFlowPanel();
    dialogPanel.add(titleLabel);
    dialogPanel.add(noteWin);
    dialogPanel.add(listPanel);
    dialogPanel.add(saveButton);
    app.add(dialogPanel);

Why oh why oh why?

So why go to all this extra effort for a very similar result? For a start, it means I don’t have to use the styling of traditional Google Forms, which I don’t like. More importantly, it opens the possibility of making the form more dynamic by using existing data from the spreadsheet. For example, I could pull in commonly used tags or even just indicate if a URL has already been bookmarked before.

As well as form creation, Google Apps Script could be used to do more with the submission, like automatically tweeting it, sending it somewhere as an email, or even choosing which sheet it should go to (i.e. you might have a public and a private sheet). I’ll let you imagine the possibilities.

TwapperKeeper loses API access, iTitle turns to the cloud with Twitter Search from Google Spreadsheets

Empty Archive
Originally uploaded by bRokEnCHaRacTer

Recently Twitter has been paying close attention to 3rd party developer use of their API. First they stopped whitelisting new applications, then UberSocial and Twitdroid were suspended for terms of service violations, now the popular tweet archiving service TwapperKeeper announced that they will be removing export and API access in response to a request from Twitter.

Fortunately TwapperKeeper has been given until the 20th March to comply, but this is still a blow for 4th party developers, like myself, who build applications around the TwapperKeeper archive/API and will no longer have an easy way to grab historic tweets. Part of the problem is that Twitter provides no easy access to collections of tweets over 5 days old, necessitating services like TwapperKeeper, so its discontinuation is going to cause headaches for some.

Personally, when developing my Twitter Subtitling service iTitle I was keen to make sure data could be used from various sources starting with the official Twitter Search results and TwapperKeeper integration, later adding support for .CSV file upload. Most recently I added support for my Google Spreadsheet/Twitter Search collection mashup [BTW after feedback from @AJCann there is now a setup video for this on YouTube]. So hopefully iTitle will live for a little longer.

So what about other services, like my other Twitter mashup uTitle and Andy Powell’s Summarizr, which rely on the TwapperKeeper service? There are obviously other Twitter archive services out there – TwapperKeeper even has the JISC-funded open source version yourTwapperKeeper – but all the online services potentially face the same problem of being prevented from making archived tweets reusable. One solution might be to develop a distributed cloud-based archive. For example, initially I thought of using my Google Spreadsheet hack for capturing tweets, and then sharing these archives with the community as a searchable index. Anyone fancy building an interface for this?

Collecting any data in a Google Spreadsheet using GET or POST by publishing as a service [#dev8d]

I’m at Dev8D which is a JISC funded event targeted at educational developers/coders. 

The aim of Dev8D is to bring together developers from across the education sector and wider in order to learn from one another and ultimately create better, smarter technology for learning and research. Dev8D is the major UK event in the year for educational developers from further and higher education.

I could write reams and reams about why Dev8D is an important and valuable event, instead I’m just going to share something I learned/made as a result of being here. What I’m about to show you is one of the building blocks for a bigger idea that Tony Hirst and I are chatting about and hopefully I’ll come back to that idea at a later date.

Essentially what we wanted to do was capture data to a Google Spreadsheet. There are obviously a number of ways to get data into a Spreadsheet like using Forms or fetching a url, but we needed something more flexible. Something you can do with Google Apps Script/Google Spreadsheet is ‘publish as service’.

There isn’t a whole load of documentation about this but the Google guide on Apps Script User Interfaces provides some hints on what is possible. The bit that interested me was capturing data from any HTML form using GET or POST. This means you can forgo the Google Forms interface and completely design your own forms. Here’s the handy bit of Google Apps Script code you need in your Spreadsheet to capture the data:

function doPost(e) { // if you want to use GET data use doGet(e)
  var ss = SpreadsheetApp.openById(ScriptProperties.getProperty('active')); // if you need to write anything to the sheet you'll need this to set the active sheet
  //http://www.google.com/support/forum/p/apps-script/thread?tid=04d9d3d4922b8bfb&hl=en
  var app = UiApp.createApplication();
  var panel = app.createVerticalPanel();
  for( p in e.parameters){ // cycling through the parameters passed to the sheet; replace this to write to a particular cell
    panel.add(app.createLabel(p +" "+e.parameters[p]));
  }
  app.add(panel);
  return app;
}
//http://www.google.com/support/forum/p/apps-script/thread?tid=345591f349a25cb4&hl=en
function setUp() { // you need to manually run this script once to collect the spreadsheet ID
  ScriptProperties.setProperty('active', SpreadsheetApp.getActiveSpreadsheet().getId());
}
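Anything that sends GET or POST parameters to the published service URL will turn up in e.parameters. As a sketch of what a client-side caller needs to do, here is a small helper that encodes an object as an application/x-www-form-urlencoded body (the service URL itself would be whatever Google assigns when you publish; it is not shown here):

```javascript
// Encode an object as a form body for POSTing to a published
// spreadsheet service. Field names here are illustrative only.
function formEncode(params) {
  var pairs = [];
  for (var key in params) {
    pairs.push(encodeURIComponent(key) + "=" + encodeURIComponent(params[key]));
  }
  return pairs.join("&");
}

var body = formEncode({ name: "Martin", note: "hello world" });
// body is "name=Martin&note=hello%20world"
```

A plain HTML form with matching input names posting to the service URL achieves the same thing without any script.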

Enjoy!

Google Apps Script, Spreadsheets, Twitter and Gadgets #guug11

Yesterday was the inaugural Google Apps UK User Group (#guug11) event at Loughborough University. I was put to the test with 5 minutes in the open mic session talking about my work with Google Spreadsheets and Twitter. Fortunately before I took the stage Tony Hirst had a session on Mashing Up Google Apps, which did a great job of setting the scene highlighting how Google Apps is a great environment for quickly manipulating ‘stuff’.

Given the short time, and the fact I was demoing stuff in a Spreadsheet instead of flipping back and forth from a presentation, I decided to embed my slides with the aid of a little gadget I wrote which plays back a Flickr photoset. This also enabled me to prepare cells of text I could tweet at the click of a button. This required a bit of refactoring of my Populating a Twitter List via Google Spreadsheet code, creating this more general function:

function tw_request(method, api_request){
  var requestData = {
        "method": method,
        "oAuthServiceName": "twitter",
        "oAuthUseToken": "always"
      };
   try {
      var result = UrlFetchApp.fetch(
          "https://api.twitter.com/1/"+api_request,
          requestData);
      var o  = Utilities.jsonParse(result.getContentText());
      Logger.log(result.getHeaders());
    } catch (e) {
      Logger.log(e);
    }
   return o;
}

I could then attach the following script to a button on the sheet:

function tweetBut(){
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getActiveSheet(); // get active sheet
  var tweet = sheet.getActiveCell().getValue(); //grab the active cell as item to tweet
  var api_request = "statuses/update.json?status=" + encodeURIComponent(tweet); // build query for Twitter Api
  var data = tw_request("POST", api_request); // function to send tweet
  sheet.getRange("B34").setValue(tweet); // write tweet in specified cell (cell formatted with a large font for everyone to see)
  Logger.log(data); // debug stuff
  if (data && data["text"]!=""){ // if tweet sent make a notification
    ss.toast("Tweet sent", "Tweet");
  }
}
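The query string handed to tw_request is just the status update endpoint plus the encoded tweet text. That request-building step can be checked on its own:

```javascript
// Build the API request path for a status update,
// mirroring what tweetBut passes to tw_request.
function buildUpdateRequest(tweet) {
  return "statuses/update.json?status=" + encodeURIComponent(tweet);
}

var req = buildUpdateRequest("Hello #guug11!");
// req is "statuses/update.json?status=Hello%20%23guug11!"
```

Note the hash in the hashtag must be percent-encoded, otherwise everything after it would be dropped from the query string.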

As the original spreadsheet already had a function to interact with Twitter Search to find people using a particular hashtag, I rewrote this slightly to also allow me to pull a named search into a sheet (as used in Twitteralytics v2).

With my new tw_request() function this opens up the entire Twitter API for endless mashineering (mentions, followers, following and more). There’s more tinkering to do, but here’s my latest iteration of the Spreadsheet/Twitter mashup (GoogTweetSheet) – there’s a bug with the custom Twitter menu not appearing, still to do.

Using my search here’s an archive of all the #guug11 tweets (I’ve filtered it to show the tweets I made during my presentation)

Collect/backup tweets in a Google Spreadsheet [Twitteralytics v2]

Back in June I hastily threw together Using Google Spreadsheet to automatically monitor Twitter event hashtags and more. It kind of worked most of the time but there were a lot of rough edges. So here is:

*** the Twitteralytics v2 Google Spreadsheet ***
(you’ll need to make a copy of this from the File menu before you can use it – if Make a copy is not available, make sure you’ve logged in to Google first)

Here is a short video showing setup:

The concept is the same: you set up a search and the Google Spreadsheet updates the results using a time-driven trigger. The main difference is that where the old version just collected date, author and tweet text, this version has the option of pulling back everything available from the Twitter GET Search API. Even better, you define what information is pulled, and where, by matching your column heading names with the Twitter search result schema, so if Twitter adds more data it’s easy for you to pull (one small caveat is it can only pull top-level data; anything nested in the results is lost, apart from geo-tagging, which I thought was too useful to lose).

This was easily achieved by modifying Mikael Thuneberg’s getTweets() function, which I originally used in version 1. This function collects search results using XML/ATOM. To achieve more flexibility I converted it to use the JSON results from Twitter. Because Google Apps Script is written in JavaScript it’s easy to throw the JSON results around, and given that there are already handy bits of code for writing spreadsheet data using JavaScript objects, I was able to achieve more flexibility easily.

As my focus this time around is just to harvest tweets, I dropped some of the original functionality of summarising ‘top tweeters’ and emailing results, but it’s easy enough to drop in your own functions or gadgets to help you process the data. For example, you could use the Map Gadget to plot geo-located tweets. This spreadsheet has some examples of what you can do with the data.

The from:briankelly sheet includes a map gadget, whilst the first 5 columns of #purposed contain functions for extracting top tweeters and a Google Spreadsheet-readable timestamp.

With services like twapperkeeper.com and downloadable apps like ThinkUp, why have a Google Spreadsheet version? I’m not entirely sure I’ve got a good answer to that one. The biggest advantage is maybe that it’s a quick way to collect tweets, make them publicly available and collaborate on exploring the data.

Mainly for SEO here is the code for the modified version of getTweets(), the rest of the script is available in the spreadsheet.

function getTweets(searchTerm, since, until, maxResults, languageCode) {
    //Based on Mikael Thuneberg getTweets - mod by mhawksey to convert to json
    // if you include setRowsData this can be used to output chosen entries
    try {
        var pagenum = 1;
        var data = [];
        var idx = 0;
        var resultsPerPage, maxPageNum; // declared to avoid implicit globals
        if (typeof maxResults == "undefined") {
            maxResults = 100;
        }
        if (maxResults > 1500) {
            maxResults = 1500;
        }
        if (maxResults > 100) {
            resultsPerPage = 100;
            maxPageNum = maxResults / 100;
        } else {
            resultsPerPage = maxResults;
            maxPageNum = 1;
        }
        Logger.log(twDate(since)+" "+twDate(until));
        searchTerm = encodeURIComponent(searchTerm);
        for (pagenum = 1; pagenum <= maxPageNum; pagenum++) {
            var URL = "http://search.twitter.com/search.json"
            URL = URL + "?q=" + searchTerm;
            URL = URL + "&since=" + twDate(since);
            URL = URL + "&until=" + twDate(until);
            URL = URL + "&rpp=" + resultsPerPage;
            URL = URL + "&page=" + pagenum;
            URL = URL + "&result_type=recent";
            if (typeof languageCode != "undefined") {
                if (languageCode.length > 0) {
                    URL = URL + "&lang=" + languageCode;
                }
            }
            var response = UrlFetchApp.fetch(URL, {method:'get', headers: { "User-Agent": REFERER}});
            if (response.getResponseCode() == 200) {
              var objects = Utilities.jsonParse(response.getContentText()).results;
              for (i in objects){ // not pretty but I wanted to extract geo data
                if (objects[i].geo != null){
                  objects[i]["geo_coordinates"] = "loc: "+objects[i].geo.coordinates[0]+","+objects[i].geo.coordinates[1];
                }
                objects[i]["status_url"] = "http://twitter.com/"+objects[i].from_user+"/statuses/"+objects[i].id_str;
                data[idx]=objects[i];
                idx ++;
              }
            }
          }
    return data;
    } catch (e) {
        return e.message;
    }
} 
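The paging arithmetic at the top of getTweets can be pulled out and checked on its own. A sketch (using Math.ceil rather than the fractional division above, so a partial last page is fetched too):

```javascript
// Work out results-per-page and page count for the Search API,
// capped at Twitter's 1500-result / 100-per-page limits.
function pagePlan(maxResults) {
  if (typeof maxResults === "undefined") maxResults = 100;
  if (maxResults > 1500) maxResults = 1500;
  var rpp = Math.min(maxResults, 100);
  return { rpp: rpp, pages: Math.ceil(maxResults / rpp) };
}

var plan = pagePlan(250);
// plan.rpp is 100, plan.pages is 3
```

So asking for 250 tweets means three requests of up to 100 results each, while anything over 1500 is clipped to the API's hard cap.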

Linking a Google Form with data from the responses in the Spreadsheet [Event/Resource Booking]

is there a way to use a Google Form for event booking and cap entries for individual parallel sessions?

This was the question I got from my colleague Kenji Lamb, who is helping organise Game To Learn: Take 2!. The event has a number of parallel sessions with limited spaces in each. Ideally Kenji wanted a booking form which would track numbers and prevent people booking into sessions which were full. So is it possible? The quick answer, as demonstrated by the existence of the registration form, is yes, but there was some head scratching along the way (and there is one very big caveat at the end of this post). Here’s how I did it:

Step 1: Getting live booking numbers out of Google Spreadsheet

The Google Visualization API provides a way to access spreadsheet data using its query interface (I’ve used this previously in gEVS – An idea for a Google Form/Visualization mashup for electronic voting). The problem is that to do this you need to make your entire spreadsheet publicly available – okay for anonymous voting, but not ideal if you are collecting personal information like email addresses and phone numbers. Here’s what I picked up from the documentation:

non-embedded visualization runs with the privileges of the person viewing the visualization, so the spreadsheet must either assign view access to everyone, or to the specific person running the visualization from Google Visualization Docs

The other option is to ‘Publish as a web page’ selecting a single sheet. There are a number of different formats you can choose for the data including CSV (comma separated values). I’ll come back to what you can do with the CSV data in a second.

So we have a way of getting some data out, but it needs to be filtered to leave the bit you need: how many places are left. My solution was to use a series of sheets and functions to filter the data. In this spreadsheet there are 5 sheets:

  • Form Raw – this is where the form data comes in
  • Parsed Data – this strips out extra session title/day info text using
    • =IF(ISERROR(FIND("17th May", 'Form Raw'!C2)),"-","X") – for the day; and
    • =LEFT('Form Raw'!D2,4) – to extract the first 4-character identifier
  • Session Counts – counts the numbers of things using COUNTIF (I could have skipped this sheet and gone straight to the free spaces, but I wanted an easy way for an admin to tweak numbers)
  • Public – the bit we’re most interested in: how many spaces are left
  • Templates – used for sending out email confirmations to delegates
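The two Parsed Data formulas just test for the day string and slice off the leading four-character session code. The same parsing, sketched in JavaScript (the sample session text is made up):

```javascript
// JavaScript equivalent of the two Parsed Data formulas:
// flag whether the choice mentions "17th May" and pull the 4-char session id.
function parseChoice(text) {
  return {
    day: text.indexOf("17th May") !== -1 ? "X" : "-",
    id: text.substring(0, 4)
  };
}

var choice = parseChoice("ABC1 - Games workshop (17th May)");
// choice.day is "X", choice.id is "ABC1"
```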

Initially I prefilled Parsed Data with functions in the cells, but for some reason when the form was submitted it would remove the function from the equivalent Parsed Data row. Consequently I had to write an Apps Script which would programmatically insert the function (I was going to use Apps Script anyway to automatically send an email confirmation when someone booked). I’m not going to go into the detail of the Apps Script in this post; the code is in the shared spreadsheet (if you need clarification on anything, use the comments below).

Step 2 Snaffling your Google Form

So we have a public sheet which has a summary of the available slots; how can we use this in the associated booking form? There is very little you can do with the form as hosted by Google, but if you know your way around HTML and have somewhere to host your alternate version, it’s easy enough to style Google Forms by viewing your live web form then copying the source to another webpage.

Step 3 Converting CSV to something more useful (PHP/JavaScript)

Initially I tried grabbing the .csv data from Google using jQuery but kept getting access denied errors, so instead I resorted to PHP. Here’s my PHP code snippet. PHP handles .csv well, and it would have been even easier if my host ran PHP 5.3+, because I could have used str_getcsv() – there’s a workaround I modified to use the file version of this function.

I could have used PHP to dynamically write the form, indicating if there were any filled sessions; instead I opted to use jQuery, as that sort of thing is a lot easier. First, converting a PHP array into a JavaScript object (I got this trick from here):

<script type="text/javascript">
    var list=<?php echo json_encode($data); ?>;
</script>

Step 4 Updating the form

Next disabling any of the parallel session select options that are full:

function checkForm(){
  $('select > option').each(function() {
    var str = $(this).text();
    var id = str.substring(0,4);
    if (parseInt(list[id], 10) <= 0){
      $(this).attr('disabled', true);
      $(this).text("[FULL] "+str);
    }
  });
}
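Stripped of jQuery, the rule is simple: if the remaining-places count for a session's four-character id is zero or less, disable the option and prefix its label. A testable sketch of that rule (the option objects here are plain stand-ins for DOM nodes):

```javascript
// Mark full sessions: "list" maps 4-char session ids to places left.
function markFull(options, list) {
  for (var i = 0; i < options.length; i++) {
    var id = options[i].text.substring(0, 4);
    if (parseInt(list[id], 10) <= 0) {
      options[i].disabled = true;
      options[i].text = "[FULL] " + options[i].text;
    }
  }
  return options;
}

var opts = markFull(
  [{ text: "ABC1 Intro" }, { text: "DEF2 Advanced" }],
  { ABC1: 0, DEF2: 5 });
// opts[0].text is "[FULL] ABC1 Intro"; opts[1] is untouched
```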

Because Google does form validation server side, to prevent the user submitting the form with missing data and being redirected back to an unmodified version of the form you need to validate before submission. Fortunately jQuery has a Validation plugin which makes it easy to do validation client side. All you need to do is include

<script type="text/javascript"
    src="http://ajax.microsoft.com/ajax/jquery.validate/1.7/jquery.validate.min.js"></script>

in the head of the page then initialise validation with:

$("#yourFormName").validate();

To make parts of the form required, just add class="required" to the form elements.

Some other things you can do in Google Spreadsheet (I did not know that)

First off, Google has added some nice email notification options to Google Spreadsheet. These let you set up immediate or digest notifications for changes to the spreadsheet or other events.


The second thing I discovered is that Apps Scripts are by default only editable by the person who first added them to the spreadsheet (other people who open the script editor can see the script, but there is a little padlock next to the script name). To allow any of the spreadsheet collaborators to edit, you need to go into Tools > Scripts > Script Editor > Share > Share settings.

Third thing: if you set up a script trigger that includes sending emails, they will be addressed as coming from the account of whoever set up the trigger. I decided to spare myself queries from Game To Learn delegates to my personal Gmail account by sharing the spreadsheet with one of our catch-all email addresses, then using this one to create the email trigger.

The big caveat

As the public data shown on this page is only updated every 5 minutes, there is a chance that a session could go over quota if bookings are made in quick succession. If anyone knows a way around this I would be very grateful ;-)

Finally, if you want to have a go, I’ve set up an example form here. You can see how the validation works, and if you select session ABC1 and wait 5 minutes before refreshing the page ~:-S you can see what happens when a session is full (for this example the spreadsheet keeps adding an additional space every 10 minutes).

gEVS – An idea for a Google Form/Visualization mashup for electronic voting

First I should say I don’t think this is the best solution (in fact an earlier post from 2008, DIY: A wi-fi student response system, is probably a better solution, if perhaps needing more tidying up), but I’m posting anyway on the off chance that this might inspire or help solve someone else’s problem.

This post has come about as a result of a couple of things:

  1. I’m on a bit of a Google Apps run.
  2. I read and enjoyed Donald Clark’s Clickers: mobile technology that will work in classes.
  3. I saw and contributed to Tom Barrett’s crowdsourced X Interesting Ways to Use Google Forms in the Classroom (I added #39 Collaboratively building a timeline, which I discovered through Derek Bruff’s class example).

Concept: Using Google Forms as an interface for a mobile/desktop based voting system.

Issue: If you are asking multiple questions the form needs prepopulating with options making it a long list for the student to navigate and potentially creating a predefined route of questions.

Solution: Use a generic form with one set of response options for the whole class, with a unique question identifier from which response graphs can be generated.

The finished result

Below (if you aren’t reading this via RSS) is this Google Form. [You can make a copy of the related Spreadsheet and customise the text and options. For example, you might want to turn qID into a list option rather than free text.]


And here is a page with a summary of responses, which allows the user to choose which response set to display (screenshot shown below):

Screenshot of summary of responses

How it was done

Some of you might already be familiar with Google Chart. This service allows you to create chart images by encoding the data in the URL. I’ve used this service in a number of my mashups (in fact all of my previous voting mashups use it in some way), not least in Generating charts from accessible data tables using the Google Charts API.

Google Chart works well if it is easy for you to get the data and format it for the image URL. For more complex tasks there is Google Visualization. The advantage of Visualization is that it gives you a way of querying a data source before displaying it as a table or chart. To see what you can do (and the place where I aped most of the code for this mashup) you should visit the interactive gallery of Visualization examples.

Using the Using The Query Language example as a starting point, I could see you could look up data from a Google Spreadsheet and filter the response using the Google Visualization API Query Language, which is very similar to SQL. What I wanted to do was SELECT the data from the spreadsheet WHERE it matched a question identifier and COUNT the number of occurrences for each GROUP of response options. An extract from the table of data is:

A           B     C
Timestamp   qID   Answer
-           q1    A
-           q1    B
-           q1    A

I tried to convert the SQL version of this query, which is something like:

SELECT C, Count(C) AS CountOfC WHERE B = ‘questionID’ GROUP BY C

Initially I was left with keyboard-shaped indentations on my forehead trying to get this to work, but Tony Hirst (@psychmedia) was able to end my frustration with this tweet. This meant I was able to use the following VQL-friendly query:

SELECT C, Count(B) WHERE B = ‘questionID’ GROUP BY C
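If you want to experiment with a query like this outside the Visualization JavaScript client, the same query string can be passed to a spreadsheet’s data source URL. Here is a minimal sketch (SPREADSHEET_KEY is a placeholder, and the tq endpoint shown is the old public Spreadsheets data source URL):

```javascript
// Sketch: build a Google Visualization data source URL for a published
// spreadsheet. SPREADSHEET_KEY is a placeholder, not a real key.
function buildGvizUrl(key, query) {
  return 'http://spreadsheets.google.com/tq?key=' + key +
         '&tq=' + encodeURIComponent(query);
}

var url = buildGvizUrl('SPREADSHEET_KEY',
    "SELECT C, Count(B) WHERE B = 'q1' GROUP BY C");
console.log(url);
```

Fetching that URL returns the filtered, aggregated data rather than the whole sheet, which is what makes the Visualization approach so handy here.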

The next part of the problem was how to let the user decide which question ID they wanted to graph. Looking at the Simple Visualization example I could see it would be easy to iterate across the returned data and push out some html using JavaScript. What I wanted to do was GROUP the questionID’s and COUNT the number of responses, which is possible using:

SELECT B, Count(C) GROUP BY B

This returns a table of unique question IDs and a separate column with a count of the number of responses. A form list element is populated with the results using:

for (var i = 0; i < data.getNumberOfRows(); i++){
  // Label shows the question ID plus its vote count, e.g. "q1 (No of votes 3)"
  var ansText = data.getValue(i, 0)+' (No of votes '+data.getValue(i, 1)+')';
  var valText = data.getValue(i, 0);
  // Append a new option to the question selector list
  ansSet.options[ansSet.options.length] = new Option(ansText, valText);
}

And that’s it. If you want to play around with this the code is here. Enjoy, and if you find this idea useful or spot any issues, as always I value your comments.

Cooking: A Custom Search Engine for … with Edublog 2010 Nominees

In my spare time I like to do a bit of cooking. I’m mainly a recipes kind of a guy, using the experience of recreating something tried and tested to build a foundation of knowledge to explore some of my own personal creations.

The same is true for my interest in educational mash-ups: trying out other people’s ideas to improve my basic comprehension as well as looking for opportunities for new twists.

Tonight I’ve been faithfully recreating Tony Hirst’s A Custom Search Engine for the Computer Weekly IT Blog Awards 2010 Nominees using the list of shortlisted nominations for the 2010 Edublog Awards.

Like Tony’s experience with the Computer Weekly awards, Edublogs have gone down the route of just having a linked list of nominations in each of the categories. It would have been nice to have an OPML file for the available feeds, but with almost 600 shortlisted nominations I guess their focus is on other priorities.

Here’s how I got on. One of Tony’s required ingredients is a list of nominated blog URLs. Fortunately there is a list of these on the Edublogs homepage, but the URLs are behind text links so copying and pasting into Excel hides the URL:

Excel Spreadsheet of 2010 Edublog award nominations

Not wanting to individually copy the URL from 600 links, I looked to see if there was a way of using a formula to get the links. I didn’t find a formula, but ExcelTips.net has a handy macro for Extracting URLs from Hyperlinks. I’m not going to go into macros, but if it is something you would like to find out about I’m sure ExcelTips is an ideal place to start. The macro they suggest is:

Sub ExtractHL()
    Dim HL As Hyperlink
    For Each HL In ActiveSheet.Hyperlinks
        HL.Range.Offset(0, 1).Value = HL.Address
    Next
End Sub

This cycles through the sheet and, for each hyperlink it finds, puts the link address in the cell next to it.

Excel spreadsheet with hyperlink addresses extracted

With this I was able to get back to Tony’s recipe. When it came to Step 2 Refinements I hit a bit of a snag, as there appears to be a limit of 16 possible refinements in Google CSE but 23 Edublog award categories. My solution was to merge the less bloggy categories into ‘Other’.

In Step 4 Preparing the URLs I also went my own way, modifying Tony’s:

=IF(RIGHT(B2)=”/”,CONCATENATE(B2,”*”),IF(OR(RIGHT(B2,4)=”.com”,RIGHT(B2,6)

formula to:

=IF(RIGHT(B2)="/",B2&"*",B2&"/*")

[Basically, if the URL ends with ‘/’ append ‘*’, otherwise append ‘/*’. This catches .com and .co.uk but doesn’t work if the URL ends with a querystring or file extension – a manual sweep was used to pick up any problems.]
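The logic of that formula can be sketched in JavaScript (a hypothetical helper, purely to illustrate the rule, not code used in the mashup):

```javascript
// Mirror of the spreadsheet formula: if the URL already ends in '/'
// just append '*', otherwise append '/*' so the CSE pattern matches
// every page on the site.
function toCsePattern(url) {
  return url.charAt(url.length - 1) === '/' ? url + '*' : url + '/*';
}

console.log(toCsePattern('http://example.com/'));  // http://example.com/*
console.log(toCsePattern('http://example.com'));   // http://example.com/*
```

Note both forms end up as the same pattern; the branch just avoids producing a double slash for URLs that already end in ‘/’.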

Here’s the resulting TSV file I uploaded for my example. With Google Spreadsheets you can output in different formats, including TSV, by changing the output parameter to output=txt, e.g. here’s the same sheet as TSV, which got me wondering if you could just point CSE to a Google Spreadsheet. There are options in CSE to host your own annotations XML or point to a feed, but I can’t see a way of hosting it as a Spreadsheet (I did come across the csesheet project, which lets you configure your custom search engine through a Google Spreadsheet, but I’ll let someone else look into that one).

So here are the fruits of my labour, a Google Custom Search Engine of the 2010 Edublogs Award nominations and of course, using my How to Google Instant(ise) a Custom Search Engine (CSE), the

Instantised Google Custom Search Engine of the 2010 Edublogs Award nominations

Populating a Twitter List via Google Spreadsheet … Automatically! [Hashtag Communities]

Previously, in Google Apps Script: Using a Google Spreadsheet to populate a Twitter list [Hashtag Communities], I showed how you can use Google Spreadsheets to populate a Twitter list. A comment on that post by Colin Gray got me thinking: instead of having to drop in a list of usernames from another source, what if the spreadsheet could use Twitter Search to pull usernames and populate a list? As I mentioned in my response to Colin, I have already documented pulling Twitter searches in Using Google Spreadsheet to automatically monitor Twitter event hashtags and more. So here it is:

*** The Auto Add to Twitter List Google Spreadsheet ***

The first sheet of this Spreadsheet has setup instructions. If you run into any problems leave a comment in the box at the end of this post. One thing I’ll say is be patient: some operations can take over a minute if you have lists with lots of users.

Use case

The most obvious use case is to use this spreadsheet to generate a real-time conference hashtag community list. A variation on this might be to generate a list as you speak, but this might be more gimmicky than of practical use.

Advanced search

It’s worth noting that you are not limited to just searching for hashtags. The search terms used in the Spreadsheet can include most of the Twitter Search Operators (‘since’ is the only one that the Spreadsheet overrides). For example, your search term could be ‘to:mhawksey near:Edinburgh within:400mi’, which would collect a list of users who tweet to me from within 400 miles of Edinburgh.

The ‘techie bit’

So what’s new? The two main additions are utilising: Time-Driven Triggers which are part of Google Apps Script (these enable you to run Scripts on a Spreadsheet even when it is not open); and Mikael Thuneberg’s getTweets() function which is in the Script gallery.

The getTweets function uses the Apps Script UrlFetchApp.fetch to return results from the Twitter Search page. This is a pretty powerful tool for other mashineering as you can call a webpage using different methods (‘post’, ‘get’, ‘put’, ‘delete’).

The way I’ve modified Mikael’s script is to return JSON rather than XML data from Twitter (BTW Brian Kelly has an interesting post on Moves Away From XML to JSON? – my response is that it’s application specific; in this particular case it was easier to work with JSON). I also added some code which returns only a unique array of Twitter usernames from the search.
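I haven’t reproduced the modified script here, but the de-duplication step might look something like this (the objects below are a simplified stand-in for the old Twitter Search JSON, which put the sender in a from_user field):

```javascript
// Return a unique array of usernames from an array of search results.
function uniqueUsernames(results) {
  var seen = {};
  var names = [];
  for (var i = 0; i < results.length; i++) {
    var user = results[i].from_user;
    if (!seen[user]) {       // first time we've seen this username
      seen[user] = true;
      names.push(user);
    }
  }
  return names;
}

var sample = [{from_user: 'alice'}, {from_user: 'bob'}, {from_user: 'alice'}];
console.log(uniqueUsernames(sample)); // [ 'alice', 'bob' ]
```

The unique list is what gets fed to the list-population step, so duplicate tweets from the same person don’t trigger repeated API calls.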

That’s about it really. If you have any questions about this script just drop them in the comments.

PS A big thank you to Lilian Soon (@xlearn) and Shri Footring (@ShriFootring) for help with testing/debugging

PPS I’m giving a face-to-face presentation as part of the JISC Winter Fayre (#jiscwf) looking at some of the ways you can use Twitter to support the backend of teaching and learning, on the 10th December at 12:20pm GMT. I’ll hopefully be streaming the session live via Livestream, so please join me if you like.

Festive Tweets: Exploring the Utility of Twitter to Support Teaching and Learning

From passive to active communication, from one-way broadcast to two-way collaboration, there are a number of ways Twitter can be used to support teaching and learning. In this session we highlight some of Twitter’s hidden treats and explore how Twitter can be used as a free SMS broadcast system; how tweets from mobiles and other devices can be used as an alternative to electronic voting systems; how you can create and control a system to automatically tweet resources and announcements; how tweets can be used for lecture capture enhancement; and how Twitter can be used to leave timeline-based comments on videos.

About

This blog is authored by Martin Hawksey, e-Learning Advisor (Higher Education) at the JISC RSC Scotland N&E.

mhawksey [at] rsc-ne-scotland.ac.uk | 0131 559 4112 | @mhawksey

JISC RSC Scotland North & East logo

If you would like to subscribe to my monthly digest please enter your email address in the box below (other ways are available to subscribe from the button below):

Subscribe to MASHe for monthly email updates



The MASHezine (tabloid)

It's back! A tabloid edition of the latest posts in PDF format (complete with QR Codes). Click here to view the MASHezine

Preview powered by:
Bluga.net Webthumb

The MASHezine (eBook)

MASHe is also available in ebook and can be downloaded in the following formats:

Powered by NEWSTOEBOOK.COM

Archives

Opinions expressed in this blog are not necessarily those of the JISC RSC Scotland North & East.

JISC Advance Logo

JISC Advance is a new organisation that brings together the collective expertise of established JISC services:

For further information visit www.jiscadvance.ac.uk

Creative Commons Licence
Unless otherwise stated this work is licensed under a Creative Commons Attribution-ShareAlike 2.5 UK: Scotland License