Working with Chicago Data (part 1)

We are still talking a lot about data at North Park – in particular, Chicago data. So I’m going to start getting my hands dirty with this data to build capacity for future partnerships with faculty and students. Here is the first in what I hope will be many installments of the “Working with Chicago Data” series.

Mapping Chicago’s Grocery Stores

First step: Download data from the Chicago Data Portal (https://data.cityofchicago.org/). I’m using the Grocery Stores – 2013 dataset for this example.
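
For anyone who wants to work with the same file in a script, here is a minimal Python/pandas sketch for loading it. The filename and the column names used in the later snippets (LATITUDE, LONGITUDE, SQUARE FEET, STORE NAME) are my assumptions about the portal’s CSV export, so check them against the actual download.

    import pandas as pd

    # Load the CSV exported from the Chicago Data Portal.
    # The filename is an assumption -- adjust to match your download.
    stores = pd.read_csv("Grocery_Stores_2013.csv")

    print(stores.shape)    # how many stores are in the dataset?
    print(stores.columns)  # inspect the actual column names

    # Drop rows missing coordinates so they don't break the maps below.
    stores = stores.dropna(subset=["LATITUDE", "LONGITUDE"])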

The data itself seems pretty clean and well formatted. I’m going to use Tableau for this example because that’s the tool I’m learning right now. I opened Tableau and imported the spreadsheet from the Chicago Data Portal. I ended up creating 4 different visualizations based on this data.

The first is a map of grocery store locations. It uses the latitude and longitude from the dataset to create points. Pretty standard and vanilla.

This next map is much more interesting. It takes into account the size of the store (measured in square footage) and encodes that as both size and color: larger stores get larger, darker circles.
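
Both the plain point map and this size-and-color version are easy to approximate outside of Tableau. Here is a rough matplotlib sketch, assuming the stores DataFrame from the loading snippet above and a SQUARE FEET column (the column name and the marker scaling are assumptions):

    import matplotlib.pyplot as plt

    # Size and color both encode square footage (column name is an assumption).
    sq_ft = stores["SQUARE FEET"]

    plt.figure(figsize=(8, 10))
    plt.scatter(
        stores["LONGITUDE"],
        stores["LATITUDE"],
        s=sq_ft / 500,   # scale square footage down to a readable marker size
        c=sq_ft,         # darker shades for larger stores
        cmap="Blues",
        alpha=0.6,
    )
    plt.colorbar(label="Square feet")
    plt.xlabel("Longitude")
    plt.ylabel("Latitude")
    plt.title("Chicago grocery stores, sized and colored by square footage")
    plt.show()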

The last two maps were just variations on the second map. One version filtered out “small stores” of less than 10,000 square feet. The other filtered out stores with the word “liquor” in the title. On a technical level, these filters were easy to apply. However, I’m completely aware of the cultural assumptions I’m bringing to bear here: when I (white, affluent, middle class) think about a grocery store, I think about a large store that doesn’t have the word “liquor” in the title.
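
For reference, the same two filters translate directly into pandas, again assuming the column names from the earlier snippets:

    # Filter 1: drop "small stores" under 10,000 square feet.
    large_stores = stores[stores["SQUARE FEET"] >= 10_000]

    # Filter 2: drop stores with "liquor" anywhere in the name.
    no_liquor = stores[~stores["STORE NAME"].str.contains("liquor", case=False, na=False)]

    print(len(stores), len(large_stores), len(no_liquor))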

That’s that! It was pretty easy to get this data and put it to use in the form of a map. I used Tableau here, but I could also use Excel (with the Power Map add-in) or a more specialized tool like ArcGIS.

In terms of next steps or extensions:

  1. It would be interesting to compare results using a different tool. Might be good to showcase the basic steps for using each tool.
  2. It would be very interesting to add neighborhood boundaries and/or other information such as demographics or economic status. I’ll have to look at ways to incorporate this data (see the sketch after this list).
  3. It would also be very interesting to combine this data with user feedback like Yelp reviews.
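
To sketch out what number 2 might look like: the Data Portal also publishes community area boundaries, and geopandas can spatially join the store points to those polygons. This is just a rough sketch under my own assumptions – the filename and the “community” column name would need to be checked against the actual exports.

    import geopandas as gpd

    # Community area boundaries exported from the Data Portal as GeoJSON
    # (the filename and the "community" column name are assumptions).
    areas = gpd.read_file("Boundaries_Community_Areas.geojson").to_crs("EPSG:4326")

    # Turn the store table into a GeoDataFrame of lat/long points.
    stores_gdf = gpd.GeoDataFrame(
        stores,
        geometry=gpd.points_from_xy(stores["LONGITUDE"], stores["LATITUDE"]),
        crs="EPSG:4326",
    )

    # Tag each store with the community area it falls inside
    # (older geopandas versions use op="within" instead of predicate).
    stores_by_area = gpd.sjoin(stores_gdf, areas, how="left", predicate="within")

    # First pass: how many stores per community area?
    print(stores_by_area.groupby("community").size().sort_values())

From there, the joined table could be blended with demographic or economic data keyed on community area, which is the direction I’d like to take next.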

Sending Article Level Metadata from OJS to DOAJ

That’s a pretty scary-sounding title… but the process was actually super simple. I just want to document it here for my own future reference and to share with others looking to do the same thing.

Background

In my role as Technical Advisor for the Covenant Quarterly, I oversaw our journal’s application to the Directory of Open Access Journals (DOAJ). Being listed seemed important because DOAJ is the main directory for open access journals, so it was the logical place to start. Once the journal was accepted – which took quite a while! – we had the option to add article-level metadata to that index. Here is our journal page, along with the journal’s content: https://doaj.org/toc/2380-8829 Continue reading “Sending Article Level Metadata from OJS to DOAJ”

Google Scholar Tutorial

As a way to document and share my work, I wanted to post this short online tutorial I made about using Google Scholar and the Brandel Library. I manage the data feeds (from SFX and now from EBSCO) that make these library links possible, but I also feel like I need to do more to make these connections apparent to our users. There are a number of reasons for this:

  • First, I love Google Scholar and find it very useful for known-item searching. Giving students and faculty another tool seems very helpful.
  • Second, given the movement toward Open Access, I think “open” tools like Google Scholar do a better job searching the “gray” content that traditional databases struggle with.
  • Lastly, while the connections between Google Scholar and the library are seamless and relatively transparent – which are good things! – some faculty believe that everything is “on Google Scholar” without realizing that the library is providing many of those links. So I think this is an opportunity to demonstrate value and market the library.

The tutorial-making process at North Park is really quite nice – we have a dedicated terminal with a high-quality microphone and specialized programs like Audacity and Camtasia that make it easy to create high-quality tutorials. I’ve done several and am definitely getting better at using these tools – though I still don’t love the sound of my voice!

Tools for Data Analysis

In addition to providing the raw data to our campus community, I think the library can take a leadership role in providing the tools and expertise to turn this data into something usable and useful. However, many of the tools used to transform data are highly specialized and have a pretty steep learning curve. So I’m going to work to provide an overview of the tools available and focus on those that would be useful in the context of undergraduate education. Continue reading “Tools for Data Analysis”

Report on Interlibrary Loan Improvements

I recently revised my CV to include the following line:

Manage interlibrary loan systems; increased the local fulfillment rate from 59% to 80% while decreasing average turnaround time

I thought it would be good to provide a little additional context for this claim by supplying some data, looking at ways to visualize the improvement, and talking about how we accomplished this change here at North Park, as well as what I learned from looking at the data. Continue reading “Report on Interlibrary Loan Improvements”

Roadmap to Library Publishing

First, I’ve determined that there is no one “roadmap” that will lead my library into digital publishing. So, instead of creating a map, I’m going to do the best I can to sketch out the terrain ahead and think about questions that can guide our path.

Broader Context

This section tries to address two main questions: What is happening in the world of scholarly publishing that is relevant to North Park? What is happening within the North Park setting that is relevant to a library publishing endeavor? Quick thoughts:

  • Continued movement toward Open Access. There is still work to be done in our local context, but that is the clear direction. The Covenant Quarterly and the Journal of Hip Hop Studies indicate this trend is taking root on campus.
  • Institutional branding. There is a renewed focus on institutional branding and online presence. There could be powerful connections to make here.
  • Publishing and the North Park mission. My sense is that North Park values diverse contributions to the academic community more than creating a specialized repository.
  • Chicago. There might be some opportunities to promote North Park within the regional context through research and student projects.

Scope

We need to define the scope of this project. There are many different efforts that fall under the broad category of “digital publishing”, including:

  • Institutional Repositories
  • Digital Humanities
  • Data Repository
  • Open Educational Resources
  • Campus multimedia (lectures, performances, etc.)

Of these options, I think the most appropriate level and scope would be an institutional repository that contains simple/static documents such as PDFs. A next step would be to curate multimedia from across campus.

Even within this scope, the library will need to make editorial and collection development decisions to make sure that (1) we have a critical mass of content and (2) there is some editorial focus. I think we should prioritize the following content areas and focus on building relationships with the relevant parties.

Student Research

  • Honors Projects and Papers
  • Student Research
  • Master’s Theses
  • NPPress Student Research
  • Covenant History Papers
  • Partnerships with different courses/programs.

Faculty/Staff Scholarship

  • Journal Articles
  • Faculty/Staff Presentations and other “gray” literature
  • Papers from campus symposiums
  • Offer hosting/support for existing campus projects

Political Realities/Soft Skills

We would need strong support from across campus to take on this project and lead the campus in this area. Given the proposed scope of this project, here are the people I think it would be important to connect with:

  • The President
  • Provost
  • Campus Deans
  • The University Marketing and Communication Office
  • Honors Program
  • Seminary Faculty
  • Faculty/Tenure Committee
  • NPPress Leadership
  • Student Research Committee

Some of these needed connections blend into the next set of questions, which further define this project and effort. I think if we have five strong allies (willing to contribute the content they are responsible for), that would make a strong starting point.

Management

Do we have the technical and social workflows to produce, distribute and preserve this content? There are many overlapping questions here, but here is an attempt to list the important ones:

  • Do we have the rights/permissions to publish these materials? Who will work with each group to determine these permissions and who will maintain the paperwork?
  • Do we have the staff expertise, staff time, and faculty/staff connections to successfully manage this project?
  • What is the ongoing cost of this project in terms of hosting costs, incentives and open access fees, etc.?
  • Where does this rank compared to other library/institutional priorities?
  • What are peer institutions doing? What can we learn from them?