Managing digital assets is now a critical part of our society. I wanted to put together some notes for a presentation on personal digital archiving that I’ve proposed for the Covenant’s Midwinter gathering of pastors. If this proposal is accepted, this post will be a first, rough draft of the content I hope to cover there. Continue reading “Personal Digital Archiving for Clergy”
I just wrote and submitted a piece to the North Park website. It reflects on a recent campus lecture about Pietism and while it’s not exactly about libraries, it is about transformation and does capture some early thoughts that might be related to my “pietism and libraries” project. So here goes! Continue reading “The Pietist Ethos and the Fundamental Theorem of Calculus”
We are still talking a lot about data at North Park – in particular Chicago data. So I’m going to start getting my hands dirty working with this data to build capacity for future partnerships with faculty and students. So here is the first in what I hope to be many installments of the “Working with Chicago Data” series.
Mapping Chicago’s Grocery Stores
First step: Download data from the Chicago Data Portal (https://data.cityofchicago.org/). I’m using the Grocery Store 2013 dataset for this example.
The data itself seems pretty clean and well formatted. I’m going to use Tableau for this example because that’s the tool I’m learning right now. I opened Tableau and imported the spreadsheet from the Chicago Data Portal. I ended up creating 4 different visualizations based on this data.
The first is a map of grocery store locations. It uses the latitude and longitude from the dataset to create points. Pretty standard and vanilla.
The next map is much more interesting. It takes into account the size of the store (measured in square footage) and codes that as size and color. Larger stores have larger, darker circles.
The last two maps were just variations on the second map. One version filtered out “small stores” of less than 10,000 square feet. The other filtered out stores with the word “liquor” in the title. On a technical level, these filters were easy to apply. However, I’m completely aware of the cultural assumptions I’m bringing to bear here. When I (white, affluent, middle class) think about a grocery store, I think about a large store that doesn’t have the word “liquor” in the title.
That’s that! It was pretty easy to get this data and put it to use in the form of a map. I used Tableau here but I could also use Excel (with the Power Map add-in) or a more specialized tool like ArcGIS.
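For folks who prefer code to a visual tool, the same two filters can be sketched in Python with pandas. This is just a sketch using made-up sample rows – the real CSV from the Chicago Data Portal will have its own column names, so treat `store_name` and `square_feet` here as stand-ins:

```python
import pandas as pd

# Made-up sample rows standing in for the Grocery Store 2013 dataset;
# the real file's column names may differ.
stores = pd.DataFrame({
    "store_name": ["Big Grocer", "Corner Liquor & Food", "Tiny Mart"],
    "square_feet": [45000, 3200, 8000],
    "latitude": [41.90, 41.88, 41.95],
    "longitude": [-87.65, -87.70, -87.68],
})

# Filter 1: drop "small stores" of less than 10,000 square feet.
large = stores[stores["square_feet"] >= 10000]

# Filter 2: drop stores with "liquor" in the title (case-insensitive).
no_liquor = stores[~stores["store_name"].str.contains("liquor", case=False)]

print(large["store_name"].tolist())      # → ['Big Grocer']
print(no_liquor["store_name"].tolist())  # → ['Big Grocer', 'Tiny Mart']
```

The nice part of doing it in code is that the filtering assumptions are written down explicitly, which makes the cultural judgment calls mentioned above easier to see and revisit.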
In terms of next steps or extensions:
- It would be interesting to compare results using a different tool. Might be good to showcase the basic steps for using each tool.
- It would be very interesting to add neighborhood boundaries and/or other information such as demographic information and/or economic status. I’ll have to look at ways to incorporate this data.
- It would also be very interesting to combine this data with user feedback like Yelp reviews.
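That last extension – combining store data with user feedback – could be as simple as a table join once review scores are in hand. Here is a minimal pandas sketch with entirely hypothetical store names and ratings (real Yelp data would have to be pulled from Yelp’s API and matched on name or address, which is its own project):

```python
import pandas as pd

# Hypothetical store list and hypothetical average review scores.
stores = pd.DataFrame({
    "store_name": ["Big Grocer", "Tiny Mart", "Fresh Foods"],
    "square_feet": [45000, 8000, 12000],
})
reviews = pd.DataFrame({
    "store_name": ["Big Grocer", "Fresh Foods"],
    "avg_rating": [4.2, 3.5],
})

# A left join keeps every store, even ones with no reviews yet
# (those get a missing avg_rating).
combined = stores.merge(reviews, on="store_name", how="left")
print(combined)
```

A left join seems like the right choice here: a store with no reviews is still a store, and the gaps themselves might be interesting to map.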
That’s a pretty scary-sounding title…but the process was actually super simple. I just want to document it here for my own future reference and to share with others looking to do the same thing.
In my role as Technical Advisor for the Covenant Quarterly, I oversaw the process of applying for our journal’s inclusion in the Directory of Open Access Journals. Indexing the journal there seemed important because the DOAJ is the main directory of open access journals, so it was the logical place to be listed. Once the journal was accepted – which took quite a while! – we had the option to add article-level metadata to that index. Here is our journal page along with the content of the journal – https://doaj.org/toc/2380-8829 Continue reading “Sending Article Level Metadata from OJS to DOAJ”
As a way to document and share my work, I wanted to post this short online tutorial I made about using Google Scholar and the Brandel Library. I manage the data feeds (from SFX and now from EBSCO) that make these library links possible but I also feel like I needed to do more to make these connections apparent to our users. There are a number of reasons for this:
- First, I love Google Scholar and I find it very useful for known item searching. Giving students and faculty another tool seems very helpful.
- Second, given the movement toward Open Access, I think “open” tools like Google Scholar do a better job searching the “gray” content that traditional databases struggle with.
- Lastly, because the connections between Google Scholar and the library are seamless and relatively transparent – which are good things! – some faculty believe that everything is “on Google Scholar” without realizing that the library is providing many of those links. So I think this is an opportunity to demonstrate value and market the library.
The tutorial making process at North Park is really quite nice – we have a dedicated terminal with a high quality microphone and specialized programs like Audacity and Camtasia that make it easy to create high quality tutorials. I’ve done several and am definitely getting better at using these tools – though I still don’t love the sound of my voice!
Through my role on the Commission on Covenant History, I was able to bring Dr. Chris Gehrz to campus for a lecture and a faculty discussion. It was a wonderful set of events and has re-invigorated my exploration of pietism and libraries. Perhaps it would be more fair to say this lecture reinvigorated an exploration of pietism and my personal life…and I hope that I’m able to connect these thoughts to my professional calling as a librarian!
He started his lecture by telling “his story” so I’ve decided to start in the same place – but telling my “stories” of pietism. In reading his book – The Pietist Vision of Higher Education – and over the course of the lecture and faculty discussion, I’ve been able to reframe several key events in my educational past through the pietist lens of “transformation” or “re-birth.” So I’ll try and capture these memories using the language and spirit of pietism. Continue reading “Personal Reflections on Pietism”
An interesting side project that I thought I could write about here. A neighbor asked me about digitizing some “Wire Recordings” that her grandfather had made of her family. I had never heard of this media format so I took to twitter to ask around: Continue reading “Digitizing Wire Recordings”
In addition to providing the raw data to our campus community, I think the library can take a leadership role in providing the tools and expertise to mine this data into something usable and useful. However, many of the tools that are used to transform data are highly specialized and have a pretty steep learning curve. So I’m going to work to provide an overview of the tools available and focus on those that would be useful in the context of undergraduate education. Continue reading “Tools for Data Analysis”
We are talking a lot about data, data literacy, and how North Park University can use Chicago data in the classroom. There are already a lot of courses using data in instruction and research so part of my work is figuring out what is already happening. Continue reading “Chicago Data for Undergraduate Research”
I’m working on improving the interlibrary loan services at North Park as well as improving my skills in statistics and data visualization. I’ve combined these two interests to look at analyzing and visualizing our interlibrary loan data using Tableau. Continue reading “Interlibrary Loan Data Analysis and Visualization”