Electra Atlantis: Digital Approaches to Antiquity


Tom Elliott (tom.elliott@nyu.edu)

This feed aggregator is part of the Planet Atlantides constellation. Its current content is available in multiple webfeed formats, including Atom, RSS/RDF and RSS 1.0. The subscription list is also available in OPML and as a FOAF Roll. All content is assumed to be the intellectual property of the originators unless they indicate otherwise.

September 22, 2014

Charles Ellwood Jones (AWOL: The Ancient World Online)

Open Access Monograph Series: Ur Excavations

Publications of the Joint Expedition of the British Museum and of the University Museum, University of Pennsylvania, Philadelphia, to Mesopotamia: Ur Excavations

Open Access Journal: Classics Convivium Newsletter

[First posted in AWOL 2 January 2011. Updated 22 September 2014 (new URLs)]

Classics Convivium Newsletter
Internationally renowned for its scholarly excellence and its graduate programs, the Department [of Classics, University of Michigan] is also deeply committed to the education of undergraduates at the University. Faculty and students work closely with the Kelsey Museum and its collection of antiquities and the Papyrus Collection in the Harlan Hatcher Graduate Library.
Fall 2013
In this issue:
  • Gabii Update
  • Stamboulidis Fund for Exploring Classical Arts and Culture
  • 2013 Jerome Lecture Series
  • Kate Bosher will be Missed
  • Roman Error Conference
  • Bruce Frier "Retires"
  • Rebecca Sears, ProQuest Distinguished Dissertation Award
  • Carrie Arbour Study Abroad 2013 Scholarships
Spring 2012                      
In this issue:
  • Teaching Medical Terminology
  • Latin for Everybody
  • Platsis Review
  • Francis W. Kelsey
  • First Year Writing Seminar
  • Inter Versiculos
  • CFC Translation
  • Honors Thesis Writers
  • 2012 Else Lecture
Spring 2011
In this issue:
  • Chair’s Letter
  • 2010 Platsis Review
  • Elizabeth Kovach Fund
  • Research
  • Domestic Space in Classical Antiquity
  • P. Asso, E. Heiden & S. Hutchings Senior Honors theses
Spring 2010
In this issue:
  • Constantine Cavafy
  • Letter from the Chair
  • Archaeology Conference
  • Fiat/Chrysler Scholars
  • Gabii Project
  • Greeks and Barbarians
  • Lecture Series

Winter 2009
  • Anatomy Lesson
  • Chair's Letter
  • Platsis Symposium
  • Jerome Lecture
  • The Argument
  • Gabii Project
  • Roma Viva
  • Grad Student Conference
Fall 2008
  • Chicks with Bricks – Warrior Women
  • Colchis
  • Latin Teaching
  • Philomel
  • Faculty & Graduate Student News
Winter 2008
  • From the Chair 
  • Indo-European Language and Culture
  • Platsis Symposium
  • Tapinocyba cameroni 
  • Students at Large


Cultural Heritage Informatics Initiative

CHI Fellow Introduction: Christine Neejer

When we think about business and industry in the nineteenth-century United States, a few archetypes come to mind: the wealthy tycoon, the factory worker, the inventor, and the small business owner. Most of us usually imagine these people as men. This is not an accident, but a result of what we learned in high school and college history courses about nineteenth-century life. Images of young girls working in textile mills may come to mind, but rarely do we picture nineteenth-century women filing patents for their own inventions, running a store, or building complex machinery. Yet, with a little detective work, one can find a variety of sources which showcase the diversity of women’s engagement with business in the nineteenth century.

I uncovered a striking example of this while working on my dissertation. I am a fourth year doctoral student in history. My fields of study are Women’s and Gender History, United States History, and the History of Science and Medicine. My specific area of research is women’s activism in the nineteenth century, and my dissertation is on women’s bicycling practices. I personally became interested in cycling when I moved to Vermont after college. I never considered cycling as a subject worthy of scholarship until I was in my MA program in Women’s and Gender Studies at the University of Louisville. In a seminar on the history of women’s health, my professor encouraged me to think critically about the power of recreation and leisure in women’s lives. It did not take long to find a treasure trove of sources on women’s bicycling in the nineteenth century, but I was surprised to find how few historians and scholars have studied it. My research paper for that class became my MA thesis for the program, which then morphed into the framework for my dissertation at MSU.

Last year when I was working my way through newspaper coverage of women’s cycling in the 1890s, I was surprised to find short articles and brief mentions of women involved in the production side of the bicycling industry. The historiography of women’s cycling is quite limited, and scholars who have worked in this area have largely understood women only as consumers of this new and popular technology. Yet I found evidence that women were not simply buying and riding bikes, but also designing and building them. My project, titled “Wheelwomen at Work,” will provide a digital showcase of women’s involvement in the bicycle industry from the 1880s to the 1910s, the peak years of cycling before the rise of the automobile. I hope to document women’s longstanding efforts in the bicycle industry and provide a glimpse of what is lost when we assume only men were active participants in the business ventures of this era.

Source: Journalism Code, Context & Community

Event Roundup, Sept 22

By Erika Owens


The Online News Association conference takes place September 25-27. OpenNews will be on the Midway, and 2013 Knight-Mozilla Fellow Sonya Song will be presenting.


Know of any upcoming fellowship or conference proposal deadlines? Have an upcoming event? Let us know: source@mozillafoundation.org.

Charles Ellwood Jones (AWOL: The Ancient World Online)

Open Scriptures: Platform for the development of open scriptural linked data and its applications


Open Scriptures seeks to be a comprehensive open-source Web repository for integrated scriptural data and a general application framework for building internationalized social applications of scripture. An abundance of scriptural resources are now available online—manuscripts, translations, and annotations are all being made available by students and scholars alike at an ever-increasing rate. These diverse scriptural resources, however, are isolated from each other and fragmented across the Internet. Thus mashing up the available data into new scriptural applications is not currently possible for the community at large because the resources’ interrelationships are not systematically documented. Open Scriptures aims to establish a scriptural database for interlinked textual resources such as merged manuscripts, the differences among them, and the links between their semantic units and the semantic units of their translations. With such a foundation in place, derived scriptural data like cross-references may be stored in a translation-neutral and internationalized manner so as to be accessible to the community no matter what language they speak or version they prefer.

Open Scriptures is all about Linked Data for scripture. Please watch Tim Berners-Lee’s TED talk on “The next Web of open, linked data.” As Zack Hubert said at BibleTech:2008, “It’s a community effort. Any time anything good happens, it’s because a real cool team of people have come together around an idea.” Open Scriptures seeks to be such a community effort.
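The "translation-neutral" storage idea above can be sketched with plain RDF-style triples. A minimal illustration in Python, with all identifiers, predicate names, and verse references invented for the example (Open Scriptures' actual data model may well differ): a cross-reference is stored once, against the source-language semantic unit, and every translation reaches it through a translation link.

```python
# Minimal sketch of translation-neutral linking, using plain tuples as
# RDF-style triples. All URIs and predicate names are hypothetical.
triples = set()

GRC = "os:sblgnt/John.1.1"   # a Greek semantic unit
ENG = "os:web/John.1.1"      # its English translation unit

triples.add((GRC, "label", "Ἐν ἀρχῇ ἦν ὁ λόγος"))
triples.add((ENG, "label", "In the beginning was the Word"))
triples.add((ENG, "os:translationOf", GRC))
# A cross-reference stored once, against the source-language unit:
triples.add((GRC, "os:crossReference", "os:sblgnt/Gen.1.1"))

def cross_references(unit):
    """Follow any translationOf link back to the source unit, then return
    the cross-references stored there, so every translation sees them."""
    sources = [o for s, p, o in triples if s == unit and p == "os:translationOf"]
    target = sources[0] if sources else unit
    return [o for s, p, o in triples if s == target and p == "os:crossReference"]

print(cross_references(ENG))  # → ['os:sblgnt/Gen.1.1']
```

Because the cross-reference lives on the source-language unit, any number of translations can link to it without duplicating data, which is the accessibility "no matter what language they speak or version they prefer" that the project describes.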

The Ancient Saar Project: London-Bahrain Archaeological Expedition

Robert C Killick, 2007
The excavations at Saar, Bahrain, took place between 1990 and 1999. The work was undertaken by the London-Bahrain Archaeological Expedition which was formed specifically for the purpose. The UK academic supporter of the project was the Institute of Archaeology, University College, London. In Bahrain, the Expedition received the patronage of the Amir, the late Shaikh Isa bin Sulman Al-Khalifa. The project was funded mainly by the business community in Bahrain, as well as by the British Academy and other academic funding bodies (for a full list, see Appendix 1 in Saar Volume 3). The directors were Robert Killick, Jane Moon, and Harriet Crawford (for the period 1990-5).

The site of Saar had been previously excavated by a Jordanian expedition (results unpublished), and it was clear from that work that the settlement was inhabited for part of the Early Dilmun period (late third and early second millennia BC) and then abandoned. This corresponds to the period of Bahrain's involvement in Arabian Gulf trade, when commodities were shipped through Bahrain, from Oman and the Indus, on their way to the cities of southern Babylonia, and vice versa, a procedure documented, if patchily, in the cuneiform records of the time.

Nearly all archaeological work on this period prior to the excavation of Saar had focused on the extensive burial mounds and on isolated temples. The importance of Saar, therefore, resides in the fact that it is the first (and currently only) Early Dilmun settlement to have been investigated in any detail. This importance has been recognised by the Government of Bahrain, which has placed the 'Saar Heritage Park' on the Tentative World Heritage List.

The Saar Settlement

The settlement itself is located on a small but prominent eastern outcrop of a limestone ridge which provides about the only natural elevation in the northern part of Bahrain. Immediately west of the settlement, and on the highest part of the ridge, is the Saar burial field, while to the south there are two cemetery complexes of interconnected graves. The settlement is spread over an estimated area of between 15,000 and 23,000 sq m, of which 7,500 sq m was excavated by the end of the project. Saar is a well laid out settlement with a main street running up from the southeastern outskirts; a temple in the centre at the crossroads of the settlement; and two- and three-roomed buildings, constructed in rows (e.g. Block A) with standard room plans and suites of domestic installations. Over 80 buildings, mainly houses, were investigated by the Expedition, as well as a well and a gypsum kiln. All these are described in detail in Saar Report 3. On the basis of pottery comparisons, the duration of the settlement is currently estimated to be about 250 years, from 2100 to 1850 BC approximately. 


Saar Excavation Report 2
Early Dilmun Seals from Saar, fragments of art and administration
by Harriet Crawford
    Full Report PDF 6 Mb
Saar Excavation Report 3
The Early Dilmun Settlement at Saar  
by Robert Killick and Jane Moon
    Full Report PDF 45 Mb

Database Documentation:

Database Documentation PDF 54 Kb
Entity Relationship Diagram JPG 48 Kb

Database Tables:

Installation codes CSV 1 Kb
Bldg codes CSV 1 Kb
Pottery periods CSV 2 Kb
Room types CSV 1 Kb
Block levels CSV 6 Kb
Bldg measurements CSV 4 Kb
Site period CSV 1 Kb
Pot versus strat CSV 5 Kb

JPG Images:

Bahrain map
Early Dilmun sites in Bahrain
JPG 53 Kb
Cemeteries
The Northern Burial Complex
JPG 266 Kb
Double chamber burials
Two-tier Early Dilmun burials along the southwestern edge of the Saar mound-field
JPG 267 Kb
Honeycomb cemetery
Part of the Southern Burial Complex
JPG 313 Kb
Middle East map
Southwestern Asia
JPG 153 Kb
Saar area map
Archaeological remains in the Saar area
JPG 155 Kb
Settlement limits
Location and extent of the Saar settlement
JPG 118 Kb
Site aerial
The Early Dilmun settlement at Saar from the air, taken in 1993 (S)
JPG 272 Kb
Site and eastwards
The plain to the east of the settlement (E)
JPG 350 Kb

eClassics Forum

Antikythera Shipwreck

This is just as much about the archaeological discoveries as it is about the technological advancements:

Source: http://www.dailymail.co.uk/sciencetech/article-2764986/Could-ancient-computer-lie-beneath-sea-Archaeologists-return-shipwreck-2-200-year-old-Antikythera-mechanism-found.html

(Some parts got lost in the copy and pasting process so for all the goods of the article, including the unveiling of the Exosuit, use the link above.)

Could another ancient computer lie beneath the sea? Archaeologists return to shipwreck where mysterious 2,200-year-old Antikythera mechanism was found

  • Antikythera Mechanism was recovered in 1900 from a shipwreck in Greece
  • It was created in 100BC, and is believed to be the world’s oldest calculator
  • Scans revealed it was used to chart the movement of planets and the passing of days and years
  • Divers are now using a revolutionary suit to further explore the wreckage
  • The Exosuit lets them more than double the depth they can dive at
  • It also means they can grasp, clench and dig for ‘several hours’ at a time
  • Archaeologists are hoping to find other artefacts in and around the wreck - as well as a second shipwreck


More than a century since one of the most remarkable scientific objects of antiquity was discovered, experts are hoping to reveal more secrets of the deep using the latest in diving technology.

Greek and American archaeologists are returning to the ancient shipwreck of Antikythera using the Exosuit - a state-of-the art, deep sea diving suit - that will let them dive to more than double the depths of previous expeditions.

Here, the so-called Antikythera Mechanism, a 2nd-century BC device dubbed the world's oldest computer, was discovered by sponge divers in 1900 off the remote Greek island.


Archaeologists, including Brendan Foley (pictured) are returning to the ancient shipwreck of Antikythera using the Exosuit, which lets them dive to more than double the depths of previous expeditions. It was the site of the Antikythera Mechanism, a 2nd-century BC device dubbed the world's oldest computer, found in 1900


The highly complex mechanism consisting of up to 40 bronze cogs and gears was used by the ancient Greeks to track the cycles of the solar system.

It was so advanced, it took another 1,500 years for an astronomical clock of similar sophistication to be made in Europe.


The Exosuit, built in Canada by Nuytco Research, lets divers reach depths of 492ft (150 metres). 

It is made of aluminium, with 18 joints in the arms and legs. 

The suit is able to supply oxygen for up to 50 hours, and maintains communication with the surface via an optical cable. 

It also has four 1.6 horsepower thrusters on the back to help the diver move around underwater at relatively high speeds.

Each suit weighs between 35 stone (226kg) and 42 stone (272kg).

Prices start at around £360,000 ($588,000).

Now archaeologists returning to the wreck will be able to use the Exosuit to more than double the depth they can dive at, and stay safely at the bottom for longer.

The Exosuit, built in Canada by Nuytco Research, lets divers reach depths of 492ft (150 metres), while still performing delicate tasks, said archaeologist Theotokis Theodoulou.

Up until now, divers had only been able to operate at a depth of 196ft (60 metres). 

The suit, which makes the wearer resemble Buzz Lightyear, ‘expands our capabilities’, continued Mr Theodoulou, and ‘I'll be able to grasp, pluck, clench and dig... for several hours,’ he added.

Archaeologists believe many other artefacts are yet to be discovered in and around the wreck. 

The Mechanism was found with a bronze statue of a youth in the wreck of a cargo ship apparently carrying booty to Rome, and researchers are certain that other items on board still remain to be discovered.

‘We have good signs that there are other objects present,’ said Angeliki Simosi, head of Greece's directorate of underwater antiquities, after exploratory dives in the area in 2012 and 2013. 

‘There are dozens of items left, this was a ship bearing immense riches from Asia Minor,’ added Dimitris Kourkoumelis, another archaeologist on the team.

The Mechanism (pictured) was recovered from a Roman cargo shipwreck off the Greek island of Antikythera. Previous studies have shown it was used to chart the movement of planets and the passing of days and years. Scans in 2008 found that it may also have been used to predict eclipses


The Exosuit (pictured) built in Canada by Nuytco Research, lets divers reach depths of 492ft (150 metres). It is made of aluminium, with 18 joints in the arms and legs. It also has four 1.6 horsepower thrusters on the back to help the diver move around underwater at relatively high speeds


The archaeologists also hope to confirm the presence of a second ship, some 820ft (250 metres) away from the original discovery site.

Antikythera, which now has a population of only 44, was on one of antiquity's busiest trade routes, and a base for Cilician pirates, some of whom once captured and held the young Julius Caesar for ransom.

He later had them all captured and crucified.

The Greek team is assisted by Brendan Foley, a marine archaeologist from the Woods Hole Oceanographic Institution in Massachusetts, which was involved in a dive to the wreck of the Titanic.

Foley has helped in outings to identify ancient shipwrecks over the last five years.

Antikythera (highlighted) which now has a population of only 44, was on one of antiquity's busiest trade routes, and a base for Cilician pirates, some of whom once captured and held the young Julius Caesar for ransom. He later had them all captured and crucified



The Mechanism was recovered in 1900 from the Antikythera wreck - a Roman cargo shipwreck off the Greek island of Antikythera.

It was discovered in a wooden box measuring 13 × 7 × 3.5 inches (340 × 180 × 90 mm), and consists of bronze dials, gears and cogs.

A further 81 fragments have since been found containing a total of 40 hand-cut bronze gears. 

The mechanism is said to have been created in around 100BC, and is believed to be the world’s oldest calculator.

Previous studies have shown that it was used to chart the movement of planets and the passing of days and years.

More than 80 fragments of the Mechanism have been found, containing a total of 40 hand-cut bronze gears (pictured)


Scans of the mechanism in 2008 found that it may also have been used to predict eclipses, and record important events in the Greek calendar, such as the Olympic Games.

Astronomer Professor Mike Edmunds of Cardiff University said at the time: 'It is more complex than any other known device for the next 1,000 years.'

The scans also revealed the mechanism was originally housed in a rectangular wooden frame with two doors, covered in instructions for its use.

At the front was a single dial showing the Greek zodiac and an Egyptian calendar.

On the back were two further dials displaying information about lunar cycles and eclipses.

The calculator would have been driven by a hand crank.

The mechanism recorded several important astronomical cycles, known to the Babylonians hundreds of years earlier, that help predict eclipses.

These include the Saros cycle - a period of around 18 years separating the return of the moon, Earth and sun to the same relative positions. 
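The Saros arithmetic is simple enough to show in a few lines. This sketch is only an illustration of the cycle itself, not of how the mechanism computed it: it uses the whole-day approximation of the period (6,585 days, dropping the remaining ~8 hours, which shift each successive eclipse about a third of the way around the globe). The two dates used below are a well-known pair of total solar eclipses from the same Saros series.

```python
from datetime import date, timedelta

# One Saros cycle is about 6585.32 days (~18 years and 11 days);
# here we keep only the whole-day part for simple date arithmetic.
SAROS_DAYS = 6585

def next_in_saros_series(eclipse: date) -> date:
    """Estimate the date of the next eclipse in the same Saros series."""
    return eclipse + timedelta(days=SAROS_DAYS)

# The total solar eclipse of 11 August 1999 and the 21 August 2017
# eclipse belong to the same Saros series, exactly one cycle apart:
print(next_in_saros_series(date(1999, 8, 11)))  # 2017-08-21
```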

The device could track the movements of Mercury, Venus, Mars, Jupiter and Saturn - the only planets known at the time, the position of the sun, and the location and phases of the moon.

The researchers have been able to read all the month names on a 19-year calendar on the back of the mechanism.

The month names are Corinthian, suggesting that it may have been built in the Corinthian colonies in north-western Greece or at Syracuse in Sicily.

The device was created at a time when the Romans had gained control of much of Greece.

The Mechanism is on display at the National Archaeological Museum in Athens. 

‘We may find one or more monumental statues that were left behind in 1901, in the mistaken belief that they were rocks,’ Foley said.

As well as the new Exosuit, the Antikythera expedition will also use robot mapping equipment and new advanced closed-circuit ‘rebreathers’, which will allow divers much more time underwater.

‘We will have more bottom time than any previous human visitors to the site, because we dive with mixed gas rebreathers,’ the expedition's website said.

‘Each diver will have more than 30 minutes of bottom time per day, and will enjoy greater mental acuity and a larger safety margin than that of previous divers at Antikythera.’


Charles Ellwood Jones (AWOL: The Ancient World Online)

Accademia Fiorentina di Papirologia e Studi sul Mondo Antico Biblioteca On Line

[Originally posted 16 February 2011. Updated 22 September 2014]

This page publishes texts of particular scientific interest.
The documents are free to download.

It is advisable to download the texts as follows: right-click the link and choose "Save target as..." or "Save object as...".

Description de l'Égypte ou recueil des observations et des recherches qui ont été faites en Égypte pendant l'expédition de l'armée française. (Format: PDF)

Antiquités, mémoires. Tome premier.
Pages 1-86 (30 Mb)
Pages 87-172 (30 Mb)
Pages 173-274 (37 Mb)
Pages 275-368 (35 Mb)
Pages 369-438 (26 Mb)
Pages 439-512 (26 Mb)
Pages 513-606
Pages 607-716 (42 Mb)
Pages 717-824 (48 Mb)
Pages 825-870 (18 Mb)

Tome second.
Pages 1-80 (32 Mb)
Pages 81-176 (40 Mb)
Pages 177-252 (37 Mb)
Pages 253-300 (15 Mb)

Antiquités, descriptions. Tome premier.
Pages 1-70 (28 Mb)
Pages 71-143 (30 Mb)
Pages 144-220 (30 Mb)
Pages 221-362 (60 Mb)
Pages 363-487 (51 Mb)
Pages 488-634 (61 Mb)
Pages 635-736 (40 Mb)

Tome second.
Pages 1-80 (34 Mb)
Pages 81-176 (44 Mb)
Pages 177-302 (60 Mb)
Pages 303-452 (71 Mb)
Pages 453-500 (22 Mb)
Pages 501-664 (75 Mb)
Pages 665-746 (38 Mb)

Antiquités, antiquités.
(13 Mb)
(13 Mb)
(10 Mb)
(10 Mb)
(13 Mb)

Atlas Géographique.
(9 Mb)

Lettere dall'Egitto
In memoria di Ermenegildo Pistelli
Società Italiana per la ricerca dei papiri greci e latini in Egitto
Firenze - presso la casa editrice Le Monnier MCMXXVIII
(17 Mb)

Liste des tells et koms à Sebakh
Extrait du Journal Officiel du gouvernement égyptien du samedi 19 février 1910
Le Caire - Imprimerie de L'Institut Français d'Archéologie Orientale - M DCCCC XV
(5 Mb)

La corrispondenza di Heronino nei papiri fiorentini
(Osservazioni e note critiche ai testi)
Degree thesis by Menotti Stanghellini
Supervisor: Prof. V. Bartoletti
Academic Year 1957-58
(35 Mb)

Ernst Kühn, Antinoopolis, Göttingen 1913 (52 Mb)

Alessandro Pini viaggiatore in Egitto (1681 - 1683)
edited by Rosario Pintaudi
Istituto Italiano di Cultura del Cairo 2004
(13 Mb)

Scarabei egiziani da collezioni private, edited by Sara Andrenucci
Istituto Italiano di Cultura del Cairo 2007
(57 Mb)

Rosario Pintaudi Luciano Canfora
PSI Laur. Inv. 22013: Retorica o romanzo?
Istituto Papirologico "G. Vitelli" Firenze 2010
(540 Kb)

Il Castrum Narmoutheos ritrovato a Medinet Madi nel Fayum
Missione archeologica Università di Pisa (2006-2007)
Note by Corresponding Member Edda Bresciani (with Rosario Pintaudi)
Rendiconti dell'Accademia Nazionale dei Lincei
Roma 2009
(1,15 Mb)

H. Zotenberg
Chronique de Jean, évêque de Nikiou. Texte éthiopien.
Paris, Imprimerie Nationale 1883
(20 Mb)

Mia Ridge (Open Objects)

These are a few of my favourite (audience research) things

On Friday I popped into London to give a talk at the Art of Digital meetup at the Photographers' Gallery. It's a great series of events organised by Caroline Heron and Jo Healy, so go along sometime if you can. I talked about different ways of doing audience research. (And when I wrote the line 'getting to know you' it gave me an earworm and a 'lessons from musicals' theme.) It was a talk of two halves. In the first, I outlined different ways of thinking about audience research, then went into a little more detail about a few of my favourite (audience research) things.

There are lots of different ways to understand the contexts and needs different audiences bring to your offerings. You probably also want to test to see if what you're making works for them and to get a sense of what they're currently doing with your websites, apps or venues. It can help to think of research methods along scales of time, distance, numbers, 'density' and intimacy. (Or you could think of it as a journey from 'somewhere out there' to 'dancing cheek to cheek'...)

'Time' refers to both how much time a method asks from the audience and how much time it takes to analyse the results. There's no getting around the fact that nearly all methods require time to plan, prepare and pilot, sorry! You can run 5 second tests that ask remote visitors a single question, or spend months embedded in a workplace shadowing people (and more time afterwards analysing the results). On the distance scale, you can work with remote testers located anywhere across the world, ask people visiting your museum to look at a few prototype screens, or physically locate yourself in someone's office for an interview or observation.

Numbers and 'density' (or the richness of communication and the resulting data) tend to be inversely linked. Analytics or log files let you gather data from millions of website or app users, one-question surveys can garner thousands of responses, you can interview dozens of people or test prototypes with 5-8 users each time. However, the conversations you'll have in a semi-structured interview are much richer than the responses you'll get to a multiple-choice questionnaire. This is partly because it's a two-way dialogue, and partly because in-person interviews convey more information, including tone of voice, physical gestures, impressions of a location and possibly even physical artefacts or demonstrations. Generally, methods that can reach millions of remote people produce lots of point data, while more intimate methods that involve spending lots of time with just a few people produce small datasets of really rich data.

So here are a few of my favourite things: analytics, one-question surveys, 5 second tests, lightweight usability tests, semi-structured interviews, and on-site observations. Ultimately, the methods you use are a balance of time and distance, the richness of the data required, and whether you want to understand the requirements for, or measure the performance of, a site or tool.

Analytics are great for understanding how people found you, what they're doing on your site, and how this changes over time. Analytics can help you work out which bits of a website need tweaking, and measure the impact of changes. But that only gets you so far - how do you know which trends are meaningful and which are just noise? To understand why people are doing what they do, you need other forms of research to flesh out the numbers.

One question surveys are a great way of finding out why people are on your site, and whether they've succeeded in achieving their goals for being there. We linked survey answers to analytics for the last Let's Get Real project so we could see how people who were there for different reasons behaved on the site, but you don't need to go that far - any information about why people are on your site is better than none! 
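Linking survey answers to analytics, as described above, is essentially a join on a shared visitor identifier. Here is a hypothetical sketch (all field names and figures are invented for illustration; real analytics exports will differ) of segmenting an on-site behaviour metric by stated visit reason:

```python
# Hypothetical sketch: joining one-question survey answers to analytics
# sessions by a shared visitor ID, so motivations can be compared with
# on-site behaviour.
surveys = {          # visitor_id -> stated reason for visiting
    "v1": "research",
    "v2": "planning a visit",
}
sessions = [         # simplified analytics rows
    {"visitor_id": "v1", "pages": 12, "used_search": True},
    {"visitor_id": "v2", "pages": 3,  "used_search": False},
    {"visitor_id": "v3", "pages": 1,  "used_search": False},  # no survey answer
]

by_reason = {}
for s in sessions:
    reason = surveys.get(s["visitor_id"])
    if reason is None:
        continue                      # most visitors won't answer the survey
    by_reason.setdefault(reason, []).append(s["pages"])

# Average pages viewed, segmented by stated visit reason
for reason, pages in by_reason.items():
    print(reason, sum(pages) / len(pages))
```

Even this crude segmentation shows why the linkage matters: visitors with different motivations behave very differently, and a site-wide average would hide that.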

5 second tests and lightweight usability tests are both ways to find out how well a design works for its intended audiences. 5 second tests show people an interface for 5 seconds, then ask them what they remember about it, or where they'd click to do a particular task. They're a good way to make sure your text and design are clear. Usability tests take from a few minutes to an hour, and are usually done in person. One of my favourite lightweight tests involves grabbing a sketch, an iPad or laptop and asking people in a café or other space if they'd help by testing a site for a few minutes. You can gather lots of feedback really quickly, and report back with a prioritised list of fixes by the end of the day. 

Semi-structured interviews use the same set of questions each time to ensure some consistency between interviews, but they're flexible enough to let you delve into detail and follow any interesting diversions that arise during the conversation. Interviews and observations can be even more informative if they're done in the space where the activities you're interested in take place. 'Contextual inquiry' goes a step further by including observations of the tasks you're interested in being performed. If you can 'apprentice' yourself to someone, it's a great way to have them explain to you why things are done the way they are. However, it's obviously a lot more difficult to find someone willing and able to let you observe them in this way, it's not appropriate for every task or research question, and the data that results can be so rich and dense with information that it takes a long time to review and analyse. 

And one final titbit of wisdom from a musical - always look on the bright side of life! Any knowledge is better than none, so if you manage to get any audience research or usability testing done then you're already better off than you were before.

The Signal: Digital Preservation

18 Years of Kairos Webtexts: An interview with Douglas Eyman & Cheryl E. Ball

Cheryl E. Ball

Cheryl E. Ball, associate professor of digital publishing studies at West Virginia University, is editor of Kairos

Since 1996 the electronic journal Kairos has published a diverse range of webtexts, scholarly pieces made up of a range of media and hypermedia. The 18 years of digital journal texts are interesting both in their own right and as a collection of complex works of digital scholarship that illustrate a range of sophisticated issues for ensuring long-term access to new modes of publication. Douglas Eyman, Associate Professor of Writing and Rhetoric at George Mason University, is senior editor and publisher of Kairos. Cheryl E. Ball, associate professor of digital publishing studies at West Virginia University, is editor of Kairos. In this Insights Interview, I am excited to learn about the kinds of issues that this body of work exposes for considering long-term access to born-digital modes of scholarship. [There was also a presentation on Kairos at the Digital Preservation 2014 meeting.]

Trevor: Could you describe Kairos a bit for folks who aren’t familiar with it? In particular, could you tell us a bit about what webtexts are and how the journal functions and operates?

Doug: Webtexts are texts that are designed to take advantage of the web-as-concept, web-as-medium, and web-as-platform. Webtexts should engage a range of media and modes and the design choices made by the webtext author or authors should be an integral part of the overall argument being presented. One of our goals (that we’ve met with some success I think) is to publish works that can’t be printed out — that is, we don’t accept traditional print-oriented articles and we don’t post PDFs. We publish scholarly webtexts that address theoretical, methodological or pedagogical issues which surface at the intersections of rhetoric and technology, with a strong interest in the teaching of writing and rhetoric in digital venues.



(As an aside, there was a debate in 1997-98 about whether or not we were publishing hypertexts, which then tended to be available in proprietary formats and platforms rather than freely on the WWW; founding editor Mick Doherty argued that we were publishing much more than only hypertexts, so we moved from calling what we published ‘hypertexts’ to ‘webtexts’ — Mick tells that story in the 3.1 loggingon column).

Cheryl: WDS (What Doug said ;) One of the ways I explain webtexts to potential authors and administrators is that the design of a webtext should, ideally, enact authors’ scholarly arguments, so that the form and content of the work are inseparable.

Doug: The journal was started by an intrepid group of graduate students, and we’ve kept a fairly DIY approach since that first issue appeared on New Year’s day in 1996. All of our staff contribute their time and talents and help us to publish innovative work in return for professional/field recognition, so we are able to sustain a complex venture with a fairly unique economic model where the journal neither takes in nor spends any funds. We also don’t belong to any parent organization or institution, and this allows us to be flexible in terms of how the editors choose to shape what the journal is and what it does.

Cheryl: We are lucky to have a dedicated staff who are scattered across (mostly) the US: teacher-scholars who want to volunteer their time to work on the journal, and who implement the best practices of pedagogical models for writing studies into their editorial work. At any given time, we have about 25 people on staff (not counting the editorial board).

Doug: Operationally, the journal functions much like any other peer-reviewed scholarly journal: we accept submissions, review them editorially, pass on the ones that are ready for review to our editorial board, engage the authors in a revision process (depending on the results of the peer-review) and then put each submission through an extensive and rigorous copy-, design-, and code-editing process before final publication. Unlike most other journals, our focus on the importance of design and our interest in publishing a stable and sustainable archive mean that we have to add those extra layers of support for design-editing and code review: our published webtexts need to be accessible, usable and conform to web standards.

Trevor: Could you point us to a few particularly exemplary works in the journal over time for readers to help wrap their heads around what these pieces look like? They could be pieces you think are particularly novel or interesting or challenging or that exemplify trends in the journal. Ideally, you could link to it, describe it and give us a sentence or two about what you find particularly significant about it.

Cheryl: Sure! We sponsor an award every year for Best Webtext, and that’s usually where we send people to find exemplars, such as the ones Doug lists below.

Doug: From our peer-reviewed sections, we point readers to the following webtexts (the first two are especially useful for their focus on the process of webtext authoring and editing):

Cheryl: From our editorially (internally) reviewed sections, here are a few other examples:

Trevor: Given the diverse range of kinds of things people might publish in a webtext, could you tell us a bit about the kinds of requirements you have enforced upfront to try and ensure that the works the journal publishes are likely to persist into the future? For instance, any issues that might come up from embedding material from other sites, or running various kinds of database-driven works or things that might depend on external connections to APIs and such.

Doug: We tend to discourage work that is in proprietary formats (although we have published our fair share of Flash-based webtexts) and we ask our authors to conform to web standards (XHTML or HTML5 now). We think it is critical to be able to archive any and all elements of a given webtext on our server, so even in cases where we’re embedding, for instance, a YouTube video, we have our own copy of that video and its associated transcript.

One of the issues we are wrestling with at the moment is how to improve our archival processes so we don’t rely on third-party sites. We don’t have a streaming video server, so we use YouTube now, but we are looking at other options because YouTube allows large corporations to apply bogus copyright-holder notices to any video they like, regardless of whether there is any infringing content (as an example, an interview with a senior scholar in our field was flagged and taken down by a record company; there wasn’t even any background audio that could account for the notice. And since there’s a presumption of guilt, we have to go through an arduous process to get our videos reinstated.) What’s worse is when the video *isn’t* taken down, but the claimant instead throws ads on top of our authors’ works. That’s actually copyright infringement against us that is supported by YouTube itself.

Another issue is that many of the external links in works we’ve published (particularly in older webtexts) tend to migrate or disappear. We used to replace these, where we could, with links to archive.org (aka the Wayback Machine), but we’ve discovered that their archive is corrupted because they allow anyone to remove content from their archive without reason or notice.[1] So, despite its good intentions, it has become completely unstable as a reliable archive. But we don’t, alas, have the resources to host copies of everything that is linked to in our own archives.

Cheryl: Kairos holds the honor within rhetoric and composition of being the longest-running, and most stable, online journal, and our archival and technical policies are a major reason for that. (It should be noted that many potential authors have told us how scary those guidelines look. We are currently rewriting the guidelines to make them more approachable while balancing the need to educate authors on their necessity for scholarly knowledge-making and -preservation on the Web.)

Of course, given that this field is grounded in digital technology, not being able to use some of that technology in a webtext can be a rather large constraint. But our authors are ingenious and industrious. For example, for their 2011 webtext, “The Facebook Papers,” Deborah Balzhiser et al. created an HTML-based interface that mimicked Facebook’s. Their self-made interface allowed them to do some rhetorical work in the webtext that Facebook itself wouldn’t have allowed. Plus, it meant we could archive the whole thing on the Kairos server in perpetuity.

Trevor: Could you give us a sense of the scope of the files that make up the issues? For instance, the total number of files, the range of file types you have, the total size of the data, and or a breakdown of the various kinds of file types (image, moving image, recorded sound, text, etc.) that exist in the run of the journal thus far?

Doug: The whole journal is currently around 20 GB — newer issues are larger in terms of data size because there has been an increase in the use of audio and video (luckily, HTML and CSS files don’t take up a whole lot of room, even with a lot of content in them). At last count, there are 50,636 files residing in 4,545 directories (this count includes things like all the system files for WordPress installs and so on). A quick summary of primary file types:

  • HTML files: 12,247
  • CSS: 1,234
  • JPG files: 5,581
  • PNG: 3,470
  • GIF: 7,475
  • MP2/3/4: 295
  • MOV: 237
  • PDF: 191

Cheryl: In fact, our presentation at Digital Preservation 2014 this year [was] partly about the various file types we have. A few years ago, we embarked on a metadata-mining project for the back issues of Kairos. Some of the fields we mined for included Dublin Core standards such as MIMEtype and DCMIType. DCMIType, for the most part, didn’t reveal too much of interest from our perspective (although I am sure librarians will see it differently!! :) but the MIMEtype search revealed both the range of filetypes we had published and how that range has changed over the journal’s 18-year history. Every webtext has at least one HTML file. Early webtexts (from 1996-2000ish) that have images generally have GIFs and, less prominently, JPEGs. But since PNGs rose to prominence (becoming an international standard in 2003), we began to see more and more of them. The same with CSS files around 2006, after web-standards groups started enforcing their use elsewhere on the Web. As we have all this rich data about the history of webtextual design, and too many research questions to cover in our lifetimes, we’ve released the data in Dropbox (until we get our field-specific data repository, rhetoric.io, completed).
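[Editor's note: the kind of file-type census described here can be approximated with a short script. The sketch below is not the journal's actual tooling; it simply walks a directory tree and tallies files by the MIME type guessed from each extension, which would yield counts comparable to the per-type breakdown above.]

```python
# A rough sketch of a file-type census: walk an archive directory and
# tally files by MIME type guessed from the file extension.
# Not the Kairos team's actual tooling; the root path is supplied by the caller.
import mimetypes
import os
from collections import Counter

def mime_census(root):
    """Return a Counter mapping MIME type -> number of files under root."""
    counts = Counter()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            mime, _encoding = mimetypes.guess_type(name)
            counts[mime or "unknown"] += 1
    return counts
```

Run against an archive root, this would report buckets such as `text/html`, `text/css`, and `image/gif`, much like the summary above.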

Trevor: In the 18 years that have transpired since the first issue of Kairos a lot has changed in terms of web standards and functionality. I would be curious to know if you have found any issues with how earlier works render in contemporary web browsers. If so, what is your approach to dealing with that kind of degradation over time?

Cheryl: If we find something broken, we try to fix it as soon as we can. There are lots of 404s to external links that we will never have the time or human resources to fix (anyone want to volunteer??), but if an author or reader notifies us about a problem, we will work with them to correct the glitch. One of the things we seem to fix often is repeating backgrounds. lol. “Back in the days…” when desktop monitors were tiny and resolutions were tinier, it was inconceivable that a background set to repeat at 1200 pixels would ever actually repeat. Now? Ugh.
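[Editor's note: auditing those external links can at least be partly automated. The sketch below is hypothetical, not the Kairos workflow; it collects the absolute http(s) links from an HTML document so that each can afterwards be checked for a 404.]

```python
# A minimal sketch of the first step of a link audit: pull external
# hrefs out of an HTML document so dead links can be checked later.
# Hypothetical helper, not the journal's actual workflow.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record only absolute http(s) anchors; skip fragments and relative paths.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and value.startswith(("http://", "https://")):
                    self.links.append(value)

def external_links(html_text):
    """Return all absolute http(s) links found in an HTML string."""
    collector = LinkCollector()
    collector.feed(html_text)
    return collector.links
```

Each collected URL could then be requested (or looked up in a local mirror) to flag the 404s for human review.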

But we do not change designs for the sake of newer aesthetics. In that respect, the design of a white-text-on-black-background from 1998 is as important a rhetorical point as the author’s words in 1998. And, just as the ideas in our scholarship grow and mature as we do, so do our designs, which have to be read in the historical context of the surrounding scholarship.

Of course, with the bettering of technology also comes our own human degradation in the form of aging and poorer eyesight. We used to mandate webtexts not be designed over 600 pixels wide, to accommodate our old branding system that ran as a 60-pixel frame down the left-hand side of all the webtexts. That would also allow for a little margin around the webtext. Now, designing for specific widths — especially ones that small — seems ludicrous (and too prescriptive), but I often find myself going into authors’ webtexts during the design-editing stage and increasing their typeface size in the CSS so that I can even read it on my laptop. There’s a balance I face, as editor, of retaining the authors’ “voice” through their design and making the webtext accessible to as many readers as possible. Honestly, I don’t think the authors even notice this change.

Trevor: I understand you recently migrated the journal from a custom platform to the Open Journal System platform. Could you tell us a bit about what motivated that move and issues that occurred in that migration?

Doug: Actually, we didn’t do that.

Cheryl: Yeah, I know it sounds like we did from our Digital Preservation 2014 abstract, and we started to migrate, but ended up not following through for technical reasons. We were hoping we could create plug-ins for OJS that would allow us to incorporate our multimedia content into its editorial workflow. But it didn’t work. (Or, at least, wasn’t possible with the $50,000 NEH Digital Humanities Start-Up Grant we had to work with.) We wanted to use OJS to help streamline and automate our editorial workflow (you know, the parts about assigning reviewers and copy-editors, etc.) and as a way to archive those processes.

I should step back here and say that Kairos has never used a CMS; everything we do, we do by hand: manually SFTPing files to the server, manually making copies of webtext folders in our kludgy way of version control, using YahooGroups for all staff and reviewer conversations (because it was the only thing going in 1998, when we needed a mail system to archive all of our collaborative editorial board discussions), and so on. Not because we like being old school, but because there were always too many significant shortcomings with any out-of-the-box systems given our outside-the-box journal. So the idea of automating, and archiving, some of these processes in a centralized database such as OJS was incredibly appealing. The problem is that OJS simply can’t handle the kinds of multimedia content we publish. And rewriting the code-base to accommodate any plug-ins that might support this work was not in the budget. (We’ve written about this failed experiment in a white paper for NEH.)

[1] Archive.org will obey robots.txt files if they ask not to be indexed. So, for instance, early versions of Kairos itself are no longer available on archive.org because such a file is on the Texas Tech server where the journal lived until 2004. We put that file there because we want Google to point to the current home of the journal, but we actually would like that history to be in the Internet Archive. You can think of this as just a glitch, but here’s the more pressing issue: if I find someone has posted a critical blog post about my work, and I ever get hold of the domain it was originally posted at, I can take it down there *and* retroactively make it unavailable on archive.org, even if it used to show up there. Even without such nefarious purposes, the constant trade in domains and site locations means that no researcher can trust that archive when using it for history or any kind of digital scholarship.
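[Editor's note: the mechanism described in this footnote is the standard robots exclusion file. A robots.txt like the sketch below (the path is hypothetical) at the journal's former host would steer search-engine crawlers away from the old copy and, as a side effect of the Wayback Machine's retroactive honouring of robots.txt, also withhold the archived history of that site.]

```
# Hypothetical robots.txt at the journal's former host:
# blocks all crawlers from the old journal directory.
User-agent: *
Disallow: /kairos/
```

The trade-off the footnote describes follows directly: one directive intended to redirect search traffic also suppresses the archival record.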

Bill Caraher (The New Archaeology of the Mediterranean World)

A Visit to a Pallet Plant

Ok. I admit that it wasn’t exactly a pallet plant since the company no longer manufactures pallets there, but pallet plant is alliterative (in the most crass way) so more suitable than “a visit to a pallet redistribution facility.”



The guys at API Pallets here in Grand Forks were very generous with their time even though Bret Weber and I encroached on their lunch break on a rainy Friday afternoon. They showed us around their facility and explained that pallets come in on trucks from “Canada” and are rated and then shipped out to clients throughout the region. They get a small quantity of pallets from local merchants, like the local pasta plant, but most of their inventory comes from other distribution centers. Their biggest client is a logistics firm in Casselton, ND, situated on an important transportation corridor for rail and truck traffic through the northern plains.


As for the pallets themselves, we learned that API rates pallets with three grades. A1 pallets are clean, have no splintering or splitting, and have evenly spaced deck boards. One of the most interesting moments involved the guy using his fist to demonstrate the ideal width between deck boards. I’ll return to this. B1 and B2 pallets have light damage or have repairs. Irregularly spaced deck boards, the insertion of blocks to support broken stringers, or obvious splitting and splintering throughout leads to lower ratings. The difference in price between an A1 and a B1/2 pallet is about $3. They do repair pallets on site to raise them to either A1 or B1/2 grade.


One thing Bret and I began to think about is the way in which the size of a pallet (48 x 40 inches) has impacted life in the Bakken (and elsewhere). For example, modular housing units like those most common in the Bakken are designed to move by rail or truck. Pallets, of course, are designed to fit inside containers, semi trailers, and rail cars and move about the country carrying standardized loads. The existence of this regular unit of measure and the tendency in the Bakken to use this scale to organize human activities, whether it is life or work, provides a highly visible means of standardizing the space of human activities.

It was heartening, then, to see the guy at the pallet plant use his fist to measure the distance between the deck boards. This gesture returned the pallet to the human scale.


The guys there also commented on the various stamps added to pallets to mark them as being used at a particular farm or factory. Since the pallet pool is an open pool, meaning that whoever possessed the pallets had the right to resell them, these stamps were meant to mark out simply one stage in the pallet’s life and to manipulate the standardized form of the object without compromising its functionality.

Finally, our reuse of pallets is important because it defies the functional expectations of these objects and reshapes them to our human existence rather than the opposite.



Archeomatica: Tecnologie per i Beni Culturali

Digital Museum Expo, in Rome from 24 to 28 September 2014

As part of the exhibition The Keys of Rome – The City of Augustus, opening 24 September at the Museo dei Fori Imperiali – Mercati di Traiano, the Digital Museum Expo will take place: a travelling exhibition dedicated to the most innovative technological solutions created for museums, aimed at professionals and operators in the sector. “The Expo will allow professionals and visitors from four countries – Italy, the Netherlands, Bosnia and Egypt – to experience the past, in its buildings, its landscapes and its daily life, through different kinds of virtual museums.”

Master’s in Nanotechnologies and Nanomaterials for Cultural Heritage

A new edition of the second-level master’s programme “Expert Researcher in Nanotechnologies and Nanomaterials for Cultural Heritage” is starting at the Università degli Studi di Palermo, based at the Dipartimento di Energia, Ingegneria dell’Informazione e Modelli Matematici.

Master’s in Museum Enhancement and Communication of Indoor and Outdoor Cultural Heritage

Applications are now open for the first edition of the first-level master’s programme “Expert in Museum Enhancement and Communication of Indoor and Outdoor Cultural Heritage” at the Università degli Studi di Palermo, based at the Polo Didattico di Agrigento.

Charles Ellwood Jones (AWOL: The Ancient World Online)

Open Access Journal: Clara Rhodos. Studi e materiali pubblicati a cura dell' Istituto Storico-Archeologico di Rodi

Clara Rhodos. Studi e materiali pubblicati a cura dell' Istituto Storico-Archeologico di Rodi
The Clara Rhodos series, comprising ten volumes published between 1928 and 1941, presents the research and excavations in the Dodecanese, chiefly on Rhodes, Kos, Chalki and Nisyros, during the period of Italian rule. The series was published by the FERT institute, founded by the Italian archaeologists in 1927. After the incorporation of the Dodecanese into Greece in 1948, FERT's responsibilities were transferred to the Greek Archaeological Service, specifically to the Archaeological and Historical Institute of Rhodes, which in 2003 was renamed the Archaeological Institute of Aegean Studies.

Clara Rhodos I: A. Maiuri – G. Jacopich, Rapporto generale sul servizio archeologico a Rodi e nelle isole dipendenti dall’anno 1912 all’anno 1927, Bergamo 1928
Clara Rhodos II: A. Maiuri, Monumenti di scultura del Museo archeologico di Rodi I, Bergamo 1932
Clara Rhodos III: G. Jacopi, Scavi nella necropoli di Jalisso, 1924-1928, Bergamo 1929
Clara Rhodos IV: G. Jacopi, Esplorazione archeologica di Camiro I, Bergamo 1931
Clara Rhodos V.1: G. Jacopi, Monumenti di scultura del Museo archeologico di Rodi II, Bergamo 1931
Clara Rhodos V.2: G. Jacopi, Monumenti di scultura del Museo archeologico di Rodi III, Bergamo 1932
Clara Rhodos VI-VII: G. Jacopi, Esplorazione archeologica di Camiro II, Necropoli, Acropoli, Bergamo 1932-3
Clara Rhodos VIII: 1. L. Laurenzi, Necropoli ialisie (Scavi dell’ anno 1934). 2. P. E. Arias, “Pelike” con amazzonomachia dell’ “Antiquarium” di Coo. 3. M. Segre, Dedica votiva dell’ equipaggio di una nave rodia. 4. P. Lojacono, La chiesa conventuale di S. Giovanni dei Cavalieri in Rodi. 5. P. Lojacono, Il Palazzo del Gran Maestro in Rodi, Bergamo 1936.
Clara Rhodos IX: 1. L. Laurenzi, Monumenti di scultura del Museo Archeologico di Rodi – IV; e dell’ Antiquarium di Coo - II. 2. E. Paribeni, Due vasi del Museo Archeologico di Rodi. 3. G. Levi Della Vida, Una bilingue Greco-Nabatea a Coo. 4. M. Segre, La legge ateniese sull’ unificazione della Moneta. 5. M. Segre, Iscrizioni di Licia. 6. S. Accame, Un nuovo decreto di Lindo del V Sec. A.C., Bergamo 1938.
Clara Rhodos X: 1. L. Laurenzi, Ritratto di un principe ellenistico. 2. L. Laurenzi, Statuetta acefala di Cleobulo Lindio. 3. L. Laurenzi, Iscrizioni dell’ Asclepieo di Coo. 4. G. Monaco, Scavi nella zona micenea di Jaliso (1935-1936). 5. M. C. De Azevedo, Una oinochoe della necropoli di Jaliso. 6. A. Degrassi, Iscrizioni latine inedite di Coo, Bergamo 1941.

Perseus Digital Library Updates

The Digital Loeb Classical Library — a view from Europe

The full text of “the Digital Loeb Classical Library — a view from Europe,” is available here.

Summary: The Digital Loeb Classical Library has gone live and many students of Greek and Latin are testing it. “The Digital Loeb Classical Library — a view from Europe” considers some of the issues that the new DLCL raises. First, there is the general question of how long the community will support new, proprietary systems, each with their own environment, none releasing their data under an open license, and all incompatible, for all practical purposes, with each other. More generally, this essay explores three issues that the DLCL raises in a European context: (1) the problem of depending upon, and actively supporting, commercial sources of Greek and Latin, especially in Europe, where tax dollars support virtually all professional intellectual life; (2) the problem of using English if we want to reach secondary schools — only about 5% (probably less) of those who study Greek and Latin do so in English; (3) the problem of encouraging students to produce annotations that are keyed to the idiosyncratic page breaks that appear only in the Loeb editions (and thus of implicitly making the Loeb a new standard for citation). Overall, the DLCL is yet another publisher’s portal, solid in implementation and not challenging to use, but dependent on models from print, such as monopoly control of content to extract subscriptions and the print page as the dominant metaphor.

The study of Greek and Latin needs to build upon what we already can see is possible in a digital space and to move forward if we are to offer a truly competitive discipline to new generations of students and to the general public. Some of the issues and opportunities before us are raised in the call for papers in Greek and Latin in an Age of Open Data, but there are many fora in which to discuss how to move forward. It is time for students of Greek and Latin to get on with it and accelerate the transition to a more open, sustainable and dynamic environment by which to advance the role of Greco-Roman culture in the intellectual life of society.

Archeomatica: Tecnologie per i Beni Culturali

The CNR inaugurates a new centre for Digital Humanities

The CNR's Institute for the History of Modern Philosophical and Scientific Thought (Ispf) inaugurates the new Centre for Digital Humanities, which gathers into a single structure the Institute's e-publishing, digital processing, and online development activities.

The investigators of art and archaeology: a series of talks in Turin

In Turin, in the Aula Magna of the Palazzo del Rettorato, a series of talks is scheduled under the title "Gli investigatori dell'arte e dell'archeologia - Un'indagine per svelare i segreti della storia e degli artisti" ("The investigators of art and archaeology: an investigation to reveal the secrets of history and of artists").
The event is organised by the Fondazione Fondo Ricerca e Talenti, the first Piedmontese university foundation, through its funding and support of students and non-tenured researchers in science communication and outreach initiatives.

September 21, 2014

Alliance of Digital Humanities Organizations

Call for Hosting Conferences: The LLOH Prize (2015)

The late EADH chair (2010-2012), Lisa Lena Opas-Hänninen, attended conferences not only in the digital humanities but also in other disciplines. She was invariably interested in and encouraging of young scholars in particular, and she also spent a great deal of time in informal conversation with a wide range of colleagues. The Lisa Lena Opas-Hänninen Young Scholar Prize was established in 2013 to honour her memory. The LLOH Prize is awarded to early-career scholars, that is, students, graduate students, or postdoctoral researchers at different conferences each year.

Any individual member of any of the ADHO constituent organizations may submit proposals to the Awards Committee chair for conferences taking place in the following year. This call is specific to conferences in 2015. Individual members are encouraged (but not required) to seek the endorsement of a constituent organization.

Proposals should clarify why the conference is likely to include contributions to digital humanities. Eligible conferences may include those in sub-disciplines in which digital techniques have not yet achieved widespread acceptance. Special consideration should be given to proposals that encourage a diverse pool of applicants, addressing matters of cultural, linguistic, ethnic, and gender diversity. Proposals may ask for funding for one or two prizes and, additionally, a reception at which the prizes are awarded. At the reception, the history and sponsorship of the prizes should be explained.

The proposal should identify the conference (dates, venue, web site), the sort of contribution which is to be recognized (paper, poster, etc.), how the winner or winners are to be selected, who will present the award and explain its background, and the total budget. The budget may not exceed €1500 in total if two prizes are to be awarded or €750 if one prize is to be awarded. The budget includes €500 for each winner to defray the costs of travel, lodging and conference registration and up to €250 (one prize) or €500 (two prizes) for a reception. The awards committee selects the single best proposal for awarding the prize(s) at a given conference. The committee will give preference to proposals from constituent organizations that have not recently been awarded a LLOH Prize.
The 2014 LLOH Prize was awarded at the Methods in Dialectology XV conference: http://methodsxv.webhosting.rug.nl

More information about the prize can be found at the webpage of the ADHO Awards Committee: http://adho.org/awards/lisa-lena-opas-hänninen-young-scholar-prize

Please feel free to write to the committee chair with any enquiries: oyvind.eide@uni-passau.de
Deadline: 10 October 2014

Jason Heppler (History in the Digital)

CShapes: Historical Country Boundaries →

Thanks to my colleague (and co-host) Elijah Meeks for pointing me to a project by Nils Weidmann, who has put together CShapes – an R package and GIS shapefile of country boundaries and capitals between 1946 and 2008.

Tom Gewecke (Multilingual Mac)

3rd Party Keyboards for iOS 8

I have installed one of the 3rd party custom keyboards which have become possible with iOS 8 -- Minuum.  This article mentions some others which should be available. If readers come across any which are useful for languages other than English, let me know.   So far: +Sangam for Kannada, Malayalam, Tamil, Telugu, with suggestions, autocorrect, and next-word prediction. +Georgian Keyboard for

New Keyboards and System Languages in iOS 8

Apple has added keyboards for English (India), Bengali, Filipino, Marathi, Slovenian, and Urdu in the new version of iOS released 9/17/14. The settings formerly available for using additional hardware-type keyboard layouts with iOS devices are gone.   For example, under English these were:  us, dvorak, colemak, us international-pc, us extended, british, french, german, spanish-iso

Charles Ellwood Jones (AWOL: The Ancient World Online)

Open Access Journal: Hortulus: The Online Graduate Journal of Medieval Studies

 [First posted in AWOL 13 February 2010. Updated 20 September 2014]

Hortulus: The Online Graduate Journal of Medieval Studies

Hortulus: The Online Graduate Journal of Medieval Studies is a multidisciplinary refereed postgraduate journal devoted to the literatures, cultures, and ideas of the medieval world. Published electronically twice a year, its mission is to present a forum in which graduate students from around the globe may share their work.
Hortulus has an open submission policy, so submissions are accepted throughout the year. Graduate students are welcome to submit previously unpublished articles that challenge our readers to look at the Middle Ages from a variety of perspectives by engaging in new theories and interdisciplinary research. All articles should be submitted via email; submission guidelines can be found here.
We publish a themed issue each spring, and a general issue each autumn. Calls for papers for each issue can be found on the main page of the website. There are also various position openings in Hortulus throughout the year, so watch our social media outlets for information.

September 20, 2014

Roger Pearse (Thoughts on Antiquity, Patristics, putting things online, and more)

H.V.Morton on Gregory the Great and the deserted Palatine

This morning I read these words:

I descended the noble steps [from the church of St Gregory on the Caelian hill].  Every day of his life, I reflected, St Gregory while in Rome, and before he went to live at the Lateran Palace as Pope, must have seen the Colosseum; a few paces would take him past the Circus Maximus, already weed-grown and deserted, above which rose the imperial palaces, unoccupied for centuries but still capable of housing a stray Exarch from Ravenna.  The last time they received an emperor was twenty-five years after Gregory’s death, in 629, when Heraclius visited Rome and was invested with the diadem in the throne room on the Palatine.  What a ghostly moment that must have been; for the middle ages were ready to be born.

These words are from H.V. Morton, A Traveller in Rome, published in 1957.[1]

I know nothing of that visit to Rome by Heraclius, I must say, but that portrait in words moves me to find out.  Which, in a way, says that the book is doing its job!

I’m reading the book because it’s a gentle, restful book to read.  For those unfamiliar with them, Morton’s books are a mixture of personal observation and material rewritten from books such as the popularisations of Lanciani, and are perfectly targeted at the educated but non-specialist reader.   They are uneven; but the best are very good indeed.

But it is a wistful experience, reading Morton’s Through Lands of the Bible, where he travels through Palestine and Iraq in the 1930s.  It is a portrait of a peaceful, quiet world.  Under the rule of the honest, efficient colonial powers, the region knew the first enlightened, progressive, civilised government that it had ever had.

How sad that it was also the last.  I am by no means anti-American, but America has been the dominant power in the region since WW2, and the policies pursued by its ruling class, often well-intentioned but invariably counter-productive, have condemned its inhabitants to ceaseless, pointless strife, poverty and misery.

Let us take up the books written in better days, and dream of a better world than our own.

UPDATE: Later in the book Morton refers to a visit by Constans II to stay on the Palatine, some 20 years later than Heraclius.  I have a feeling that his books were serialized, which may explain why they can seem episodic at times, and how mistakes like this slipped through!

  1. [1] Published by Methuen; in the 1984 paperback reprint this is p. 208.

Managing the photocopies!

Alright.  Confess.  Is there anyone who does NOT have a large pile of photocopies of articles, book excerpts, and even complete books, somewhere in their house or study area?  No?  I thought not. Dratted nuisance, aren’t they?

Clearing the decks!

Years ago I used to file them, in hanging folders in filing cabinets.  This week I have been emptying a drawer of such copies.  Most of these were on A3 paper, so very hard to scan; but I simply drew a trimmer down the middle and scanned them in anyway.  And then, most importantly, I threw away the paper.  And the hangers.

At this moment I am going through a pile of off-prints, and guillotining the spines and shoving them through my document scanner.  They scan beautifully.  And … I am throwing the paper away.  The PDFs that I get from the scanner I make searchable, and then, for once, I can use them.

It’s a bit nostalgic, in a way.  I’m finding papers that I ordered in 2001, via my local library.  This was before PDFs were commonplace.  The library charged a substantial sum per paper, and it arrived in weeks, not days.  In those days it was the only available method to obtain a copy of anything.  Now … we have electronic methods.  It’s not so long ago, and yet it’s a different world.

Most of the papers relate to my interest in Tertullian.  I’m scanning in a bunch of copies of the Chronica Tertullianea et Cyprianea as I type – the key bibliography for Latin ante-Nicene patristics.  They will be far easier to search in PDF form!

Also found were a bunch of papers by Canadian academic James Carley, about the English antiquary John Leland.  Leland lived in the times of Henry VIII, when the monasteries were being suppressed, and inspected their libraries.  Many volumes from English monasteries went overseas; most were destroyed.  A post on his work might not go amiss, perhaps.

Meanwhile, I need to scan some more stuff and declutter!  It’s a good task for a rainy day.

Have you purged your filing cabinet lately?

Charles Ellwood Jones (AWOL: The Ancient World Online)

Perseus: Announcing the Arethusa Annotation Framework

Announcing the Arethusa Annotation Framework
Developers Gernot Höflechner, Robert Lichtensteiner and Christof Sirk, in collaboration with the Perseus Digital Library at Tufts (via the Libraries and the Transformation of the Humanities and Perseids projects) and the University of Leipzig’s Open Philology Project, have released Arethusa, a framework for linguistic annotation and curation. Arethusa was inspired by and extends the goals of the Alpheios Project, to provide a highly configurable, language-independent, extensible infrastructure for close-reading, annotation, curation and exploration of open-access digitized texts. While the initial release highlights support for morpho-syntactic annotation, Arethusa is designed to allow users to switch seamlessly between a variety of annotation and close-reading activities, facilitating the creation of sharable, reusable linguistic data in collaborative research and pedagogical environments.
Arethusa is built on the angular.js javascript web application framework and provides a back-end independent infrastructure for accessing texts, annotations and linguistic services from a variety of sources. Extensibility is a guiding design goal — Arethusa includes tools for automatic generation of skeleton code for new features as plugins; detailed development guides are also currently in progress. We hope others will be able to reuse and build upon the platform to add support for other annotation types, languages and back-end repositories and workflow engines.
Arethusa is already deployed as a component of the Perseids platform, where it provides an annotation interface for morpho-syntactic analyses and will soon also act as a broker between the Perseids back-end (the Son of SUDA Online application) and various other front-end annotating and editing activities, including translation alignments, entity identification and text editing.
Screencasts are available that show how the Arethusa application can be used for syntactic diagram (treebank) and morphological analysis annotations on Perseids. Additional demos and slides will be made available soon which highlight additional features along with the architecture and design.
This project has been made possible in part by the Institute of Museum and Library Services (Award LG0611032611), the Andrew W. Mellon Foundation and the European Social Fund. We also are indebted to Robert Gorman and Vanessa Gorman of the University of Nebraska and Giuseppe G. A. Celano of the University of Leipzig for their invaluable contributions to the design and testing of the platform.

Perseus Digital Library Updates

The Digital Loeb Classical Library, Open Scholarship, and a Global Society

This piece was first published in February 2014 as an open Google doc on the Digital Loeb Classical Library, Open Scholarship, and a Global Society. Another piece is in preparation and will appear on the blog for the Open Philology Project at Leipzig.

September 19, 2014

Shawn Graham (Electric Archaeology)

Open notebooks part III

Do my bidding my robots!

I’ve sussed the Scrivener syncing issue by moving the process of converting out of the syncing folder (remember, not the actual project folder, but the ‘sync to external folder’). I then have created four automator applications to push my stuff to github in lovely markdown. Another thing I’ve learned today: when writing in Scrivener, just keep your formatting simple. Don’t use markdown syntax within Scrivener or your stuff on github will end up looking like this \##second-heading. I mean, it’s still legible, but not as legible as we’d like.

So – I have four robots. I write in Scrivener, keep my notes, close the session, whereupon it syncs rtf to the ‘external folder’ (in this case, my dropbox folder for this purpose; again, not the actual scrivener project folder).

  1. I hit robot 1 on my desktop. Right now, this is called ‘abm-project-move-to-conversion-folder’. When I have a new project, I just open this application in Automator, and change the source directory to that project’s Scrivener external syncing folder. It grabs everything out of that folder, and copies it into a ‘conversion-folder’ that lives on my machine.
  2. I hit robot 2, ‘convert-rtf-to-md’, which opens ‘conversion-folder’ and turns everything it finds into markdown. The conversion scripts live in ‘conversion-folder’; the things to be converted live in a subfolder, conversion-folder/draft.
  3. I hit robot 3, ‘push-converted-files-to-github-repo’. This grabs just the markdown files, and copies them into my local github repository for the project. When I have a new project, I’d have to change this application to point to the new folder. This also overwrites anything with the same file name.
  4. I hit robot 4, ‘clean-conversion-folder’, which moves everything (rtfs, mds) to the trash. This is necessary because otherwise I can end up with duplicates of files I haven’t actually modified getting through my pipeline onto my github page. (If you look at some of my experiments on github, you’ll see the same card a number of times, with 1…2…3…4 versions.)

Maybe it’s possible to create a meta-automator that strings those four robots into 1. I’ll try that someday.
Ok, so of course, I tried stringing them together just now. And it didn’t work. So I put that automator into the trash, and now my original four robots give me errors: ‘The application … can’t be opened. -1712’. I found the solution here (basically, go to Spotlight, type in Activity Monitor, then locate the application on the list and quit it).
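For what it’s worth, the meta-robot could also be one small script rather than a chained Automator. Here is a minimal Python sketch of the four steps run in sequence; the folder names are invented, and pandoc stands in for the conversion scripts (it is skipped quietly if not installed):

```python
import shutil
import subprocess
from pathlib import Path

def sync_to_github(sync_dir, work_dir, repo_dir):
    """Chain the four 'robots': copy, convert, push, clean."""
    work_dir = Path(work_dir)
    work_dir.mkdir(exist_ok=True)
    # Robot 1: grab everything out of the Scrivener external-sync folder.
    for f in Path(sync_dir).glob("*.rtf"):
        shutil.copy(f, work_dir / f.name)
    # Robot 2: convert each RTF to Markdown (pandoc is an assumption here,
    # standing in for the conversion scripts; skipped if not on the PATH).
    if shutil.which("pandoc"):
        for f in work_dir.glob("*.rtf"):
            subprocess.run(
                ["pandoc", str(f), "-t", "markdown", "-o", str(f.with_suffix(".md"))],
                check=False, capture_output=True)
    # Robot 3: copy only the Markdown files into the local git repository,
    # overwriting same-named files just as the Automator version does.
    pushed = []
    for f in work_dir.glob("*.md"):
        shutil.copy(f, Path(repo_dir) / f.name)
        pushed.append(f.name)
    # Robot 4: empty the conversion folder so stale copies never leak
    # into the next run.
    for f in work_dir.iterdir():
        f.unlink()
    return sorted(pushed)
```

The actual commit and push to GitHub would still happen in the repository folder afterwards.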

Here are my automators:

Robot 1

Robot 2

Robot 3

Robot 4


I think I love you.


Duke Collaboratory for Classics Computing

What’s in a placename?

New York. Paris. London. Saying these names likely evokes a sense of place in many people. Maybe what’s evoked is the knowledge of a place on a map, a sense of the culture there, or the memory of a trip. But what do these toponyms we can so casually reference actually mean? Do I mean “London” or “London”? “New York” or “New York”? If I were in my home state of Kentucky, I might well mean London and Paris. This gets even trickier when you consider the shifting cultural contexts (and even geography) introduced by thinking about historical placenames over time. In what sense is ancient Lutetia modern Paris?

The purpose here is not to retread already well-trod philosophical grounds, but rather to highlight the sorts of very real problems that can confront us when trying to align multiple datasets containing placenames. This is important for us at DC3 because we want to align multiple epigraphic databases containing a variety of forms of placenames; moreover, aligning these placenames to other databases which include actual geospatial information will allow querying and visualization of the data in a way which is not easily possible now. One can imagine looking for inscriptions found within some radius of an ancient or modern city, or creating a map showing the geographical distribution of all inscriptions in the database, or a visualization which illustrates the relationships between the findspot of an inscription and the placenames mentioned in its text, and so on.

One component of this involves aligning names in Pleiades and GeoNames, allowing us to get a “free” mapping to the other resource wherever we have a relationship to only one, and greatly expanding our graph of knowledge. The machine-automated process for this, known as “Pleiades+”, simply uses a combination of string-matching and geospatial filtering to try to find likely matches between both resources. But many of these matches may be erroneous under various criteria – multiple similarly-named places within a certain radius of one another may all be matched to one another, for example, or a city may be matched to both a city and an administrative region.
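As a rough illustration of this kind of candidate matching (a sketch, not the actual Pleiades+ code; the field names, thresholds and records below are all invented), pair records from the two gazetteers only when a name-similarity score and a distance cutoff both pass:

```python
from difflib import SequenceMatcher
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def candidate_matches(pleiades, geonames, max_km=10.0, min_sim=0.85):
    """Pair records whose names are similar AND whose coordinates are close."""
    matches = []
    for p in pleiades:
        for g in geonames:
            sim = SequenceMatcher(None, p["name"].lower(), g["name"].lower()).ratio()
            if sim >= min_sim and haversine_km(p["lat"], p["lon"], g["lat"], g["lon"]) <= max_km:
                matches.append((p["id"], g["id"], round(sim, 2)))
    return matches
```

Loosening min_sim, or switching to substring containment, is exactly the kind of change that widens the candidate pool and multiplies the dubious matches a human voter then has to weed out.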

Similar to the problem Hugh discussed in the previous post, you can come up with certain rules for some of these cases, but others require a human to make the decision. As a result, we’ve adapted the excellent gazComp tool developed at Perseus to work through the list of Pleiades+ candidate matches and allow quick visualization and voting for each match. The process of developing and using the tool on real data has also turned up various kinds of ambiguities like those discussed before: what, exactly, do we mean by a “match”? For example, publications may occasionally use the name of the nearest modern city interchangeably with the actual archaeological site name, and sometimes GeoNames may have records for both the city and the site, and sometimes not. Any solution causes a certain amount of anxiety, as what’s “right” may depend on a variety of contexts – context of the place, context of the placename mention, context of how these “matches” will be used, and so on. There’s not one perfect answer for all cases. What we hope to accomplish is not perfection, but to move pragmatically toward improvement.

In that spirit, we’ve placed a publicly accessible instance of the Pleiades+ gazComp voting tool online. Currently, it requires sign-in with a Google account for vote attribution. Eventually, we will incorporate the results of these votes into the Pleiades+ output, so that anyone can use them. Additionally, if you start adding places to GeoNames where you currently come across an erroneous match in the voting tool (ancient ruins clearly visible on the satellite imagery with no marker in GeoNames, for example), they will eventually get picked up by the automated Pleiades+ process and be fed as candidate matches into the voting pool. The hope is that this process will also allow us to broaden the pool of Pleiades+ match candidates without making the data meaningless; once we have good vote coverage for this initial set, we can start to add in matches such as those from substring rather than exact string matching, which doubles the number of candidates.

We need your votes! If you have any questions or run into any problems, feel free to leave a comment on this post, drop us a line, or use the gazComp issue tracker on GitHub.

The post What’s in a placename? appeared first on Duke Collaboratory for Classics Computing (DC3).

Shawn Graham (Electric Archaeology)

An Open Research Notebook Workflow with Scrivener and Github Part 2: Now With Dillinger.io!

A few updates:

First item

The four scripts that sparkygetsthegirl crafted allow him to

1. write in Scrivener,

2. sync to a Dropbox folder,

3. convert to md,

4. then open those md files on an android tablet to write/edit/add,

5. and then reconvert to rtf for syncing back into Scrivener.

I wondered to myself, what about some of the online markdown editors? Dillinger.io can scan Dropbox for md files. So I went to Dillinger.io, linked it to my Dropbox, scanned for md files, and lo! I found my project notes. So if the syncing folder is shared with other users, they can edit the notecards via Dillinger. Cool, eh? Not everyone has a native app for editing, so they can just point their device’s browser at the website. I’m sure there are more options out there.

Second Item

I was getting syncing errors because I wasn’t flipping the md back to rtf.

But, one caveat: when I went to run the md-to-rtf script, to get my changes back into Scrivener (and then sync), things seemed to go very wonky indeed. One card was now blank, and the others were full of markup that Scrivener wasn’t recognizing.

So I think the problem is me doing things out of order. I continue to play.

Third Item

I automated running of the conversion scripts. You can see my automator set up in the screenshot below. Again, I saved it as an application on my desktop. First step is to grab the right folder. Second, to open the terminal, input the commands, then close the terminal.

Screen Shot 2014-09-19 at 2.36.03 PM


I was asked why on earth I would want to share my research notes. Many, many reasons – see Caleb McDaniel’s post, for instance – but one other feature is that, because I’m doing this on Github, a person could fork (copy) my entire research archive. They could then build upon it. Github keeps track of who forks what, so forking becomes a kind of mass citation and breadcrumb trail showing who had an idea first. Moreover, github code (or in this case, my research archive) can be archived on figshare too, thus giving it a unique DOI *and* proper digital archiving in multiple locations. Kinda neat, eh?

Roger Pearse (Thoughts on Antiquity, Patristics, putting things online, and more)

The Green collection founder and his bible museum

A commenter draws my attention to a most interesting article in the Washington Post:

Hobby Lobby’s Steve Green has big plans for his Bible museum in Washington

The Bible museum taking shape in the building over the Federal Center SW Metro station started out in a very different location and with a very different message.

The project was planned for Texas in the late 2000s. Green told reporters he intended to put it in Dallas because so many church-going Christians live there. The mission statement on its initial nonprofit filing documents was clear: to “bring to life the living word of God … to inspire confidence in the absolute authority” of the Bible’s words. Green wanted to hand out Bible tracts to visitors, who would exit the museum singing “Amazing Grace,” said Scott Carroll, a specialist in biblical manuscripts who advised Green’s Bible-collecting and museum efforts from their start in 2009 through 2012.

Today, the message has undergone a drastic revision. The Web site for Green’s traveling Bible exhibit, “Passages,” says the museum “will be dedicated to a scholarly approach to the history, narrative and impact of the Bible.” Green says he now supports a museum approach that is nonsectarian and non-proselytizing.

The skeptics have another reason to embrace this new museum. Substantive funding for Bible scholarship and exploration is scarce. At a time when polls show that Americans are increasingly ignorant about the Bible and religion, the Greens are happily pouring hundreds of millions into preserving, researching and taking public what’s called the Book of Books.

… things turned sharply in 2009, as Green worked with Carroll to start building his collection.

The economy crashed, and several private donors and major institutions started dumping assets. Green went on a three-year buying spree. “We were looking at good buying. We thought: ‘This is worth much more than they’re asking. Let’s buy it.’ ”

Green bought Dead Sea Scroll fragments, Babe Ruth’s Bible, the Codex Climaci Rescriptus — a bundle of manuscripts from the 5th to the 9th centuries that includes the phrase that Christianity teaches Jesus uttered on the cross: “Eli, Eli, lema sabachthani” (“My God, my God, why have you forsaken me?”). Green owns the world’s largest collection of Torah scrolls.

As word spread of the Green Collection, some scholars panted at the possibility that items long held in completely private collections might be available for study.

It’s an interesting article on an interesting subject.

In the ruling class of the USA there seems to be a terrifying degree of bigotry towards their own backwoods Christianity, from which Green has emerged.  I have already seen vituperation from scholars which I can only characterise as motivated by the idea that “this is our space” and based purely on religious animosity.  But it would be a great pity if this antipathy was allowed to derail a project that should be of universal benefit.

Juan Garcés (Digitised Manuscripts Blog)

Virgil's Countryside

On September 21, 19 BC, Publius Vergilius Maro died of a fever at Brundisium. Though Virgil's birthday, on the Ides of October, is more traditionally the day on which the poet is remembered, we at Medieval Manuscripts can never pass up the opportunity to talk about the man from Mantua....

Paul Dilley (Hieroi Logoi)

A Repertoire of Byzantine “Beneficial Tales”


This website of “Narrations Useful to the Soul,” a genre that flourished in Late Antique monasticism, has quietly been online since at least 2001, when it was cited by John Wortley, its author, in Dumbarton Oaks Papers 55. Is this one of the first online resources to be explicitly cited in an article on Late Antiquity? In any case, it is still available at the author’s personal webpage, courtesy of the University of Manitoba, where he is an emeritus professor. The Repertoire consists of over 900 précis of “spiritual tales,” culled from a wide variety of Late Antique sources and selected primarily according to the criterion of narrative form. The tales are ordered arbitrarily according to “W” numbers, and frequently cross-reference the entries of François Halkin in Bibliotheca Hagiographica Graeca (1957) and Novum Auctarium Bibliothecae Hagiographicae Graecae (1984), which one must consult for manuscript descriptions of unpublished texts. The great research benefit of the site (is it good for the soul?) is the ability to search these texts for content.


Perseus Digital Library Updates

Announcing the Arethusa Annotation Framework

Developers Gernot Höflechner, Robert Lichtensteiner and Christof Sirk, in collaboration with the Perseus Digital Library at Tufts (via the Libraries and the Transformation of the Humanities and Perseids projects) and the University of Leipzig’s Open Philology Project, have released Arethusa, a framework for linguistic annotation and curation. Arethusa was inspired by and extends the goals of the Alpheios Project, to provide a highly configurable, language-independent, extensible infrastructure for close-reading, annotation, curation and exploration of open-access digitized texts. While the initial release highlights support for morpho-syntactic annotation, Arethusa is designed to allow users to switch seamlessly between a variety of annotation and close-reading activities, facilitating the creation of sharable, reusable linguistic data in collaborative research and pedagogical environments.


Arethusa is built on the angular.js javascript web application framework and provides a back-end independent infrastructure for accessing texts, annotations and linguistic services from a variety of sources. Extensibility is a guiding design goal — Arethusa includes tools for automatic generation of skeleton code for new features as plugins; detailed development guides are also currently in progress. We hope others will be able to reuse and build upon the platform to add support for other annotation types, languages and back-end repositories and workflow engines.

Arethusa is already deployed as a component of the Perseids platform, where it provides an annotation interface for morpho-syntactic analyses and will soon also act as a broker between the Perseids back-end (the Son of SUDA Online application) and various other front-end annotating and editing activities, including translation alignments, entity identification and text editing.

Screencasts are available that show how the Arethusa application can be used for syntactic diagram (treebank) and morphological analysis annotations on Perseids. Additional demos and slides will be made available soon which highlight additional features along with the architecture and design.

This project has been made possible in part by the Institute of Museum and Library Services (Award LG0611032611), the Andrew W. Mellon Foundation and the European Social Fund. We also are indebted to Robert Gorman and Vanessa Gorman of the University of Nebraska and Giuseppe G. A. Celano of the University of Leipzig for their invaluable contributions to the design and testing of the platform.

Charles Ellwood Jones (AWOL: The Ancient World Online)

Digital Humanities and the Ancient World

Digital Humanities and the Ancient World
Biblical Archaeology Society Staff   •  08/13/2014
What would happen if the Pope’s library were accidentally burnt? How can we reconstruct and visualize ancient and medieval pilgrimage routes? Technology is changing the way we study and preserve texts and artifacts. In a series of web-exclusive articles written by scholars engaged in the Digital Humanities, learn how this growing field of study is helping to analyze textual and archaeological data—and how you can help.


Digital Humanities: An Introduction

What if the Dead Sea Scrolls were damaged? What if the Pope’s library burned down? In “Digital Humanities: How Everyone Can Get a Library Card to the World’s Most Exclusive Collections Online,” George Washington University associate professor of history Diane H. Cline explores the research opportunities and potential impact of Digital Humanities projects. This new field not only preserves publications, it extends access to the humanities to anyone with Internet access.
Read “Digital Humanities: How Everyone Can Get a Library Card to the World’s Most Exclusive Collections Online” by Diane H. Cline >>


Mapping Technologies

Want to follow a fourth-century pilgrim itinerary from Bordeaux via Constantinople to the Holy Land? Experiment with ancient travel times and their costs over land, sea and sand in the Roman Empire? University of Iowa assistant professor of classics Sarah E. Bond explains in “Map Quests: Geography, Digital Humanities and the Ancient World” how the Digital Humanities offers opportunities to explore, interact with and contribute to maps of the ancient world.
Read “Map Quests: Geography, Digital Humanities and the Ancient World” by Sarah E. Bond >>


Open Access to Digital Data

Interested in exploring the results of archaeology projects directly from the researchers? Cutting-edge technology is helping archaeologists generate a tremendous amount of digital data each year. At the same time, the scientific community increasingly expects direct access to the data. In “Open Context: Making the Most of Archaeological Data,” Alexandria Archive Institute cofounders Sarah Whitcher Kansa and Eric Kansa describe Open Context, an open access, peer-reviewed data publishing service that has published over one million digital resources, from archaeological survey data to excavation documentation and artifact analyses.
Read “Open Context: Making the Most of Archaeological Data” by Sarah Whitcher Kansa and Eric Kansa >>


Making University Collections Accessible to All

Many university departments across the world have shelves and storerooms full of books, artifacts and research collected over several decades. What do you do when the “skeletons in your closet” are a box of 2,000-year-old artifacts? That was the question facing the University of British Columbia’s Department of Classical, Near Eastern, and Religious Studies. In “From Stone to Screen: Bringing 21st-Century Access to Ancient Artifacts,” members of the From Stone to Screen graduate student project at UBC discuss their ongoing efforts to create digital archives of their department’s artifact collection—making these fascinating objects accessible to a global audience online.
Read “From Stone to Screen: Bringing 21st-Century Access to Ancient Artifacts” >>

Open Access Journal: Vjesnik Arheološkog muzeja u Zagrebu - Journal of the Zagreb Archaeological Museum

[First posted in AWOL 10 August 2010. Updated 19 September 2014]

Vjesnik Arheološkog muzeja u Zagrebu - Journal of the Zagreb Archaeological Museum
ISSN: 0350-7165
The Journal of the Zagreb Archaeological Museum (Vjesnik Arheološkog muzeja u Zagrebu) publishes scientific and professional papers covering a broad range of topics in prehistoric, classical and medieval archaeology, as well as in disciplines related and complementary to archaeology. Journal series: 1. the so-called “zero series”, Viestnik Narodnoga zamaljskoga muzeja u Zagrebu (1870-1876); 2. Viestnik Hrvatskoga arkeologičkoga družtva (1879-1892); 3. Vjesnik Hrvatskoga arheološkoga društva, nova serija (1895-1941/1942); 4. Vjesnik Arheološkog muzeja u Zagrebu (1958-).

  Vol. 46   No. 1
  Vol. 45   No. 1
  Vol. 44   No. 1
  Vol. 43   No. 1
  Vol. 42   No. 1
  Vol. 41   No. 1
  Vol. 40   No. 1
  Vol. 39   No. 1
  Vol. 38   No. 1
  Vol. 37   No. 1
  Vol. 36   No. 1
  Vol. 35   No. 1
  Vol. 34   No. 1
  Vol. 32-33   No. 1
  Vol. 30-31   No. 1
  Vol. 28-29   No. 1
  Vol. 26-27   No. 1
  Vol. 24-25   No. 1
  Vol. 23   No. 1
  Vol. 22   No. 1
  Vol. 21   No. 1
  Vol. 20   No. 1
  Vol. 19   No. 1
  Vol. 18   No. 1
  Vol. 16-17   No. 1
  Vol. 15   No. 1
  Vol. 14   No. 1
  Vol. 12-13   No. 1
  Vol. 10-11   No. 1
  Vol. 9   No. 1
  Vol. 8   No. 1
  Vol. 6-7   No. 1
  Vol. 5   No. 1
  Vol. 4   No. 1
  Vol. 3   No. 1
  Vol. 2   No. 1
  Vol. 1   No. 1
  Vol. 24-25   No. 1
  Vol. 22-23   No. 1
  Vol. 17   No. 1
  Vol. 16   No. 1
  Vol. 15   No. 1
  Vol. 14   No. 1
  Vol. 13   No. 1
  Vol. 12   No. 1
  Vol. 11   No. 1
  Vol. 10   No. 1
  Vol. 9   No. 1
  Vol. 8   No. 1
  Vol. 7   No. 1
  Vol. 6   No. 1
  Vol. 5   No. 1
  Vol. 4   No. 1
  Vol. 3   No. 1
  Vol. 2   No. 1
  Vol. 1   No. 1
  Vol. 14   No. 1
  Vol. 13   No. 1
  Vol. 12   No. 1
  Vol. 11   No. 1
  Vol. 10   No. 1
  Vol. 9   No. 1
  Vol. 8   No. 1
  Vol. 7   No. 1
  Vol. 6   No. 1
  Vol. 5   No. 1
  Vol. 4   No. 1
  Vol. 3   No. 1
  Vol. 2   No. 1
  Vol. 1   No. 1
  Vol. 2   No. 1
  Vol. 1   No. 1

The Signal: Digital Preservation

Emerging Collaborations for Accessing and Preserving Email

The following is a guest post by Chris Prom, Assistant University Archivist and Professor, University of Illinois at Urbana-Champaign.

I’ll never forget one lesson from my historical methods class at Marquette University.  Ronald Zupko – famous for his lecture on the bubonic plague, and a natural showman – was expounding on what it means to interrogate primary sources: to cast a skeptical eye on every source, to see each one as a mere thread of evidence in a larger story, and to remember that every event can, and must, tell many different stories.

He asked us to name a few documentary genres, along with our opinions as to their relative value.  We shot back: “Photographs, diaries, reports, scrapbooks, newspaper articles,” along with the type of ill-informed comments graduate students are prone to make.  As our class rattled off responses, we gradually came to realize that each document reflected the particular viewpoint of its creator–and that the information a source conveyed was constrained by documentary conventions and other social factors inherent to the medium underlying the expression. Settling into the comfortable role of skeptics, we noted the biases each format reflected.  Finally, one student said: “What about correspondence?”  Dr Zupko erupted: “There is the real meat of history!  But, you need to be careful!”


Dangerous Inbox by Recrea HQ. Photo courtesy of Flickr through a CC BY-NC-SA 2.0 license.

Letters, memos, telegrams, postcards: such items have long been the stock-in-trade for archives.  Historians and researchers of all types, while mindful of the challenges in using correspondence, value it as a source for the insider perspective it provides on real-time events.   For this reason, the library and archives community must find effective ways to identify, preserve and provide access to email and other forms of electronic correspondence.

After I researched and wrote a guide to email preservation (pdf) for the Digital Preservation Coalition’s Technology Watch Report series, I concluded that the challenges are mostly cultural and administrative.

I have no doubt that with the right tools, archivists could do what we do best: build the relationships that underlie every successful archival acquisition.  Engaging records creators and donors in their digital spaces, we can help them preserve access to the records that are so sorely needed for those who will write histories.  But we need the tools, and a plan for how to use them.  Otherwise, our promises are mere words.

For this reason, I’m so pleased to report on the results of a recent online meeting organized by the National Digital Stewardship Alliance’s Standards and Practices Working Group.  On August 25, a group of fifty-plus experts from more than a dozen institutions informally shared the work they are doing to preserve email.

For me, the best part of the meeting was that it represented the diverse range of institutions (in terms of size and institutional focus) that are interested in this critical work. Email preservation is of interest not only to large government archives and small collecting repositories, but to every repository in between. That said, the representatives displayed a surprisingly similar vision for how email preservation can be made effective.

Robert Spangler, Lisa Haralampus, Ken Hawkins and Kevin DeVorsey described challenges that the National Archives and Records Administration has faced in controlling and providing access to large bodies of email. Concluding that traditional records management practices are not sufficient to the task, NARA has developed the Capstone approach, which seeks to identify particular accounts that must be preserved as a record series, and is currently revising its transfer guidance.  Later in the meeting, Mark Conrad described the particular challenge of preserving email from the Executive Office of the President, highlighting the point that “scale matters”–a theme that resonated across the board.

The whole account approach that NARA advocates meshes well with activities described by other presenters.  For example, Kelly Eubank from North Carolina State Archives and the EMCAP project discussed the need for software tools to ingest and process email records while Linda Reib from the Arizona State Library noted that the PeDALS Project is seeking to continue their work, focusing on account-level preservation of key state government accounts.

Functional comparison of selected email archives tools/services. Courtesy Wendy Gogel.


Ricc Ferrante and Lynda Schmitz Fuhrig from the Smithsonian Institution Archives discussed the CERP project which produced, in conjunction with the EMCAP project, an XML schema for email objects among its deliverables. Kate Murray from the Library of Congress reviewed the new email and related calendaring formats on the Sustainability of Digital Formats website.

Harvard University was up next.  Andrea Goethals and Wendy Gogel shared information about Harvard’s Electronic Archiving Service.  EAS includes tools for normalizing email from an account into EML format (conforming to the Internet Engineering Task Force RFC 2822), then packaging it for deposit into Harvard’s digital repository.
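
The EML target here is plain RFC 2822 text: header lines, a blank separator line, then the body. As a rough illustration of that structure (my own sketch, not part of Harvard's EAS tooling), a few lines of Ruby are enough to split a message into headers and body:

```ruby
# Minimal illustration of the RFC 2822 structure that EML files use:
# header lines, a blank line, then the message body.
def parse_eml(raw)
  head, _, body = raw.partition("\r\n\r\n")
  head, _, body = raw.partition("\n\n") if body.empty?
  headers = {}
  head.split(/\r?\n(?!\s)/).each do |line|         # split on newlines that start a new header
    name, _, value = line.partition(":")
    headers[name.strip] = value.strip.gsub(/\r?\n\s+/, " ")  # unfold continuations
  end
  { headers: headers, body: body }
end

msg = parse_eml("From: archivist@example.org\nSubject: Transfer\n\nBody text.\n")
msg[:headers]["Subject"]  # => "Transfer"
```

Real normalization tools handle far more than this (MIME parts, encodings, attachments); the point is only that an EML file is self-describing plain text, which is part of what makes it attractive for preservation.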

One of the most exciting presentations was provided by Peter Chan and Glynn Edwards from Stanford University.  With generous funding from the National Historical Publications and Records Commission, as well as some internal support, the ePADD Project (“Email: Process, Appraise, Discover, Deliver”) is using natural language processing and entity extraction tools to build an application that will allow archivists and records creators to review email, then process it for search, display and retrieval.  Best of all, the web-based application will include a built-in discovery interface, and users will be able to define a lexicon and to provide visual representations of the results.  Many participants in the meeting commented that the ePADD tools may provide a meaningful focus for additional collaborations.  A beta version is due out next spring.

In the discussion that followed the informal presentations, several presenters congratulated the Harvard team on a slide Wendy Gogel shared, comparing the functions provided by various tools and services (reproduced above).

As is apparent from even a cursory glance at the chart, repositories are doing wonderful work—and much yet remains.

Collaboration is the way forward. At the end of the discussion, participants agreed to take three specific steps to drive email preservation initiatives to the next level: (1) providing tool demo sessions; (2) developing use cases; and (3) working together.

The bottom line: I’m more hopeful about the ability of the digital preservation community to develop an effective approach toward email preservation than I have been in years.  Stay tuned for future developments!

Bill Caraher (The New Archaeology of the Mediterranean World)

Friday Varia and Quick Hits

We may have one more day of summer today with temperatures set to reach a balmy 86 degrees here in North Dakotaland. Do society a favor and don’t call it an “Indian Summer” or “Altweibersommer.” I’m just going to call it a warm day in late September. And, don’t worry, Grand Forks will be back to its sleepy, bucolic fall decline by the end of next week.

In the meantime, when you’re not enjoying the warm days and the gentle patter of a late summer rain, please do enjoy these quick hits and varia.  

I can groove to Duke.

Archeomatica: Tecnologie per i Beni Culturali

NURMAP: the map of Sardinia's nuraghi in an app

The NURNET geoportal's application for learning about the nuraghi of Sardinia was presented on the occasion of the first Festa della Civiltà Nuragica, taking place Sunday 21 September in Bonorva (Mariani-Sa Pala Larga) and organized in collaboration with the Istituto Italiano dei Castelli and the Comune di Bonorva.

Cultural Heritage Informatics Initiative

Hello there! CHI Fellowship Intro: Jennifer A. Royston

My name is Jennifer A. Royston and I am very excited to be a CHI fellow this academic year. While I’m not ready to announce my project just yet, I do have some interesting ideas up my sleeve! Stay tuned…

I am a fourth-year doctoral student in the Department of English. Before coming to MSU I earned an MA in ‘Shakespeare in History’ from University College London. And before that I taught high school English at an International Baccalaureate school. I specialize in Renaissance literature, specifically drama, and the metadramatic function of paintings and painters on the Early Modern stage. I explore why Renaissance playwrights were invested in dramatizing painters, and why visual art was so often staged or otherwise evoked through verbal means. I am especially interested in the rise of English artistic theory and how this body of literature differs from its paragone predecessors, especially when represented on the London stage.

Aside from my research, teaching is my passion. I am fortunate to serve as my department’s teaching coordinator this year. This position allows me to organize and facilitate a series of pedagogy workshops for our department’s graduate students. In addition to the CHI and Department of English Fellowships, I am a RCAH Graduate Fellow; this opportunity allows me to consider my own teaching practices more critically as I develop a unique teaching project over the course of a year. I have experience teaching in the traditional, hybrid, online, and MOOC environments and I continue to think about best practices for each of these formats.

In my free time I like to travel, work out, spend time with my Polish-American family, peruse all things related to fashion and interior design, organize my surroundings (oddly enough), drink coffee, and learn new artistic media (I’m taking a calligraphy class at the moment). A lifelong learner at heart, I enjoy keeping busy by experiencing new things.

Please feel free to keep in touch, especially if you are interested in DH Renaissance projects, or digital pedagogy:

Twitter: @JARoyston
Email: royston7@msu.edu
Webpage: http://jenniferaroyston.weebly.com/

September 18, 2014

Charles Ellwood Jones (AWOL: The Ancient World Online)

Digitales Forum Romanum

Digitales Forum Romanum
A research and teaching project of the Winckelmann-Institut at Humboldt-Universität zu Berlin,
in cooperation with the Exzellenzcluster TOPOI

The ancient Forum Romanum is one of the main attractions of any visit to Rome. Every day, hundreds of visitors explore the Forum Romanum, fascinated by the evocative landscape of ruins and the historical significance of the place: here lay the public and political center of the ancient metropolis, here politics was made and history written – and so, for us today, the past of ancient Rome pulses here with a very special intensity. Yet given the idyllic landscape of ruins that the excavation site presents today, it is hard to form a real picture of this ancient square: How did people in antiquity experience it? How did it present itself as a stage for political action and social communication? And how, concretely, did it function as the public center of this unique ancient metropolis? These are questions with which the site often leaves its visitors alone – and the questions that classical archaeology has long sought to answer with the help of reconstructions.


  • Kristina Killgrove (Powered by Osteons)

    Holding Hands That Aren't There

    This photo has been circulating wildly of late, purporting to show a couple "that have been holding hands for 700 years" (according to the University of Leicester's press release).

    The dig blog is a bit less, erm, truth-stretchy, labeling them as "a man and women [sic] buried side by side with their arms crossed together."  Which is good because, well, where are their hands?

    Holding hands is a nice story.  And it could be true.  Buuuuut... one corpse's arm could have just been thrown on top of another corpse's arm.  I'd really want to figure out where the hands are (?) and what the precise stratigraphy is first.

    Related:  Holding Hands into Eternity (PbO - 21 October 2011)

    Charles Ellwood Jones (AWOL: The Ancient World Online)

    Daphnet: ILIESI’s Digital Archives of PHilosophical texts on the NET


    Daphnet, the ILIESI’s Digital Archives of PHilosophical texts on the NET, is a portal that gives access to digital platforms dedicated to relevant authors and texts belonging to the history of scientific and philosophical thought. These platforms share some common features: they primarily aim at giving access to primary sources, possibly complemented by secondary sources and critical instruments; they can include both facsimiles and transcriptions of manuscripts and printed texts; they are based on open-source programmes and standard encodings (e.g., HTML, XML); and the texts on these platforms can be semantically enriched. Moreover, the platforms are interoperable, open to the collaboration of scholars, and certified by a board of reviewers.

    Presocratics Source
    Presocratics Source presents the transcription of the famous collection of Presocratic thinkers in ninety chapters originally edited by H. Diels and W. Kranz (Die Fragmente der Vorsokratiker, ed. by H. Diels and W. Kranz, 3 vols., Weidmann, Berlin, 1958²), with the parallel Italian translation edited by G. Giannantoni (I Presocratici. Testimonianze e frammenti, a cura di G. Giannantoni, Laterza, Roma-Bari, 1983²).

    Socratis et Socraticorum Reliquiae Source
    Socratis et Socraticorum Reliquiae Source presents the transcription of the collection of testimonies about Socrates and Socratics (Socratis et Socraticorum Reliquiae) originally edited by G. Giannantoni.

    Diogenes Laertius Source
    Diogenes Laertius Source presents the transcription of Lives and Opinions of Eminent Philosophers in ten books: a collation of the editions of R. D. Hicks, H. S. Long, and M. Marcovich, with the Italian translation of M. Gigante and a parallel Greek text restored on the basis of his philological notes. The site enables users to access texts, exploit resources, and perform queries. Notes, additional information, and a legend for better access to the texts are also available.

    dh+lib: where the digital humanities and librarianship meet

    POST: For God’s Sake, Stop Digitizing Paper

    Joshua Ranger has written a post on the AVPreserve blog that calls on archivists (and others) to examine their digitization practices and priorities. Arguing that audiovisual materials are in greater danger of obsolescence, Ranger declares, “We should agree to stop digitizing paper and other stable formats for a set period because, in a way, it is bad for preservation.” Though his focus is on audiovisual materials, Ranger draws attention to the underlying rationale for digitization in general. He notes:

    [A] lot of digitization work is essentially a wasted effort if it needs to be done again for access, or future preservation work, if files, access portals, metadata, and digital humanities projects are lost. And I’m not just saying lost as in the fretting about the unreliability of digital files, but lost due to human failure in managing servers, migrating data, or letting websites go dead.

    The post POST: For God’s Sake, Stop Digitizing Paper appeared first on dh+lib.

    POST: Analysis of Privacy Leakage on a Library Catalog Webpage

    Eric Hellman (unglue.it) has written up a recent presentation at the Code4Lib-NYC meeting in which he performed an “Analysis of Privacy Leakage on a Library Catalog Webpage.”

    Hellman selected a single webpage for a book in the NYPL online catalog and traced “all the requests my browser made in the process of building that page.” Noting that “my browser contacts 11 different hosts from 8 different companies,” Hellman investigates each company’s privacy policy and use of cookies to give an alarming picture of the way that patron browsing data is shared via cloud-based library catalogs. He concludes:

    In 1972, Zoia Horn, a librarian at Bucknell University, was jailed for almost three weeks for refusing to testify at the trial of the Harrisburg 7 concerning the library usage of one of the defendants. That was a long time ago. No longer is there a need to put librarians in jail.

    The post POST: Analysis of Privacy Leakage on a Library Catalog Webpage appeared first on dh+lib.

    RESOURCE: Bookworm for Movies and TV

    Ben Schmidt (Northeastern University) has created an online tool that allows users to “investigate onscreen language in about 87,000 movies and TV shows, encompassing together over 600 million words.”

    Bookworm: Movies draws on a corpus from Open Subtitles and metadata from IMDb to chart changing language use in film and television dialogue, using the Bookworm platform.


    The post RESOURCE: Bookworm for Movies and TV appeared first on dh+lib.

    CFP: Accessible Future Workshop (Lincoln, NE)

    Applications are now being accepted for the 2-day NEH-funded Accessible Future workshop, taking place November 14-15, 2014, in Lincoln, NE.

    The Accessible Future workshop is intended for humanists, librarians, information scientists, and cultural heritage professionals who wish to learn about technologies, design standards, and accessibility issues associated with the use of digital technologies.

    This is the third time the workshop is being offered (earlier workshops took place in Boston, MA, and Austin, TX). A fourth and final workshop is scheduled for early 2015 in Atlanta, GA.

    For more information, check out George Williams’ post on ProfHacker.

    The post CFP: Accessible Future Workshop (Lincoln, NE) appeared first on dh+lib.

    JOB: Digital Humanities Specialist, Carnegie Mellon

    From the announcement:

    The Dietrich College of Humanities and Social Sciences at Carnegie Mellon University, with grant funding from the A.W. Mellon Foundation in the Digital Humanities, is seeking applicants for a Research Staff position in the digital humanities. In particular, we are looking for applicants with skills and experience in one or more of the following areas: natural language processing, data mining of large textual corpora, machine learning, approaches to analyzing large textual collections, social network analysis, geographical information systems, statistical analysis, especially in high dimensional spaces, and user interface design, especially visual analytics.

    The Digital Humanities Specialist position will lay a foundation for supporting research by humanities Ph.D. students and faculty in digital humanities. The primary function will be:

    • Consult with Ph.D. and faculty in the humanities in applying automated text processing, statistical data analysis, and machine learning techniques to research questions in the humanities;
    • Educate Ph.D. students and faculty in acquiring analysis skills through the development and delivery of targeted workshops and/or courses.

    Activities will also include the evaluation and recommendation of existing tools and techniques, with the possibility of some development/extension and implementation of algorithms and models. Additional job functions include assisting Ph.D. students and faculty in trouble-shooting problems with tools and publishing papers.

    Please note that this position is grant funded for 4 years with the possibility of further extension. The position will report to the Dean of the Dietrich College of Humanities and Social Sciences.

    The post JOB: Digital Humanities Specialist, Carnegie Mellon appeared first on dh+lib.

    JOB: Digital Scholarship and Scholarly Communication Librarian, Smith College

    From the announcement:

    Lead digital scholarship initiatives for the Smith College Libraries’ Teaching, Learning and Research (TLR) department, working closely with colleagues engaged in the development of digital scholarship at Smith.  Promote the adoption of new models of scholarship at Smith by developing methodologies and tools of the digital humanities and social sciences for research, teaching and learning.  Provide instruction and consultation services for digital projects, and serve on project teams.  Facilitate the use of library content for digital scholarship creation by faculty and students. Serve as a resource for other TLR librarians who are engaged in digital scholarship; provides training and consultation.  Keep abreast of new developments in digital scholarship and scholarly communication in the humanities and social sciences, and promote awareness among colleagues.  As a member of the Teaching, Learning and Research department, conduct general reference, outreach, library instruction/information literacy, collection development and liaison activities.

    The post JOB: Digital Scholarship and Scholarly Communication Librarian, Smith College appeared first on dh+lib.

    Charles Ellwood Jones (AWOL: The Ancient World Online)

    Open Access Monograph Series: Greek, Roman, and Byzantine Monographs

    Greek, Roman, and Byzantine Monographs
    1. G. L. Huxley, Anthemius of Tralles: A Study in Later Greek Geometry.  1959. [Link]
    2. Emerson Buchanan, Aristotle’s Theory of Being. 1962. [Link]
    3. Jack L. Benson, Ancient Leros. 1963. [Link]
    4. William M. Calder III, The Inscription from Temple G at Selinus. 1963. [Link]
    5. Mervin R. Dilts, ed., Heraclidis Lembi Excerpta Politiarum.  1971. [Link]
    6. Eric G. Turner, The Papyrologist at Work.  1973. [Link]
    7. Roger S. Bagnall, The Florida Ostraka: Documents from the Roman Army in Upper Egypt.  1976. [Link]
    8. Graham Speake, A Collation of the Manuscripts of Sophocles’ Oedipus Coloneus.  1978.
    9. Kevin K. Carroll, The Parthenon Inscription.  1982. [Link]
    10. Studies Presented to Sterling Dow.  1984. [Link]
    11. Michael H. Jameson, David R. Jordan, and Roy D. Kotansky, A Lex Sacra from Selinous.  1993.

    Greek, Roman, and Byzantine Scholarly Aids
    1. Index of Passages Cited in Herbert Weir Smyth Greek Grammar.  Compiled under the direction of Walter A. Schumann.  1961. [Link]
    2. Sterling Dow, Conventions in Editing.  1969. [Link]

    Out of series
    A Generation of Antiquities: The Duke Classical Collection 1964-1994 (1994). [Link]

    AMIR: Access to Mideast and Islamic Resources

    Ottoman Diplomats : Letters from the Imperial Legation in Brussels (1849–1914)



    "Ottoman Diplomats is a digitization project of the research group Power in History: Centre for Political History at the University of Antwerp (UA). It offers online access to a selection of diplomatic documents from the Imperial Legation in Brussels (1849–1914).
    Next to telegrams and periodic diplomatic reports by Ottoman diplomatic agents (dispatches, dépêches) in Brussels and replies and instructions by their superiors in Istanbul, the collection also contains some letters by Ottoman consuls in Belgium and by Ottoman ambassadors or ministers in other European capitals.
    The letters are in French and their content is very diverse: reports dealing with Belgian and international society, politics, commerce, industry and finance, but also letters about matters of protocol, or internal issues of the Ottoman Mission."

    Source: Journalism Code, Context & Community

    Gender, Twitter, and the Value of Taking Things Apart

    By Jacob Harris


    Jake Harris's Twee-Q score

    It's impossible to deny there is serious gender inequality in the world of journalism. Similarly, the world of computer programming is marked by an even more serious skew in its gender balance. Data journalism sits firmly at the intersection of those two fields. Where does that leave us? In a state of eternal vigilance about gender balance and diversity, as it should be.

    This isn't an essay about how to address gender inequality in digital newsrooms. I don't claim to know the answers and I expect others can provide better guidance than me. For me, one essential first step is the simple act of making people aware that gender imbalances are real and have consequences on who is heard. Some developers in Sweden created Twee-Q to address awareness of the gender problem for Twitter users. The interface is pretty simple. You submit a Twitter account for analysis (yours or possibly someone else's) and it scans that account's most recent 100 tweets for all its retweets. It then checks the names of the retweeted accounts to see which are male or female (ambiguous names are ignored). Once it has a count of male and female retweets among the last 100 tweets from your account, it can tell you how much you unconsciously prefer retweeting one gender over the other in your tweets, in the hopes that you'll try to rebalance who you follow and be aware if you are favoring one gender's perspective on the world.
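
Mechanically, the calculation is simple. Here is a sketch of it in Ruby (my own reconstruction for illustration, not Twee-Q's code; the tweet shape and the `guess_gender` helper are hypothetical stand-ins):

```ruby
# Sketch of a Twee-Q-style tally: over the most recent tweets, count
# retweets by the guessed gender of the original author and report the
# male/female split. Ambiguous names are excluded from the ratio.
def retweet_gender_ratio(tweets, limit: 100)
  counts = Hash.new(0)
  tweets.first(limit).each do |t|
    next unless t[:retweet_of]                 # skip tweets that aren't retweets
    counts[guess_gender(t[:retweet_of])] += 1
  end
  total = counts[:male] + counts[:female]      # :unknown guesses are ignored
  return nil if total.zero?
  { male: counts[:male] / total.to_f, female: counts[:female] / total.to_f }
end

# Toy classifier for demonstration only; Twee-Q uses census name lists.
def guess_gender(name)
  { "Alice" => :female, "Bob" => :male }.fetch(name.split.first, :unknown)
end

ratio = retweet_gender_ratio([{ retweet_of: "Alice Smith" },
                              { retweet_of: "Bob Jones" },
                              { retweet_of: "Bob Lee" }])
# male ≈ 0.67, female ≈ 0.33
```

The interesting questions are all hidden in the two inputs: how many tweets you look at, and how the names are classified.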

    Naturally, I tried it out on myself. Oof. The screenshot above shows how dismal my own score was. I now knew I could do better. But I also immediately wanted to figure out how Twee-Q worked. Of course, this was somewhat in the hope I could identify some fault in their approach to blame for my own poor performance. But also I was interested in rebuilding it because it makes the same choices any other programmatic analysis of Twitter does: finding a balance between speed and accuracy. This project is an object lesson in the thrills and pitfalls of using tweets as data--and in the value of reverse engineering the creation of data as a way of evaluating its validity.

    The Tweequality Application

    Which is ultimately why I built my own version: so I could understand their design decisions and explore whether they affect the final analysis.

    As someone who is slightly obsessed with Twitter data, I know a lot about this. Twitter has always been notable for its extensive API access, but the service's growth has also necessitated that the company place limits. In the early days, scripts were only required to log in if they were performing "write" actions like posting a tweet or following an account; now some form of authentication--blanket credentials for an application or user-granted permission for applications to view their accounts--is required for every endpoint in the API, with varying rate limits on how much any given method can be queried in a 15-minute window. These limits can slow things down immensely for apps. For instance, Twitter's new analytics portal allows you to see the gender breakdown of your followers – for me, it reports 72% of my followers are male, 28% are female – but what if you wanted to calculate something like that for a large account like @nytimes? The bulk followers method lets you retrieve 5000 user IDs every minute. To retrieve all 12.9 million followers of the nytimes account would take 2580 minutes, or about 1.8 days. And this method only returns user IDs. To actually retrieve information about each user requires a different method that allows up to 400 user records to be downloaded per minute, which would mean waiting 22.4 days to calculate the gender breakdown of that account on your own. Worse still, some API methods contain hard limits. For instance, the user/timeline method can return a maximum of the 3200 most recent tweets for a given user. To give some context, I have posted over 85,000 tweets in my life, meaning at most 3.7% of my entire timeline can ever be analyzed by programs, and I keep reducing that proportion with every tweet I write.
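
The arithmetic behind those waiting times is worth spelling out (rates as quoted above):

```ruby
# Back-of-the-envelope fetch times under per-minute API budgets,
# using the figures quoted in the text.
def fetch_minutes(items, per_minute)
  (items / per_minute.to_f).ceil
end

followers = 12_900_000
id_minutes   = fetch_minutes(followers, 5_000)  # bulk follower-ID endpoint
user_minutes = fetch_minutes(followers, 400)    # hydrating full user records

id_minutes                        # => 2580 minutes, about 1.8 days
(user_minutes / 1440.0).round(1)  # => 22.4 days
```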

    Admittedly, @nytimes is an extreme outlier, but forcing a wait of even a few minutes for an analysis makes your apps more complex and people less likely to wait around. Furthermore, the only viable means for most applications to use the API is to request that users log in and grant them permission to read and sometimes write to their accounts. This process is somewhat convoluted, and many users might balk at the screen asking if an application can have permission to look at their account. This is why most applications choose speed over accuracy, often looking at only the most recent tweets or followers for any given account. In the case of Twee-Q, they bypass any form of application or user authentication at all and look only at the most recent 100 tweets for any account to make their calculation.

    How accurate can that be? It's a bit like owning a fitness tracker that only remembers a single day, or even just a few hours. Looking at only the most recent window of events is not necessarily wrong, but it's also not exactly the same as using a complete data set or a randomized sample. Admittedly, Twee-Q is more of a toy--albeit a toy with a social message--than a news application like we normally cover here at Source. But of course many news organizations have built similar widgets of their own. I was curious what it would be like if those API limits were different, so I tested with my own personal local Twitter API sandbox by downloading my complete tweet archive.

    Revisiting the Past

    I was able to do this thanks to a feature Twitter rolled out in the past year, the ability to download and browse your own complete archive of tweets. They provide the archive in two main formats for consumption: a basic CSV of all your tweets and a dynamic interface you can open locally in your web browser. The beauty of the latter is that it loads all of its data from a separate directory of JSON data files organized by month and year. In addition, Twitter has put some effort into identifying retweets (both automated and manual) and flags them in the data by transcluding information about the original tweet (or status in Twitter's jargon) into the JSON for the retweet message like this:

    "retweeted_status" : {
        "source" : "\u003Ca href=\"http:\/\/www.apple.com\" rel=\"nofollow\"\u003EiOS\u003C\/a\u003E",
        "entities" : { ... },
        "geo" : { },
        "id_str" : "469866710192128000",
        "text" : "Andreessen bot responds to journalism job postings with \"this should be a bot.\" http:\/\/t.co\/Sf6N5NVf1l",
        "id" : 469866710192128000,
        "created_at" : "2014-05-23 15:45:28 +0000",
        "user" : {
          "name" : "Lois Beckett",
          "screen_name" : "loisbeckett",
          "protected" : false,
          "id_str" : "21134925",
          "profile_image_url_https" : "https:\/\/pbs.twimg.com\/profile_images\/2187277560\/loispp_normal.jpg",
          "id" : 21134925,
          "verified" : true
        }
    }

    All of which makes it pretty easy to identify which tweets in your timeline are retweets and the original users who wrote them. So, that is basically how my version of the Twee-Q algorithm works. By running it against my entire archive, I can explore three questions I had about the Twee-Q algorithm:

    1. How accurate is it to guess the genders of Twitter accounts anyway? How much do bots and brands interfere with the process?
    2. What difference would it make in the final calculation if the Twee-Q algorithm was able to look back at more tweets than 100? Would a slightly larger sample have a bigger effect?
    3. Just how reliable can a single measurement on 100 tweets be? Does the ratio stay pretty consistent or vary wildly from day to day?

    To answer these, I wrote two scripts that work once you download your tweet archive from Twitter and save it to a subdirectory in the project:

    1. analyze_gender.rb: runs through the archive guessing the genders of every account retweeted. These guesses are saved to a separate CSV file in the tweets directory, with a second column that allows you to correct any miscategorized accounts.
    2. analyze_retweets.rb: runs through the archive analyzing the timeline. It first tallies the gender miscategorizations recorded in the CSV file, then analyzes all tweets before outputting two CSV files to help answer questions 2 and 3.
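
The extraction step both scripts depend on (reading the transcluded retweeted_status out of the archive JSON) can be sketched like this; the data below is a made-up miniature archive, not my real one:

```ruby
require "json"

# Pull the original author's name out of each archived retweet, using the
# "retweeted_status" object Twitter transcludes into the archive JSON.
def retweeted_authors(tweets)
  tweets.map { |t| t.dig("retweeted_status", "user", "name") }.compact
end

tweets = JSON.parse(<<~JSON)
  [
    { "text": "an original tweet" },
    { "text": "RT @loisbeckett: ...",
      "retweeted_status": { "user": { "name": "Lois Beckett",
                                      "screen_name": "loisbeckett" } } }
  ]
JSON

authors = retweeted_authors(tweets)  # => ["Lois Beckett"]
```

Everything else--gender guessing, windowing, tallying--is layered on top of this list of names.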

    The Problems with Guessing Gender

    One of the appeals of Twitter is that you don't need to share much about yourself to start talking. Twitter doesn't require users to reveal much of their personal identity, providing only a few sparse fields – a name, a short bio, location – that can be set to anything (or nothing) that users might choose. This is in such stark contrast to Facebook's ethos of capturing every possible connection between its users that it might seem ludicrous to use Twitter as a basis for any demographic study at all. However, Twitter's focus on public conversations instead of private connections is what makes it irresistible for people studying topics like political speech, hate speech, breaking news and global events. All of these case studies involve inferring some sort of demographic detail from the meager data provided by users. How well can that work?

    Gender seems like it would be an easy thing to infer at first glance. Unlike political orientation, for instance, it doesn't require looking at a user's tweets or connections; it's as simple as comparing the name on the account to lists of known male and female names and guessing based on that. Twee-Q used lists from a few national censuses. I don't have their exact list, but there is a Ruby gem named sex_machine that provides the same functionality and also meets the Ruby world's penchant for picking wildly immature names for software libraries. Given a name, Sex Machine makes a guess whether the name is male, mostly male, female, mostly female or unclear. Unclear cases could be things like brands (can you really gender The New York Times?) as well as names used relatively equally by both genders (like Courtney or Lindsey).
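
I haven't verified sex_machine's exact interface here, but the census-list idea it implements boils down to a few lines (the three-name lists below are placeholders, not real census data):

```ruby
# Toy version of a census-list gender guesser: look the first name up in
# male/female name lists; names on both lists (or neither) are unclear.
MALE   = %w[james john robert].freeze
FEMALE = %w[mary patricia jennifer].freeze

def guess(name)
  first = name.to_s.split.first.to_s.downcase
  m = MALE.include?(first)
  f = FEMALE.include?(first)
  return :unclear if m == f   # on both lists, or on neither
  m ? :male : :female
end

guess("John Smith")          # => :male
guess("Mary")                # => :female
guess("The New York Times")  # => :unclear
```

A real implementation also weights names that skew heavily but not exclusively one way ("mostly male" / "mostly female"), which is where the census frequency counts come in.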

    The analyze_genders.rb script runs the sex_machine gem on all the retweets in your archive and then dumps a CSV that can be hand-checked and corrected with the actual gender of all accounts. In the case of my own Twitter history, I was surprised to see that my gender analyzer guessed wrong for around 15% of the names it encountered. Here is the detailed breakdown of error types.

    Guessed   Actual   Count
    Female    Male        20
    Female    None        43
    Male      Female      17
    Male      None        12
    None      Female     228
    None      Male       341

    It's clear that the vast majority of those mistakes happened when the classifier was unable to guess the gender of an account rather than misidentifying a male as a female or vice versa. I noticed a few reasons why this would happen:

    • Although I tended to mostly retweet Western-style names, the gender analyzer was generally flummoxed by other types of names.
    • A sizeable number of accounts did not provide a real name at all. In these cases, I guessed the actual gender by looking at the avatar photo, but that process is obviously error-prone.
    • Of the accounts that did not provide a name, the bulk simply repeated their Twitter username in the name field. The extent of this practice surprised me.
    • In a quirk that is possibly specific to the sex_machine gem, all accounts with names starting with "The", like "The New York Times", were misidentified as female. There were not enough of these to distort the error counts significantly, but it does show one way this process can be thrown off.

    Admittedly, these observations are specific to my tweeting patterns, but the number of gender misidentifications was far higher than I expected. Moreover, the majority of errors were actually failures to guess any gender, not my script confusing male for female or vice versa. Assuming my retweets are an accurate sample of Twitter as a whole, a sizeable number of Twitter users obscure their online identities in some basic fashion. In most of these cases, a simple glance at the user avatar reveals the user's gender if it's actually the user's photo and that's not an accident. Given the aggressive nature of spambots that plague Twitter, it makes sense to be a little coy with fields a machine can parse while still being upfront with photos a human can understand. And of course, sometimes a user's identity is entirely fabricated. I doubt that @subtweetcat is an actual cat, for instance.

    Ambiguity is unavoidable. This doesn't make Twitter research meaningless. It just means that researchers should accept and be upfront with their readers that there will always be some level of fuzziness in their analysis. Twitter's own analytics tool presents an illusory certainty about the gender breakdown of my followers, with no explanation of how they handle unclear cases. Reality is much more complicated.

    A Window into My Soul

    How much does the limited window used by Twee-Q affect its results? Putting things formally, the goal of any analysis like Twee-Q's is to infer a specific characteristic about my entire Twitter usage from only a small window on my timeline. The analyze_retweets.rb script explores the question of accuracy informally by chugging through my timeline in reverse chronological order (i.e., the way any bot would) and outputting two CSV files that show:

    1. What are the effects of using an ever-larger window? This CSV recalculates the male/female retweet ratio at ever-expanding windows back from the most recent tweet to see how it changes.
    2. How variable is any given 100-tweet window? This CSV recalculates the male/female ratio for a sliding window of 100 tweets starting at each prior day in the timeline to see how wildly the score varies from one day to the next.
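
    Both CSVs reduce to the same calculation – a male/female ratio over some window of the timeline. A toy sketch of the two windowing schemes, with :m, :f and :n standing in for male, female and ungendered retweets, and a window of 4 instead of 100:

```ruby
# Toy timeline, newest tweet first: :m male, :f female, :n no gender guessed.
timeline = [:m, :f, :m, :n, :m, :f, :m, :m, :f, :m]

# Share of the gendered retweets in a window that are male.
def male_pct(window)
  gendered = window.count { |g| g != :n }
  return nil if gendered.zero?
  window.count(:m).to_f / gendered
end

# 1. Ever-expanding window back from the most recent tweet.
expanding = (1..timeline.size).map { |n| male_pct(timeline.first(n)) }

# 2. Sliding fixed-size window (4 tweets here instead of 100).
sliding = timeline.each_cons(4).map { |w| male_pct(w) }

puts expanding.last  # the "population" score over the whole timeline
```

    The last value of the expanding series is the score for the entire timeline, which is the baseline the charts below converge toward.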

    I wish I had a solid enough understanding of statistics to provide exact answers to either of these questions. Instead, I can only observe the hallowed digital journalism tradition of presenting several charts with commentary. For starters, here is the adjusted average as the look-back window increases. The Pct value represents the percentage of my retweeted accounts that are male, out of those identified as male or female; accounts with no identified gender are not included.

    [Line graph: male retweet percentage vs. size of the look-back window]

    It is no surprise that the computed percentage eventually converges to a value. In this chart, the average swings a bit wildly at first when the window is small before steadily climbing from 0.65 to a final value of 0.6992 – i.e., as I go further back in time, the percentage of my retweets that are from men increases. The straight line at the end doesn't indicate a dogged consistency from early Jake, but rather the end--or rather, the beginning--of Twitter's retweet mechanisms (manual and native retweets). But before it flatlines, the curve takes a surprisingly long time to amble upward toward the eventual average that represents the score for all of my tweets. Why? I don't know for sure, but I think the answer has something to do with the second CSV generated by that same script. It simulates the effect of running the Twee-Q calculation on each separate day of my timeline, looking at the prior 100 tweets starting from that day.


    I wrote this script originally because I was curious just how accurate looking at only 100 tweets from an account could be. Twee-Q is admittedly an extreme case, but every analysis of Twitter accounts involves the same tradeoff: each usually examines only a tiny subset of a user's tweets. We'd like to think that this little sample mirrors the properties of my entire timeline, but it is also pretty variable, as the chart makes clear. Still, could you use this approach on a few random days to average out the variation and get a better estimate? Yes, but that would probably be wrong. All of my meager statistical chops are based on the assumption that any two events are statistically independent, and these scores aren't independent. If I know the score for one day, I can make a few rough guesses about the score the next day, since that will involve some of the same retweets in the prior day's score. These scores are less like truly independent samples and more like a moving average. I'm sure there is some cool analysis to be done with this, but that's beyond my own skill level in statistics.

    An alternative is to make each sample independent. Just for kicks, I created a third CSV file sample.csv that repeatedly picks a random 100 tweets from my entire timeline and computes the male/female retweet percentage from that selection. Each of these runs is truly independent from the next, which makes things a bit more palatable for statistical analyses. And indeed, if you compare histograms of the spread between the sliding and sampled versions, the latter more closely resembles the normal curve we would expect, while the former is just unbalanced and weird. This affects the resulting calculations too. Contrast the summaries of these two approaches:
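
    A sketch of that resampling idea, again on a toy timeline (drawing 5 tweets instead of 100; sample.csv's actual code may differ):

```ruby
# Draw repeated independent random samples (5 tweets here instead of 100)
# from the whole timeline and recompute the ratio for each draw.
timeline = [:m, :f, :m, :n, :m, :f, :m, :m, :f, :m]

def male_pct(window)
  gendered = window.count { |g| g != :n }
  return nil if gendered.zero?
  window.count(:m).to_f / gendered
end

rng = Random.new(42)  # fixed seed so the run is reproducible
samples = Array.new(1000) { male_pct(timeline.sample(5, random: rng)) }.compact

mean = samples.sum / samples.size
puts mean  # sits near the population ratio of 6/9
```

    Because each draw is independent, the spread of these scores behaves like a textbook sampling distribution, which is exactly what the day-by-day sliding-window scores do not do.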

    Method    Min.    1st Qu.  Median  Mean    3rd Qu.  Max.    Std. Dev.
    Sliding   0.2162  0.6111   0.6857  0.6798  0.7500   1.0000  0.1153777
    Sampled   0.2941  0.6316   0.7037  0.7002  0.7692   1.0000  0.1012754

    The resampled version has less spread and an average that is far closer to the correct population average of 0.6992 that the first CSV converged to. A difference of roughly 0.02 might seem like no big deal, and indeed this is a bit silly, but it seems important too. The sliding method simulates exactly what would happen if I ran Twee-Q's query against my timeline on successive days, and it yields a worse result than true sampling, yet it's the only automatic mechanism that Twitter provides for applications.

    So What?

    How much do you retweet men vs. women? What percentage of your followers are bots? Did Mitt Romney buy Twitter followers? What kind of personality do you have? These are all questions that have been asked about Twitter by recent tools and stories. And all of them have inferred their answers by looking at only a limited selection of a user's Twitter information, whether it's the 1,000 most recent tweets, the 5,000 most recent followers, or the like. Could we do better?

    In an ideal world, the Twitter API would provide sampling equivalents to the user_timeline, favorites/list, followers/list, search/tweets and any other endpoints whose API limits are quickly exhausted for any notable account. There already is a sampling endpoint for the streaming API, although some have questioned whether that sampling is truly unbiased. Sampling versions of these other endpoints would make it much easier to answer questions like those above accurately, particularly when accounts of interest have scaled far beyond what the API restrictions allow to be investigated in a reasonable time period.

    Without that, what are the best practices for deriving data from Twitter? Can we fake sampling the data with the tools we have? Is looking at 100 tweets good enough? Would picking 400, for instance, be much better? Statistics suggests that's only twice as good, but would the improvement be worth the extra API usage? Are there tricks we can try with the search API to get past the hard limits of the user_timeline method? What are some other pitfalls we should be wary of when using social media data? These are good questions best tackled by someone much more talented at statistics than most journalists playing around with Twitter (including me). Luckily, we are not alone. Twitter has overwhelmingly become the platform of choice for researchers investigating social media, and with that has come some honest acknowledgment of big problems and possible approaches. Maybe a team of social researchers, programmers, and journalists could figure out the best ways to answer the same questions we find ourselves asking about Twitter.

    Finally, I think it's important to remember that Twitter users are (mostly) human beings who have express reasons for preserving their privacy. I originally wanted to share my corrected list of genders for the Twitter users I've retweeted under the ethos of "showing my work." But doing that would mean sharing a machine-readable file that bypasses some of the obfuscation that Twitter users have chosen precisely to avoid being easily analyzed by spambots and other programs. One can imagine other situations involving Twitter where sharing the data might mean inadvertently becoming part of the problem. Simply put, are there ways we can be as transparent as possible about our work while also respecting the privacy of the users we are researching? What about our tools? How responsible are we if someone uses our Twitter widget to report "facts" about other Twitter accounts derived from opaque methodologies? Again, I hope there is some insight from other social research fields on how to balance the need to protect user privacy with being upfront about our own needs.

    Center for History and New Media

    IMLS Funds Omeka Everywhere

    The Roy Rosenzweig Center for History and New Media at George Mason University, in partnership with Ideum and the University of Connecticut’s Digital Media Center, is pleased to announce that it has been awarded a National Leadership Grant for Museums from the Institute of Museum and Library Services to create Omeka Everywhere. Dramatically increasing the possibilities for visitor access to collections, Omeka Everywhere will offer a simple, cost-effective solution for connecting online web content and in-gallery multi-sensory experiences, affordable to museums of all sizes and missions, by capitalizing on the strengths of two successful collections-based open-source software projects: Omeka and Open Exhibits.

    Currently, museums are expected to engage with visitors, share content, and offer digitally-enabled experiences everywhere: in the museum, on the Web, and on social media networks. These ever-increasing expectations, from visitors to museum administrators, place a heavy burden on the individuals creating and maintaining these digital experiences. Content experts and museum technologists often become responsible for multiple systems that do not integrate with one another. Within the bounds of a tight budget, it is increasingly difficult for institutions to meet visitors’ expectations and to establish a cohesive digital strategy. Omeka Everywhere will provide a solution to these difficulties by developing a set of software packages—including Collections Viewer templates, mobile and touch tablet applications, and the Heist application—that bring digital collections hosted in Omeka into new spaces, enabling new kinds of visitor interactions.

    Omeka Everywhere will expand audiences for museum-focused publicly-funded open source software projects by demonstrating how institutions of all sizes and budgets can implement next-generation computer exhibit elements into current and new exhibition spaces. Streamlining the workflows for creating and sharing digital content with online and onsite visitors, the project will empower smaller museums to rethink what is possible to implement on a shoestring budget. By enabling multi-touch and 3D interactive technologies on the museum floor, museums will reinvigorate interest in their exhibitions by offering on-site visitors unique experiences that connect them with the heart of the institution—their collections.

    AMIR: Access to Mideast and Islamic Resources

    Open Access Manuscripts Library: St. Cyril and Methodius Digital Library

    "The digitization of valuable materials from the St. St. Cyril and Methodius National Library collections started in 2006. In autumn 2007 these digital images became accessible to users through the library’s website. By the beginning of 2014, nearly 330,000 files had been digitized and included in the Digital Library – manuscripts, old printed books, unpublished documents from the Bulgarian Historical Archive and Oriental Department, portraits and photos, graphical and cartographical editions, Bulgarian newspapers and journals from 1844 to 1944 – images and their metadata. Users can search the electronic archive through the specialized system DocuWare. The digitized originals are freely accessible to users.
     The Digital Library is structured in several collections according to the type and the chronological scope of the included documents. The different collections are subdivided into separate sections..."

    St. Cyril and Methodius Digital Library

    Bill Caraher (The New Archaeology of the Mediterranean World)

    Craft in Archaeology: The Craft of Pottery Analysis in Mediterranean Archaeology

    This is the second installment in a series of blog posts focusing on craft in archaeology. Here’s a link to the call for submissions. The posts will explore craft in archaeology from the perspective of field practices, analytical and interpretative frameworks, and social impacts on the discipline. The posts will appear every Thursday for as long as we get contributions and will be compiled into an e-book by The Digital Press at the University of North Dakota.

    Scott Gallimore, Wilfrid Laurier University

    The idea of archaeology as craft is intriguing. Archaeology is a discipline which combines a number of elements from the humanities, social sciences, and sciences. We ‘borrow’ methodologies perhaps more so than any other field, combining them to form a coherent body of theory and method. Archaeology is not standardized across the world, however, and a number of sub-fields exist, divided by geographical and chronological boundaries. Classical Archaeology is interesting to consider in the context of craft, given its strong historical connections to art history and antiquarianism. Scientific perspectives, such as New Archaeology, have not had as strong an influence on classicists as in other areas of archaeology. How does this affect our view of Classical Archaeology as a craft?

    This post will focus on one aspect of Classical Archaeology as craft: the analysis of pottery. Ceramic specialists are an important component of any project, often dealing with the most robust and copious body of material collected. In many ways, pottery analysis adheres strongly to ideas of craft as they are espoused in the article by Shanks and McGuire (1996). Consideration of the use of an apprentice structure for training specialists, the increasing integration of technology, and the place of pottery specialists within the hierarchy of archaeology, for instance, sheds light on this issue. The analysis of pottery in its present form arose out of nineteenth century methodologies and, in many ways, adheres to this structure. It is not stagnant, however, and is a craft that continues to evolve. The discussion below will hopefully show the benefits and difficulties with this evolution.

    An Apprentice Structure

    Since the formalization of Classical Archaeology as a discipline, the training of scholars to become specialists in the analysis of particular materials has followed an apprentice system. Pottery analysis, for example, relies on passing knowledge from experienced to non-experienced scholars. This adheres to the ‘traditional scheme of archaeological fieldwork’ according to Shanks and McGuire (1996: 84). Labs often have one or two trained specialists who are assisted by students. The students gain experience working with the ‘masters’ and in some cases may even become specialists themselves.

    Shanks and McGuire note that the alternative to the apprentice structure, the factory model, has developed mainly within the jurisdiction of contract and rescue archaeology (1996: 84). The apprentice structure, which promotes the training of students as one of its primary goals, is not as effective in the context of Cultural Resource Management. Instead, it is preferable to hire pre-trained students who can then be assigned to various tasks that in combination bring a project to an efficient completion. Proponents of this model within the academy tend to be associated with the New Archaeology, with its greater emphasis on scientific approaches to the discipline.

    For Mediterranean archaeology, is a factory model feasible? The primary goal of this model is to increase efficiency by standardizing the methodology and dividing tasks across a series of workers. It favors a top-down structure where the project director or directors would be the only ones familiar with every aspect of the work. To some extent, some aspects of this model already exist within our own field. I am likely not alone in sometimes feeling separated from many components of a project by spending most days in a lab. There is a disconnect that arises from focusing on a specific set of data collected by a project. On most projects in the Mediterranean, however, it is unlikely that many individuals have a command of every task being completed. Directors often spend most of their time in the field or in the lab and may not be familiar with the other. Most directors do not view themselves as CEOs of a project who require oversight of every minute detail. Thus, even though pottery specialists may feel marginalized at times, we are not alone in this feeling.

    Pottery analysts are also moving toward a greater degree of standardization. This is particularly true for the study of fabric, as noted below, and is apparent in other ways, such as the use of distinct terminology. We can only push this so far, however. For decades, pottery analysis was not a standardized field, and the number of unique typologies and descriptive methodologies that arose makes almost any overarching standardization impossible. The study of Roman pottery has many examples of this phenomenon. If we take amphorae, for instance, many of the most common vessel types encountered across the Mediterranean have a remarkable number of names. The Kapitän II, a third–fourth century A.D. wine container perhaps produced in or around Asia Minor, is also known as the Niederbieber 77, the Peacock and Williams class 47, the Benghazi Middle Roman 7, the Zeest 79, the Kuzmanov 7, and the Hollow Foot Amphora. Trying to research a vessel type when it is part of so many different typologies can almost be an act of futility. It also suggests that no matter how much standardization is introduced into pottery analysis, there must also be flexibility to engage with these historical precedents and to train students in understanding the complex past of the discipline.

    The history of pottery analysis in Mediterranean archaeology indicates that an apprentice system is still the best system for training individuals to study this material. Hands-on practical experience under the supervision of an experienced instructor is necessary both for learning about the standardized practices that are now in use and about the myriad variations to these practices that appear in older publications and that are still relevant to the field today.

    Technology and Pottery Analysis

    In his proposal for this blog series, Bill Caraher noted that one significant issue for understanding the role of craft in archaeology is the ever-increasing presence of technology. He asked whether the use of this technology could ‘…marginalize opportunities for engagements grounded in craft.’ Pottery analysis is not immune to the technological revolution. Consideration of how this affects ceramic specialists is lacking, however. One risk with engaging more with technology is that it will erode away traditional skill bases in favor of more efficient (but not necessarily more effective) methodologies. Assessment of the types of technology employed by pottery analysts, and their goals in doing so, suggests an opposing view. Use of technology may actually augment the skills we are required to possess since effective use of this equipment requires keen understanding of the material we are studying.

    An example of the interaction between pottery specialists and technology can be found in the study of fabric. In the preface to their book Amphorae and the Roman Economy: An Introductory Guide, David Peacock and David Williams make the following comment:

    Another feature of this book is the stress upon fabrics as well as forms, because we feel that a consideration of both facets is essential if amphorae are to be identified with the precision that now seems necessary in economic analysis. We make no apology for including details of the characteristics of fabrics as they appear in the hand specimen and under the microscope, for this aspect is all too often neglected (1986: xvii).

    For a pottery specialist working in the Mediterranean today, the assertiveness of Peacock and Williams’ view toward including details about fabrics is surprising. Now it would be the scholar who does not engage in fabric analysis who would have to apologize and justify his or her position. The study of fabric has become an essential component of ceramic analysis and one that has been aided greatly by technological innovation.

    A number of archaeometric methods, both chemical and mineralogical, have been brought to bear for analysis of fabrics. Petrography is the most ubiquitous. Developed originally as a tool for studying soil and stone, petrography has a long history in the study of archaeological ceramics. Anna O. Shepard was an early proponent during her work in the southwestern United States (1942). Petrography was also in use by Classical Archaeologists around the same time (e.g. Felts 1942). The technique was not widespread, however. It is only within the past two decades that Mediterranean pottery specialists have come to include petrography as a standard part of their analytical program. Much of this is owed to Peacock who promoted the advantages of petrographic analysis in much of his early scholarship (e.g. 1970: 379).

    Petrography has several advantages over other archaeometric techniques. It is relatively inexpensive, for instance, which is an important consideration within the present climate of dwindling funding.  The technique also provides a wider array of data about ceramics than most archaeometric methods, a detail noted by Peacock: ‘…the potential of petrology has been widely appreciated but recently other methods, more readily automated, seem to be favoured, even though the results may not have the same range of archaeological implications’ (1977: viii). In addition to providing information about the fabric that can lead to determinations of provenance, petrography can shed light on manufacturing processes, including the selection of raw materials, firing techniques, forming processes, and decoration (Peterson 2009: 2). More data is never a bad thing, which is perhaps why petrography has become so popular.

    We must also consider Peacock’s comment about many of these techniques being automated. In other words, to what extent do pottery specialists actually engage with this technology? Petrography is again an interesting example since much of the analysis is done by trained petrographers and not by pottery specialists. We see the results of the study and incorporate them into our own analysis of the finds, but do not necessarily stare into the microscope on a regular basis. How does this affect our view of pottery as a craft? Is there a risk that archaeometric methods like petrography are beginning to replace the need for qualified specialists to examine ceramic assemblages? The answer to the latter question must be no. We can consider a scenario, for instance, where a ceramic assemblage is laid out with the intention of taking samples for petrographic analysis. A pottery specialist trained in analysing macroscopic qualities of fabric and shape is far more effective at selecting a representative sample of sherds from the pile. Moreover, the increased desire to use scientific techniques for studying pottery now requires pottery analysts to be much more vigilant in their study of the material. Detailed descriptions of fabric are now the norm in addition to careful division of the assemblage into known and unknown fabrics, with further subdivisions based on identified or suspected regions of production.

    The need for more standardization and greater detail in fabric analysis is of great benefit to the discipline. One element of pottery studies that has always been frustrating is the poor quality of macroscopic fabric descriptions in much of the literature. They tend to relate vague overviews of color, inclusions, and texture. Comparing such descriptions to material under analysis or across different publications often proves disappointing. Efforts to develop standardized descriptions are helping to alleviate this, and the increasing publication of petrographic data and photographs of fabrics facilitates comparisons between sites and regions. Portable digital microscopes have also been helpful for improving the quality of fabric photographs in publications.

    Concern that technology may erode the skills of individuals engaged in pottery analysis is not tenable. Even if pottery specialists do not engage with this technology directly, effective use of these methods prompts pottery specialists to improve their own descriptions and analyses of the material to ensure the best data possible is obtained by use of these techniques. Barring the invention of a Star Trek-like scanner that instantly provides all necessary details about a sherd, no amount of technology will replace the need for trained specialists to examine material by hand in a lab. Thus, the craft of pottery analysis should continue to exist in its present form for at least the near future.

    A Field Divided

    The use of technology may be beneficial to pottery analysis as a craft, but there are other issues to consider. One topic that appears several times within the article by Shanks and McGuire is the degree of hierarchy present within archaeology (1996: 82, 84). They observe that ‘we divide the practice of archaeology into those of us who manage and sit on committees, synthesize, generalize, and theorize and those of us who sort, dig, and identify’ (1996: 82). Pottery analysis would tend to fall into the latter category. Since the term ‘hierarchy’ has connotations of rank and status, a fact discussed explicitly by Shanks and McGuire, we must consider how this affects our view of pottery analysis as a craft.

    At a basic level, there are three primary goals during the analysis of a ceramic assemblage. All relate to types of data that can be extracted from the material: chronology, function, economy. Pottery is the most important tool for dating in both excavation and survey. The make-up of an assemblage provides information about activities carried out at a site or within a specific structure and the origin of this material can shed light on economic patterns. Pottery specialists collect and organize this data. What happens afterward is where issues of hierarchy come into play.

    Standard models of publication in Mediterranean archaeology would seem to support Shanks and McGuire’s view of an established hierarchy. In multi-author site reports, analysts present their data, but rarely offer significant synthesis of this material. That synthesis is left for project directors or other scholars who pull together disparate strands of information. Even when site reports involve multiple volumes, with artifact classes presented as separate monographs, pottery specialists often do themselves a disservice. A typical pottery volume in Mediterranean archaeology is organized into a contextual introduction that describes the project in question, a detailed catalogue of finds, and a succinct overview of economic implications. It is the final section which reinforces the position of pottery specialists more as identifiers rather than synthesizers. Those final sections tend to range from several paragraphs to several pages and rarely go beyond a superficial treatment of the material. Detailed synthesis is left for other volumes in a series or for other scholars engaged in overarching studies of a region or period. There are a few exceptions (e.g. Peña 1999), but most studies fall into this type.

    The analysis of pottery is a craft that requires mastery of a number of different skills. Focus on typology, chronology, function, and provenance, however, can serve as a barrier to moving beyond description into more detailed interpretation. Time constraints are also relevant since it takes a significant amount of time to process the hundreds, if not thousands, of kilograms of pottery produced by many projects. As the ability to obtain permits becomes more difficult across the Mediterranean and with pressure mounting to disseminate results more quickly, limitations on time, and thus on the ability of pottery specialists to interpret the collected data, will only increase.

    At the end of their article, Shanks and McGuire argue that archaeologists have an ‘…obligation to take responsibility for what we do and produce’ (1996: 85–6). Pottery analysts working in the Greco-Roman world do appear well aware of their purpose within an archaeological project. We produce vital data to complement and augment interpretations developed out of field work and the processing of other materials. The question I am asking here, though, is whether pottery specialists should take on more of the responsibility for interpreting this data. We have the closest connections to this material, engaging with it day after day. Is it not possible for the identifiers to also be synthesizers and vice versa?

    The hierarchy and strict division of archaeologists into different specialists has also led to another critique of the discipline. In an article from the late 1990s, Penelope Allison addressed one of the problems inherent in the analysis of material culture by archaeologists. She began with a concise summary of the standard procedure with which artifact analysis is approached by Mediterranean archaeologists:

    At present, a common pattern of post-excavation activity is to divide the excavated artefacts into what are now well-established categories. Each category is then assigned to a different “finds specialist” for organisation into a typology which is ultimately published in the excavation report. The categories are largely selected on criteria attributable to the formal or manufacturing characteristics of the artefacts (1997: 77).

    Allison’s main critique is that this methodology does not reflect how objects and individuals interacted in antiquity. In other words, separating pottery from glass, bone, architecture, etc. hinders rather than helps us to reconstruct ancient behavior. It was Allison’s own frustrations in reading through countless site and artifact reports during a study of households at Pompeii that led to this appraisal. A related difficulty is the fact that after pottery, most artifact classes are relegated to the category of small finds and given far less rigorous treatment. This pattern has been steadily changing over the past few decades, in no small part thanks to a book published by James Deetz on the importance of small finds in American archaeology (1977), but the disparity is still evident.

    For Allison, a more appropriate procedure involves a holistic approach to studying the archaeological record. All material culture, including pottery, should be analyzed and presented together. She advocates the use of database management programs to organize these vast and disparate sets of data, a process which has now become standard practice for many archaeological projects. Scholars interested in domestic architecture have been the primary proponents of Allison’s ideas, following her seminal study of Pompeian households (2004). This includes Brad Ault’s work at Halieis in Greece (2005) and Ben Costello’s recent study of the Earthquake House at Kourion, Cyprus (2014). Most field projects, however, continue to separate their finds and bring in multiple specialists, who are not always present at the same time. 

    Allison proposed this alternative to traditional practice in Mediterranean archaeology fifteen years ago, but pottery analysis has seen little movement away from its traditional structure. It is a sub-field whose skill set has expanded over the past two decades with the greater integration of technology. The accusation that pottery specialists are myopic in studying a single class of artifacts is perhaps tenable, but is myopia a bad thing if it means the ability to extract the maximum amount of information from a ceramic assemblage? Can an individual who spends equal time learning about ancient pottery, glass, bone, metal, wall painting, architecture, etc., be expected to understand ceramic fabrics at the level currently expected among pottery specialists? Will becoming a ‘Jack of all trades, but master of none’ improve our overall ability to understand the archaeology and history of the ancient world?

    These questions are difficult to answer and in many ways require much more discussion and debate among archaeologists. There are palpable benefits to the approach espoused by Allison, but there is also a risk that the skills of pottery and other specialists would erode if they were required or expected to become knowledgeable about numerous classes of archaeological material. Allison’s call for the use of database management programs may provide the best compromise. The use of tablets, for instance, allows members of an archaeological project to access a variety of data, often updated in real time, that bring together disparate elements of a project into a more cohesive whole. Pottery specialists can quickly scan the details of an excavated deposit before reading the material. Excavators can assess the chronology of layers already dug to help them understand the stratigraphy of deposits while still in the field. Perhaps breaking down boundaries in Mediterranean archaeology should focus more on sharing information than on blurring the lines between specialized knowledge. As a craft we have come to rely greatly on our degree of specialization. Other types of finds should be given more robust treatment, but this should not constrain the need for detailed analyses of ceramic assemblages.
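    To make the compromise concrete, the kind of shared, cross-material finds database described above can be sketched in a few lines. This is a minimal illustration only: the table layout, field names, and sample records are my own assumptions, not a schema from Allison or any published project.

```python
# A minimal sketch of a unified finds database: one table for all material
# classes, queryable by excavation context, so excavators and specialists
# consult the same records. All field names and records are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE finds (
        id INTEGER PRIMARY KEY,
        context TEXT NOT NULL,      -- excavated deposit / locus
        material TEXT NOT NULL,     -- pottery, glass, bone, metal, ...
        description TEXT,
        date_range TEXT             -- specialist's dating of the object
    )
""")
rows = [
    ("Locus 12", "pottery", "African Red Slip bowl rim", "c. AD 350-450"),
    ("Locus 12", "glass", "unguentarium fragment", "c. AD 300-400"),
    ("Locus 14", "pottery", "Dressel 20 amphora handle", "c. AD 50-250"),
]
conn.executemany(
    "INSERT INTO finds (context, material, description, date_range) "
    "VALUES (?, ?, ?, ?)",
    rows,
)

# An excavator checking the dating evidence for a single deposit,
# across all material classes at once:
for material, desc, dating in conn.execute(
    "SELECT material, description, date_range FROM finds WHERE context = ?",
    ("Locus 12",),
):
    print(material, desc, dating)
```

    The point is not the technology, which is trivial, but the workflow: each specialist writes to the same store, and anyone on the project can query a deposit holistically rather than waiting for separate finds reports.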


    Pottery specialists working in the Mediterranean are achieving a greater degree of professionalization as they develop more standardized protocols and methodologies. There is also a strong element of craft within the field of pottery analysis in Classical Archaeology. These elements are not mutually exclusive and their combination enables ceramic experts to provide robust data for use by other members of their projects. Further development of this craft is possible, particularly with respect to pottery analysts taking on a greater role as synthesizers. Shanks and McGuire note that ‘Craft is productive work for a purpose’ (1996: 78). Pottery analysis in Classical Archaeology adheres to this definition and, in its current manifestation, is successful at justifying its purpose.


    Allison, P.M. 1997. ‘Why do Excavation Reports have Finds’ Catalogues?’ In Not So Much a Pot, More a Way of Life: Current Approaches to Artefact Analysis in Archaeology, C.G. Cumberpatch and P.W. Blinkhorn (eds.). Oxford: Oxbow Books, 77–84.

    Allison, P.M. 2004. Pompeian Households: An Analysis of the Material Culture. Los Angeles: Cotsen Institute of Archaeology, UCLA.

    Ault, B.A. 2005. The Excavations of Ancient Halieis, Volume 2. The Houses: The Organization and Use of Domestic Space. Bloomington: Indiana University Press.

    Costello IV, B. 2014. Architecture and Material Culture from the Earthquake House at Kourion, Cyprus (BAR Int. Ser. 2635). Oxford: Archaeopress.

    Deetz, J. 1977. In Small Things Forgotten: An Archaeology of Early American Life. Garden City, NY: Anchor Press/Doubleday.

    Felts, W.M. 1942. ‘A Petrographic Examination of Potsherds from Ancient Troy’. American Journal of Archaeology 46: 237–44.

    Peacock, D.P.S. 1970. ‘The Scientific Analysis of Ancient Ceramics: A Review’. World Archaeology 1: 375–89.

    Peacock, D.P.S. 1977. ‘Preface’, in Pottery and Early Commerce: Characterization and Trade in Roman and Later Ceramics, D.P.S. Peacock (ed.). London: Academic Press, vii–viii.

    Peacock, D.P.S. and D.F. Williams. 1986. Amphorae and the Roman Economy: An Introductory Guide. London and New York: Longman.

    Peña, J.T. 1999. The Urban Economy during the Early Dominate: Pottery Evidence from the Palatine Hill (BAR Int. Ser. 784). Oxford: Archaeopress.

    Peterson, S.E. 2009. Thin-Section Petrography of Ceramic Materials. Philadelphia: INSTAP Academic Press.

    Shanks, M. and R.H. McGuire. 1996. ‘The Craft of Archaeology’. American Antiquity 61: 75–88.

    Shepard, A.O. 1942. Rio Grande Glaze Paint Ware: A Study Illustrating the Place of Ceramic Technological Analysis in Archaeological Research. Washington, D.C.: Carnegie Institution of Washington.

    Juan Garcés (Digitised Manuscripts Blog)

    Languages in Medieval Britain

    We are proud to announce that the Catholicon Anglicum is now being exhibited in our Treasures Gallery. The British Library acquired the manuscript, the only complete copy of the text in existence, in February this year, for £92,500, following the temporary deferral of an export licence. It had lain hidden...

    Gabriel Bodard, et al. (Standards for Networking Ancient Prosopographies)

    Are you a prosopography?

    At the SNAP:DRGN project meeting in Edinburgh a few weeks ago, we decided on a couple of definitions that will impact on the ways in which partner datasets interact with the project. Our current thinking is that we need to distinguish between two kinds of data:

    (1) The first kind, which we’ll loosely call a “prosopography”, is a curated database of person records, with some ambition to serve as an authority list. Prosopographies such as PIR, Broughton, PBW, etc. would be obvious examples of this category, as would the controlled vocabulary of persons in a library catalog like VIAF, Zenon, British Museum persons, Trismegistos Authors, the Perseus Catalog, etc. Even if the task of co-referencing persons is incomplete (as with Trismegistos, say), the intention to disambiguate qualifies the dataset as a “prosopography”.

    (2) The second, which we call a “list of attestations”, is not comprehensively curated or disambiguated in this way, and has no ambition of being an authority list. Examples of this kind of dataset (as I understand them) would include: the EDH person table; the raw list of name references Mark has extracted from Latin inscriptions; the tagged and indexed “names and titles” in the texts of the Inscriptions of Aphrodisias or Inscriptions of Roman Tripolitania.

    In the SNAP:DRGN workflow, we hope that all “prosopographies” of type 1 will be contributed into the SNAP graph. We shall assign SNAP URIs to all persons in the datasets, and in time work to co-reference and merge them with persons sourced from other projects as far as possible. These will form the authority file to which other datasets will refer, and we would recommend that lists of “attestations” of type 2 use Pelagios-style OAC annotations (*) to point to the SNAP identifiers as a way of disambiguating their person-references. The process of disambiguating and/or co-referencing persons in this way might eventually lead some lists of annotations to become disambiguated prosopographies in our schema, at which point we would potentially want to include them in the SNAP graph as first class entities.
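    The annotation pattern described above can be sketched as a small helper that emits a Turtle snippet in the Open Annotation model: the target is the name attestation in its source text, and the body is the SNAP person it is claimed to refer to. All URIs below are hypothetical placeholders, not real SNAP:DRGN identifiers, and the exact shape of the official guidelines (“Scenario 5” in the Cookbook) may differ.

```python
# Sketch of a Pelagios-style OAC annotation pointing a "type 2" name
# attestation at a SNAP person URI. URIs are invented placeholders.
ATTESTATION = "http://example.org/inscriptions/123#person-1"  # name in its text
SNAP_PERSON = "http://example.org/snap/person/456"            # authority record

def make_annotation(ann_uri: str, target: str, body: str) -> str:
    """Return a Turtle snippet asserting that the attestation (target)
    refers to the SNAP person (body), using Open Annotation properties."""
    return (
        f"<{ann_uri}> a <http://www.w3.org/ns/oa#Annotation> ;\n"
        f"    <http://www.w3.org/ns/oa#hasTarget> <{target}> ;\n"
        f"    <http://www.w3.org/ns/oa#hasBody> <{body}> .\n"
    )

print(make_annotation("http://example.org/annotations/1",
                      ATTESTATION, SNAP_PERSON))
```

    The design point is that the attestation dataset never has to assert identity internally; it simply annotates its own records with pointers into the SNAP authority file, and those annotations can later be revised or enriched without touching the source corpus.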

    (*) We hope to have the SNAP:DRGN guidelines for these Pelagios-like annotations (“Scenario 5” in our Cookbook) available very shortly.

    Archeomatica: Tecnologie per i Beni Culturali

    Regulations published for obtaining the professional qualification of restorer and assistant restorer of cultural heritage

    Following the approval of the implementing guidelines for Art. 182 of the Codice dei Beni Culturali e del Paesaggio, the call for applications for the qualification of Collaboratore Restauratore was published on 12 September 2014 and the related procedure has been launched.