The wraparound is a full, year-long project: you take a class in the fall, you do three weeks of fieldwork in January, and then you come back for spring semester and analyze the data.
When you're in the field doing your interviews, I want you to be focusing on what people say, verifying and validating what they say.
And so we have a cohort of students, and they're not necessarily going to one place; this year they were planning for three. So we had Nepal, Peru, and Salinas. And they're trying to develop the tools they're gonna need to actually be down there. So that means they develop a survey, they meet with clients.
And so that's a long process. In the meantime, we're learning about semi-structured interviews. We're learning about being in the field and having little impact beyond gathering the information we're there to get.
A lot of my undergraduate courses were purely theoretical. So we talked about development, and how you can build up the middle class.
These are very abstract constructs, whereas here at the Middlebury Institute, they ground everything in experiential learning. So we not only theorized about development, we were actually able to go in the field, work with a local client, and see what it was like to actually design programming to solve real issues.
So in this case, our survey would inform their programming on waste management in the Dun region of Nepal. So it was very specific, very tangible. And in that sense, it gave us a better idea of what we could do as development workers.
When they come back, they're taking advanced policy analysis.
And we call it advanced because it's the next level: you're not just learning about policy analysis, you're doing the analysis. They're taking all the data they got, and they've got to clean it up, get it ready for analysis, analyze it, make sense of it, and then get a product out to their client.
This is working with real data, so it's not like one of those canned exercises where you say, okay, here's a transcript of a pretend interview, analyze it and see what you can get out of it. Instead it's messy, and there are a lot of things they didn't even anticipate, things they should have gotten but didn't.
And they're gonna rue the day they ignored what somebody was saying, or they're gonna really wish they had pushed a little bit harder and gotten ten more interviews or surveys. So they see the holes, they see the things they would have liked to do, and they see where the tool fails.
And they'll see what they would like to do better the next time they go into the field. So what they're getting is a good look at why they designed their tools the way they did in the field methods course, at the problems they had once they got into J-term, and at what they can still make of the data, even then.
For me it's really satisfying to see the data all the way through, to go from the design process to the analysis and then to submitting a deliverable to our client. Not many universities have programs like this, and I think that's something that's unique to MIIS.
They're really preparing themselves to be future leaders in the field.
And we don't expect anybody to be a seasoned field specialist when they get out there. But they're a lot closer than they would have been if we had just done a regular research methods course.