Author Archives: nickhopwood

About nickhopwood

I'm a Senior Research Fellow at the University of Technology, Sydney (UTS). I am interested in learning and pedagogy, ethnography, and practice theory (especially in relation to times, spaces, bodies, and things). I also blog about academic work, research perspectives, methodology and design. Follow me on twitter @NHopUTS

Selective lowlights from my many rejections

This post relates to my rejection wall blog, tweet, and series of videos about rejection in academia.

My rejection wall is there for all to see – ‘all’ being people who happen to come to my office.

While there is fun to be poked at the ridiculousness of academic rejection, there is a serious point to it, so this blog post makes the details of my rejection wall available for all to see, and adds some commentary. This isn’t all my rejections, just some of the juicier bits – selected lowlights if you like – that I think might be the most rewarding for others to read. There are other examples of people doing this in blogs and doing something similar in lab meetings (both of which I think are awesome!).

Maybe reading it just makes you feel a bit better by seeing how crap I’ve had it on several occasions with grumpy journal reviewers and months of grant-writing that came to nothing.

Maybe others can learn from my mistakes (and there have been plenty) – but I can’t promise that reading what follows will avert the inevitable misfortune of an unwarranted or nastily delivered academic rejection.

I have divided it up into rejected research proposals and rejected journal articles.

Some research rejections

Check out my video on why research grant applications get rejected, and then spot lots of those reasons in my failures below 🙂

[Screenshot of reviewer comments from a rejected grant application]

It was clear that these reviewers really didn’t think our project should be funded. The language needs to be glowing to even stand a chance. Looking back at the proposal now, I can see where the reviewers were coming from. I think we suffered a bit from not having met regularly as a team to discuss things, and also a bit of group-think: we tended to see and say what we liked about what we were doing. There was no ‘red team’ asking the awkward questions. And I agree (now) that we framed the project around an issue that wasn’t obviously worth caring about in itself. And we didn’t make a good enough case for alignment in what we were proposing.

 

[Screenshot of reviewer comments from a rejected grant application]

This was another big team effort. I think part of our problem was that the proposal swelled and got more complex as we fed in bits that represented what each of us offered (read: wanted to do so we all felt important). The team was very diverse – and we all felt we needed each other. None of us, nor any subset of us, could have made a case alone. But somehow it all became too much. Hence the relationship between parts being weak. The point about not offering clear understanding reflects this general problem, plus a second major weakness: we were not in the bullseye of what this particular funding round was about. We were not giving the funders what they wanted.

[Screenshot of reviewer comments from a rejected grant application]

This was a proposal to a funding body specifically about women’s safety. To this day I think our basic idea was a good one: to do a kind of action research. However, with reviewer comments like this, our proposal flew pretty wide of the target. We went too heavy on unfamiliar theory and they couldn’t see how it would work in practice. They also couldn’t see how it would generalise. Lacking content was a big no-no – too much methodology. At the same time, we didn’t give enough about the methods in terms of site details. And then we fell foul of the feasibility hurdle. So we misfired on multiple fronts.

[Screenshot of reviewer comments from a rejected grant application]

Several more months of work down the tubes with this one! Among the many issues the reviewers found, the two above were the most catastrophic. Being ambitious is okay if your project looks really feasible and the reviewers don’t get lost in complexity. In this case we failed on both counts. And then we failed to make the case for adding to knowledge. Who in their right mind would fund something that wasn’t going to find something new? I still think the project as it existed in my mind would have been original and great. But what was in my mind clearly didn’t reach the reviewers. Finally, the ‘reality check’ was a vicious blow! But it pointed to how wrong we got it. The reviewer felt we had an over-inflated budget to produce a measly evidence base that wasn’t going to reveal anything new. Brutal.

[Screenshot of reviewer comments from a rejected grant application]

Ah – not giving concrete details about the methodology. That old chestnut. Old it might be, but a disaster for this funding proposal! I realise no-one is going to give out money – even for a study on a really important topic by brilliant people – if there isn’t a clear, feasible and persuasive business plan for how it is going to be pulled off (on time, on budget). The methodology section is key to this.

[Screenshot of reviewer comments from a rejected grant application]

Again falling foul of methodological detail – in this case not explaining how we would do the analysis. The Field of Research point is really important – this is how these applications get directed to reviewers and bigger panels. We badged it as education but the readers didn’t see themselves or their field in what we proposed. I speak about getting the ‘wrong’ reviewer in the video about funding rejections.

 

And now some really lovely journal rejections

I’ve picked these to illustrate different reasons for getting rejected from academic journals – these connect with the reasons I talk about in a video focused on exactly this!

[Screenshot of an editor’s desk-rejection letter]

Ouch! This definitely wasn’t ‘a good paper went to the wrong journal’. The editors hated it and couldn’t even bring themselves to waste reviewers’ time by asking them to look at it. There was no invitation to come back with something better. Just a ‘get lost’. In the end the paper was published somewhere else.

[Screenshot of an editor’s rejection comments]

This fell foul of the half-baked problem. The editor thought I was halfway through. My bad for leaving him with this impression. The paper was published somewhere else without any further data collection or analysis, but with a much stronger argument about what the contribution was.

[Screenshot of reviewer comments on a rejected paper]

This was living proof for me that writing with ‘big names’ doesn’t protect you from crappy reviews. The profs I was writing with really were at the leading edge of theory in the area, and so we really did think it added something new. This paper was rejected from two journals before we finally got it published.

[Screenshot of reviewer comments on a rejected paper]

This is one of my favourites! This reviewer really hated my paper by the time she finished reading it. The problem was I got off on the wrong foot by writing as if the UK was the same as the whole world. My bad. Really not okay. But then things got worse because she didn’t see herself and her buddies in my lit review. All the studies she mentioned were missing were ones I’d read. They weren’t relevant, but now I learn to doff my cap to the biggies in the field anyway. How dare a reviewer question my ethics this way (the students involved asked to keep doing the logs as they found them so useful). How dare a reviewer tell me what theory I need to use? And what possible relevance do the names of her grand-daughter’s classmates have to my paper and its worthiness for publication?! Finally, on the issue of split infinitives, I checked (there was precedent in the journal). When this was published (eventually as a book chapter) I made sure there were plenty still in there. A classic case of annoying the reviewer who started with a valid point, then tried to ghost-write my paper the way she wanted it, and ended up flinging all sorts of mud at me.

 

[Screenshot: list of journals that have rejected my papers]

The only thing I can say with certainty about this list is: it will get longer! I’ve published in quite a few of these too – showing a rejection doesn’t mean the end of the road for you and a particular journal.

More to follow (inevitably!)

 


Video on activist and change methodologies

I have been working with Ilaria Vanni on a new Designing Research Lab. We are approaching research methods teaching by conceptualising research approaches in terms of: deep dive, place-based, textual, and activist/change methodologies.

I made a short video summarising some key points. There are many different ways of doing activist or change-based research, and I don’t try to cover them all. After explaining some key ideas, the video focuses on action research, change laboratories, and practice-based approaches. These were chosen to illustrate different ways of going about research that can deliver change and create new possibilities for change.

You can download the PowerPoint used in the video here. Many of the images and speech bubbles link to other useful resources or original sources.

Some of the highlights in terms of framing ideas include these quotations:

“There is no necessary contradiction between active political commitment to resolving a problem, and rigorous scholarly research on that problem.” (Hale 2001)

“Activist research is about using or doing research so that it changes material conditions for people or places. It is different than cultural critique, where texts are written with political conviction, but no concrete changes are made on the ground.” (Hale 2001)

“We are all participating in, and contributing to, the making of history and of our common future, bearing responsibility for the events unfolding today and, therefore, for what is to come tomorrow. The social structures and practices exist before we enter them… yet it is our action (or inaction), including our work of understanding and knowing, that helps maintain them in their status quo or, alternatively, to transform and transcend them” (Stetsenko 2017)

Let me know in the comments below – Are you involved in activist research? What other approaches are you using to deliver concrete change in the world?

The great wall of rejection

The saga of my #rejectionwall continues! The tweet that spawned 250,000 impressions and 1,000 retweets is still making new things happen.

The latest installment is here – from UTS’ U: Magazine.

It was preceded by my shadow CV and followed up with a post on another blog (It’s all pretty funny), a post as part of the ‘How I Fail’ series, a re-post on an Indian website and two different posts on the UTS Futures Blog (one is a video interview, and the other a written reflection), a piece in Failure Magazine and then the interview and video for U: Magazine – produced by UTS, where I work.

The write-up brings together a few bits of the rejection story so far, and weaves them into the long and protracted story of rejections and failures in my career. There are some new thoughts in there too.

What I really liked about it was that they got me and other academics to read out our rejections on camera. A bit like the mean tweets thing where celebrities read out insulting tweets about them, I found it really helped to step back, laugh, but also ‘own’ the rejection in a productive way. Kind of what the rejection wall did for me, but more so.

I love the idea of a YouTube channel just full of academics reading out their rejections, commenting on how unreasonable (or badly written, or unethical etc.) they are.

Anyone interested in making this a thing?!

How do I know I’m coding well in qualitative analysis?

Coding. Yay. Eek. Ugh.

Let’s face it, coding is a biggie. You don’t get far in the qualitative data analysis literature without seeing some mention of it. To be clear, this post does not assume coding is necessary in all qualitative data analysis. Nor does it assume coding amounts to qualitative analysis in the sense that coding is all you need to do. There is always an analytical residue – more interpretive work post-coding; in fact coding is often only a small part of qualitative analysis. Lots of analyses I’ve done haven’t used coding at all.

But coding can be incredibly valuable to us as qualitative data analysts. The problem is, it’s really easy to be busy coding but not to be doing so well. In this post I’m trying to spell out what it might mean to code well, and how you might know if you’re doing so.

 

Why code in the first place?

If you’re coding without knowing why, and without having made a deliberate choice to do so (rather than feeling you have to), it’s not a good start. Coding potentially serves lots of purposes, including but not limited to:

  1. Enabling you to retrieve chunks of data, or particular phrases, quotations etc, later when you need raw data in your writing, or if you want to check ideas that come up.
  2. Helping you ‘be with’ the data in a particular way, getting you up-close to the text.
    1. Maybe you might notice things in it that you haven’t seen before
    2. Maybe you might notice things important to the participants (but not originally to you)
  3. Lifting your ‘being with’ the data up a level to notice distinctions and associations (ie similarities and differences) at a high level of resolution
  4. Lifting your ‘being with’ the data up a level to notice where concepts or theoretical ideas might be manifest in concrete instances in your data
  5. Helping you develop codes, categories, or themes, that can become building blocks for subsequent analysis; You might compare or contrast these within or across cases, for example, or employ frequency counts.

 

What does coding well look like?

Coding is a slippery slope. I slid a long way down it several times, landing with a bump when I realised I’d been busy but uselessly so for several weeks. I forgot to keep these things in mind:

  1. Good coding relates to how hard you’re thinking (and helps you think harder). If you’re finding coding easy, or you’re not constantly having to make difficult decisions about what to name codes, how to code pieces of data, how big those pieces should be etc, chances are you’re not coding well.
  2. Good coding means you are seeing new things in the data and these new insights are progressive. Progression might mean enriching your argument (answer to research questions), or sharpening it (fixing in on what is essential, for example). These ‘new things’ could be new codes or categories or themes, but they could also be patterns, distinctions, associations, forms of significance, why things matter etc.
  3. Good coding settles towards a parsimonious set of codes/categories/themes. The best coding system is not the one with the most codes in it. An analyst who has created 10,000 categories has not done work that is 1,000 times better than the analyst who has created 10 categories. Chances are, the latter has been thinking much harder than the former as she goes. By parsimonious I mean strikes the optimal balance between power of explanation (persuasive, novel argument and insights) and complexity (number of ideas or building blocks in the argument). We can expect diminishing returns: adding five more codes or categories to a system that already has 50 probably doesn’t add as much value as adding five to a system that only has two or three.
  4. Good coding opens up as much as it closes off: coding rarely, if ever, provides the answers to your questions. Rather it creates building blocks or thinking tools (and retrieval systems) that allow you to get closer to those answers. So good coding might open up by:
    1. Making new connections between parts of the data possible
    2. Making new distinctions between parts of the data possible
    3. Leading you to frame new questions that might specify how you will arrive at the answer to your big research questions
    4. Giving you units of data, concepts, ideas (and their inter-relationships) that you put to work in the next analytical stage.

But good coding isn’t purely expansive and generative. It also has to have boundaries and bring focus. So good coding might close off by:

  1. Helping you decide what data or concepts or categories to focus on, and which to set aside
  2. Consolidating what used to seem disparate or unconnected into coherent units that you can work with in whatever follows.

And this leads to my final point: good coding is a process that enables you to take further steps in analysis that wouldn’t have been possible without having done the coding. The codes are not the outcome (unless your research findings are going to be simply a matter of describing themes that come up in interviews, for example, which sounds terribly dull). If you can do the next step without more coding, perhaps it’s time to move on. If you can do it without the coding at all, why are you coding?

I would love to hear your experiences of coding – why do you code? When and how do you choose not to code, or to stop and move on to other analytical processes? How do you know you are coding well? Have you had experiences (like I have) when you’ve spent ages coding only to realise it hasn’t got you where you wanted to be?

Enhancing 1:1 research interviews: the secret power of the third thing

The one to one interview is a widely used means of generating data in qualitative research. It is a chance for a researcher to spend time exploring a participant’s experiences, practices, perceptions, stories, in detail.

I’d like you to imagine what this might look like. Perhaps you’ve done some interviewing yourself. Perhaps you are planning to do so. What will the set-up be? Here are some images I got from Google that capture the sorts of practices I’m referring to.

The point here is the interview is constructed and conducted as a dialogue – a to and fro between two people. Generally one person (the researcher) is asking questions about the other (the participant). It is a dyadic interaction.

My argument is that interviews are better conceived and done as triadic interactions: between the researcher, the participant, and something else.

[Image: a typical one-to-one interview – researcher and participant face to face]

Becomes

[Diagram: triadic interview – researcher, participant, and a third thing (X) forming a triangle]

Before I go further I need to lay out two assumptions:

  1. Interviews give direct evidence of what someone says in response to questions they are asked in particular circumstances. Nothing else. They are not a magical process that gives direct evidence of what people think or feel. Who is asking them, what they are being asked, where and when this is happening all contribute to the way in which responses (ie data) are constructed within the context of situated social interaction.
  2. Good interviews help people construct useful answers. Useful has lots of dimensions – an element of ‘truth’ or at least genuineness is important, but also detail, relevance, clarity etc. We are not in the business of discovering what is in people’s heads here (at least not in the way I’m thinking about research interviews).

So the X on the diagram above is there to show that data come from interaction between you, the interviewee and a third thing.

This third thing, when used in particular ways, has amazing magical powers.

What am I talking about? What is this third thing?

The third thing can be an object or idea (ie concrete or abstract), but to have these magical powers it has to change the structure and function of the interaction.

A list of questions that the researcher has in her hands does not count. A digital audio recorder does not count. A cup of tea for each of you does not count. These things are useful but they do nothing in themselves to shift from a dyadic to triadic way of conducting the interview.

The objects or ideas that work as magical third things do so by changing the scenario from one in which the researcher asks the interviewee about herself to one in which the researcher asks the interviewee about the third thing.

If we wanted to know what a teacher thinks about, say, teaching in schools, we could ask “What do you think about ensuring accessibility for all learners?”. The question is aimed at the person, directly.

Instead, we could show a photograph of a classroom, or a video, or have one of the teacher’s lesson plans or some examples of resources she has used in her lessons. Then we could ask the interviewee to comment on those things. The interviewee can look at them, perhaps even pick them up.

We have changed from a question that follows a path directly from the researcher to the interviewee and back, to one that goes via a third thing.

Other examples could be diaries, concept maps, small cards with different words or pictures on them, computers or tablet devices, other relevant documents, artefacts people have produced or used – the list is pretty much endless. The magic is not in the thing itself, but in how it is used.

As I mentioned, some of these third things might not be concrete, tangible objects. They could be ideas that are invoked through the way the question is put together. Let me explain with an example.

I’ve been interviewing a lot of parents recently, people who have been experiencing significant difficulties, often for reasons beyond their control. When I was piloting, I found that the question “What do you think your strengths are as a parent?” didn’t produce very good answers. Duh. Why would it? These were vulnerable people who were often self-judging as failing.

Instead I started asking questions like these: “If I asked your partner what he thinks your strengths are as a parent, what would he say?” or “If I asked someone who knows what you’ve been through and knows you really well, what would she say has enabled you to get through it all?”. These questions still take an indirect route (ie via the top of the triangle on the diagram above), but this time the third thing is invoked in an imaginary way.

Why is this indirect, triadic way so valuable?

  1. It helps participants pause before answering. Other things (like cups of tea, biscuits etc) do too, but the point is at least when the third thing is a physical object, people can comfortably entertain silence while they think for a moment. I’ve found even the intangible versions work similarly too: if you ask someone about themselves, there’s this expectation they should know the answer. If you ask them about someone else, it seems more permissible to take some time to think. And generally in interviews, silence is golden! (the other important silence is the one you, the researcher, leave after the answer, but that’s probably for another blog post!).
  2. It is less confronting. Not X asking about Y, but X asks Y about Z. A very useful shift, particularly when we are talking about sensitive issues.
  3. It helps construct responses that are closely tied to concrete examples, giving rich empirical detail not generalised, vague abstractions.
  4. It is based on Vygotsky’s theory of learning (the principle of mediation of activity through tools and signs). There are shelves of books on this. I’m not going to explain it here, other than to note that the idea does have a sound theoretical basis.
  5. It helps balance pre-determined structure and emergence in the interview: the third things can shape (suggest direction, boundaries) but not fix the way the interview goes.

I’m not claiming these ideas are particularly new. People use them all the time. But I haven’t read much about them in the textbooks, at least not explicitly framed this way.

Finally, here are some brief comments from Sally Irvine-Smith, a wonderful doctoral student here at UTS. She has been working with interviews on these principles and kindly offered to share some of her experiences. Thanks Sally!

I am adopting a practice approach which focuses very strongly on what people ‘do’. My study is about decision makers in the local sphere. One group of participants were involved in a community panel to decide how to spend a certain amount of money for a local council. They were given a large folder of documentation and I asked them to bring that along to the interview where we jointly examined it. Interestingly, although they all (but one) had quite an affection for the folder as a memento of their time on the panel, it was not particularly important to them as a source of information. The remainder of my participants were elected members or council officers. I did something a little different in their case: I asked them to think of a decision they were making and I treated that decision as the object in the interview. I encouraged the participants to examine their decision as if it had material form, to discuss its genesis and outcome, to describe who helped to shape it, and how it transformed and developed over time. I haven’t written up any of my results yet, but my data analysis indicates that the results from this technique are rich and provide an authentic picture of what my participants actually do when getting information for their decisions.

 

How might we respond to rejection in academia?

It seems the #rejectionwall is the gift that keeps on giving! The original tweet has made over 225,000 impressions, and there have been a few visitors to my office to ‘admire’ the collection.

I was contacted by @EvanGomes_ who wanted to do an interview – about the academic rejection wall but also about rejection and failure more generally. His write-up is available here.  I was also interviewed over email by Veronika Cheplygina for her ‘How I Fail‘ series, which prompted me to think further about some aspects of rejection, failure and their connections.

There was part of the interview with Evan that didn’t make the final cut. It was about how I think we cope with rejection and failure. Evan’s post does include a bit where I talked about how we might “encourage people to be confident in seeking help, rather than having to rely on yourself to dig deeper”.

Here is the point I want to make in this post: when someone is faced with a difficult circumstance, I don’t think it is necessarily fair, or good enough, to say the response should be ‘toughen up’, or ‘grow a thick skin’. Yes, there is an aspect of personal quality in how we feel (or allow ourselves to feel?) wounded and hurt when we are rejected. But I don’t think dismissing these feelings or trying to delegitimise them is the way to go. From yoga I’ve learned about acknowledging responses and then coming back to a different position from which options of what next look different. Being thin-skinned doesn’t make academic life and its inevitable rejections easier. But there’s a difference between denying the feelings of pain, hurt, shame etc, and acknowledging them but working on not letting them take overwhelming control of us.

This is where my hero comes in. Vygotsky tells us that as human beings we have a really special capacity: to control ourselves (our behaviour, our minds) from the outside in. What this means is that instead of having to somehow tap into mysterious inner reserves (grittiness, thick skins, whatever), we have a huge, diverse set of culturally available tools (both physical objects and those we use symbolically, including language) to help us do this work. A Vygotskian response to rejection could be understood as looking to this external environment as a means to shape our reaction.

That could be sticking the rejection up somewhere public (like an office door), so that you enrol others in responding to it with you, confirming that there is no shame, even if you feel it. It could be getting out a black marker pen and deleting the nasty, stinging comments from reviewers that are neither helpful nor justified. It could be finding a friend to go for a walk or drink with – someone who knows even more about rejection and failure than you. It could be some embodied practices like running, swimming, long deep breaths.

The point is, instead of the person in the difficult situation being asked to dig deeper (and therefore constructed as both the victim and the villain if they fail to do so successfully), that person looks outside themselves for assistance in determining their response.

To be clear, I’m not saying we have to share our rejections publicly. Many won’t want to do so; many will feel vulnerable doing so (despite the experience being universally shared among academics). That’s fine. I am saying it is a tall order to expect of ourselves, or others, to cope and develop the best possible response based on these mysterious, intangible and poorly defined inner qualities alone. So much better to draw on the cultural legacy of human history and start to take control from the outside in. For this, I thank Vygotsky for his most amazing insights.

My wall of rejection and why it matters

See the tweet and all the comments about it here!

If I had a magic wand and could change something about academia, I would make it commonplace for people to share their rejections – on blogs, by emailing colleagues, by running to their office neighbours, print-out in hand, saying “You won’t believe how awful the review I got this morning was! Come and laugh at it with me over coffee!”. I’d love for our workplace walls to be covered with juicy rejections.

#rejectionwall #failurewall #rejectionisnormal #noshame

I recently updated my shadow CV and this got me thinking about rejections. The topic came up last week when I was sat with three very highly respected female professors. The four of us shared our battle scars together, almost competitively one-upping each other: “you think that rejection was bad, mine was worse!”. It turned out three of us had actually been rejected after having papers accepted (as an editor, I didn’t even know ‘unaccept’ was a button you could press in the system!).

The conference that brought us all together had a strong theme relating to materialities, and I started thinking about the materialities of rejection: or, in fact, how hidden away academic rejections are from public view, how often they remain in the private digital aether. I made the decision there and then to tear down all the copies of publications that were currently festooned on my office door. After all, while I felt good coming past them each day, it probably didn’t have the same effect on my colleagues. I realised, perhaps a *little* late, that my successes are public enough. People have no problem accessing the ‘Nick is awesome’ version of my career; it is even foisted on them at times without them having to look. What was clearly needed was a visceral, material reminder, an exposé of my many journal rejections, failed research grant applications, and missed job opportunities. Yes, these are in my shadow CV, but that itself is shadowy: only accessed if you know to look for it.

There are some wonderful examples of people sharing rejections (see this fantastic blog by the always awesome Pat Thomson, for example). But I still worry this stuff is too hidden from view.

Ta dah! Here is my new wall of rejection – there for all my colleagues and visiting students to see. I intend to keep it there, and keep adding to it as the rejections flow in.

 

Why am I doing this?

One responder to a tweet in which I shared a similar picture, @drlizziewho, asked: Why do you do this? Good question.

I was simply amazed by the response to the tweet. Over 90,000 impressions, and nearly 700 retweets in the first 15 hours (unprecedented in my contributions to the tweetosphere). People commenting seemed to be from two groups:

  1. Students and early career researchers, who took solace in realising rejection affects us all, is normal, and is nothing to be ashamed of; the value here was that rejection doesn’t mean you’re not good enough, but this message isn’t communicated very often
  2. More experienced researchers who wonderfully acknowledged their own rejections. I’d like to quote a few of them here and thank them for joining the fun

@Liam_Wagner: I think I will plagiarise your idea and cover my office wall with my own list.

@StephenBHeard: You might like my job-rejection list (an awesome blog that develops the theme of the shadow cv)

@SimmsMelanie: I love telling people that I got rejected from my own journal – more than once

@RoseGWhite: I’m sure I could cover a whole corridor like this!

@JRobinHighley: I would start my own display, but not sure I have a wall big enough

@TrevorABranch: My wall is not big enough

@naynerz: If that were my office, not enough wall space for my rejections LOL

@mathewjowens: I’m tempted to do this for grant rejections. Though I fear for the deforestation effect

@SJC_fishy: I would need a much larger wall

@RobHarcourt: So would I!


And so it goes on… I simply love how the tweet has prompted those of us who have enjoyed some successes to relish sharing our less fortunate moments.


@SiouxsieW asked: Does it not depress you seeing that every day? My answer is no! Not at all. It helps protect me from being wounded when the rejections come (and they are definitely coming!) by keeping me real and helping me realise that, despite all the rejections in the past, I'm still doing okay – indeed I'm doing better every year (though this doesn't mean the rejection rate goes down). I also admit it gives me a buzz to think that the wall, or pictures of it, might be helping others in some small way.


Exposing our rejections is not just important but, in my view, necessary, for these reasons:

  1. If we don’t do so, we collude in producing a half-truth about academic life and careers: it’s like hiding all the out-takes.
  2. It’s not just about fun and laughing with (not at) others. The point is that research, careers, and publications are not smooth; their journeys into the light of success are bumpy, full of dead ends and disasters. We have to come clean that this is part of knowledge production.
  3. Research suggests that rejections don’t affect everyone the same way. It’s easy enough for me, with a full-time, ongoing job, to brush off a rejection and keep going. It’s not the same for people whose positions are less secure, or whose immediate futures rely on that grant or article getting through.
  4. The professors I was talking to commented that there might be a gender dimension in how we respond to and are affected by rejection. Not that all women respond one way and all men another, but that the historical publicity around male success, and the continued disproportionate representation of men in leadership positions, might mean that rejections ‘bite’ women in particular ways.
  5. There is a pedagogy here – not only normalising rejection, but also potentially modelling ways to deal with it. I’m no masochist. I don’t find rejection fun. I fear rejection. Of course I do. Everything I’ve had rejected has mattered to me, reflected hours of work and emotional input. But I don’t let fear of rejection stop me from trying in the first place. And I don’t let the experience of rejection prevent me from keeping going.

So, here is a really serious call for help:

If you’ve had a rejection, or a whole pile of them, please share with us! Maybe publish your shadow CV, take a picture of your own #rejectionwall – or do something else creative! Or write and tell me what you and your colleagues are doing to normalise rejection and build pedagogies of how to deal with it.


STOP PRESS! I’ve been (joyously) overwhelmed by the response to the tweet and blog. Here are some links to things I’ve received from people 🙂

Peggy Blair’s blog in which she shows off her rejections from the publishing industry

An article about another Shadow CV – this time from a really big prof!

A heads-up about a forthcoming paper in the Professional Geographer journal about failure in academia (Thom Davies et al) – I will update with more info when it comes out!

And the first person to share their #rejectionwall – thanks a billion @AlexaDelbosc

AlexaDelbosc rejection wall