Useful Idea in Study and Research


Tranquil (Full Member, 10+ Year Member). Joined: Jul 27, 2010. Messages: 201. Reaction score: 0.
In research it is obviously important to get the right people on your team, so I thought I would start this thread off by thinking about how you look at, and present, yourself in a CV and interview. Too often, sites on how to write a CV are a bit mechanical and don't explain the foundations of a good CV. Bill Hybels, in his book "Courageous Leadership", marks out three elements he always looks for when employing someone. In order of importance they are: character, competence and chemistry.

Character - committed to: honesty, teachability, humility, reliability, persistence, punctuality, kindness, work ethic and a willingness to be entreated (told off or advised). Remember, lapses or failings in character are very hard to fix and almost impossible to do in an active team setting.

Competence - look for high skills in the area you want; go after proven competence. If someone is unhappy in their work, or unemployed, ask yourself why.

Chemistry - a relational fit with other members of the team, someone who has a positive emotional effect on the team - put simply it helps if you like each other in a team.​

You have to take a good look at yourself when considering these qualities. You will know if you are honest, you will know if you have been lazy and uncaring, you will know if you have cheated or been unkind. Hybels said: "It's a terribly lonely feeling to have no one to blame and no one to look to to rescue you. It's rotten to realise that to find the bad guy, you just have to look in the mirror. The truth is that the only person who can sort you out is you. Anything else is a self-leadership fumble, an illusion."

In your CV you are trying to persuade someone of your worth, and foremost in that is demonstrating your character. The value of character has always been recognised, from the modern world, where Billy Graham said "When wealth is lost, nothing is lost; when health is lost, something is lost; when character is lost, all is lost", back to ancient times, with Aristotle saying some 2,500 years ago that "Character may almost be called the most effective means of persuasion."

Now of course on a CV you cannot simply write "I am honest and reliable..."; you have to demonstrate it and produce evidence for it. The way to do that is to say something about how you reacted or behaved in real circumstances. Suppose you worked in Burger King or a local church; you might say something like:

Worked in Burger King for 18 months. During this time I learned the essential value of teamwork, how important punctuality is, and the need to help each other during busy periods....

I worked as an intern in my local church for a year and this taught me the value of meeting, talking to and caring about people. I also learned how to operate the sound and vision system, went on to manage wedding and funeral services, and out of that learned how to help and be sympathetic to the needs of others....

In this way you tell people what you have learned and how your character has developed, as well as giving an indication of your skills, and of course you have also said something about chemistry. Very likely the interview panel will follow this up, and again it's an opportunity for you to show who you are and what you are really like. Now don't just copy what I have written: if you have any real character you will think through every job you have undertaken (no matter how small) and ask what you learned from it.

 
Thanks for the good idea about study and research. Keep posting...
 
This is the first of a couple of posts on the notion of common sense. For those who want to go further I recommend the excellent book by Duncan Watts called "Everything is Obvious" (it's also available as an eBook). It is a relevant idea in research because far too often researchers elevate common sense, making, as Rousseau said, "their own tiny brains the measure of all things". In short, reaching for 'it's only common sense' might get you up the proverbial gum tree, and in medical practice it can put a lot of lives at risk.

Common Sense
Roughly speaking, it is the loosely organized set of facts, observations, experiences, insights, and pieces of received wisdom that each of us accumulates over a lifetime, in the course of encountering, dealing with, and learning from, everyday situations. Beyond that, however, it tends to resist easy classification. There are perhaps two defining features of common sense that seem to differentiate it from other kinds of human knowledge, like that found in science or mathematics.

Practical - The first of these features is that unlike formal systems of knowledge, which are fundamentally theoretical, common sense is overwhelmingly practical, meaning that it is more concerned with providing answers to questions than in worrying about how it came by the answers. From the perspective of common sense, it is good enough to know that something is true, or that it is the way of things. One does not need to know why in order to benefit from the knowledge, and arguably one is better off not worrying about it too much.

Deals with things in its own terms - The second feature that differentiates common sense from formal knowledge is that while the power of formal systems resides in their ability to organise their specific findings into logical categories described by general principles, the power of common sense lies in its ability to deal with every concrete situation on its own terms. For example, it is a matter of common sense that what we wear or do or say in front of our boss will be different from how we behave in front of our friends, our parents, our parents' friends, or our friends’ parents. But whereas a formal system of knowledge would try to derive the appropriate behavior in all these situations from a single, more general “law", common sense just “knows” what is appropriate.​

In my next post I will elaborate on this and show that the notion of common sense can become unstable, or even dangerous, when we move it out of our day-to-day world, for which it is ideally suited, and try to use it to solve each and every problem.
 
Some Further Observations on Common Sense
The fact that what is self-evident to one person can seem totally silly to another should make us pause about the reliability of common sense as a basis for understanding the world. How can we be confident that what we believe is right when someone else feels equally strongly that it's wrong - especially when we can't really articulate why we think we're right in the first place without relying on dogma (it's right because I say it is) or authority of some kind? Of course, we can always write them off as crazy or ignorant or stupid and therefore not worth paying attention to. But once we go down that road and start to think for ourselves, it gets increasingly hard to account for why we ourselves believe what we do.

So if something that seemed so obvious turned out to be wrong, what else that we believe to be self-evident now will seem wrong to us in the future? Once we start to examine our own beliefs, in fact, it becomes increasingly unclear even how the various beliefs we espouse at any given time link together. Common sense, in other words, is not so much a world view but a rag bag of logically inconsistent, often contradictory beliefs, each of which seems right at the time but carries no guarantee of being right any other time.

How does common sense take us so far in the fragmented, inconsistent, and even self-contradictory world of our everyday lives? The reason is that everyday life is effectively broken up into small problems, grounded in very specific contexts, that we can solve more or less independently of one another. Under these circumstances, being able to connect our thought processes in a logical manner isn't really the point.

To take a simple example, the electron can be regarded as a minuscule billiard ball for some calculations and as a wave, a ripple, for others. Common sense would say that is totally absurd: how can one thing be two different things at the same time? In short, we must be wary of extending common sense beyond the everyday world where it reigns supreme, and of beginning to think that our tiny minds, particularly our common sense, are the measure of all things.
 
Keep these coming. Excellent advice.
 
Before we get to our next note on common sense it is necessary to discuss the notion of certainty.

Be cautious with Certainty and Assurance.
One can often get into a position of certainty over one's own knowledge; a kind of assurance that you have finally got there. This sounds wonderful, and indeed it is nice to feel that you know and can do something well. The trouble is that it can shut your mind down: reflecting from the standpoint of certitude allows no new meaning, no deeper understanding and no surprises to emerge. Indeed, if you are certain you will implicitly tell yourself that reflection is pointless because there is nothing new for you to learn.

It follows that certitude may simply reinforce the way things are, emotionally and intellectually. In that mindset reflection becomes stale and unappealing; we rely on current experience and perspectives and so frustrate movement toward insight, or to put it more bluntly, when something new comes along we "don't want to know". To counter this you need always to be on the lookout for new ideas and insights. They will not always be obvious, and you do not have to swallow them wholesale, but you do have to chew them over honestly, seeing them as precious gifts aimed just at you. In this way you grow continually in knowledge, intelligence and emotional awareness, and share what you have with others in your learning community.

Doubt
It is tempting to avoid the idea of doubt because it can have negative connotations. But it is a way of thinking to be cherished, because doubt, when you are not sure, drives you on to seek information and struggle until that doubt is removed - that is creative doubt. Doubt therefore is what brings you eventually to the truth, the answer. One might usefully recall what Dostoevsky said in The Brothers Karamazov: "Without criticism there'd be nothing but Hosannas. But man cannot live by Hosannas alone; those Hosannas have to be tempered in the crucible of doubt."

I will have more to say on this, but often we say to ourselves 'it's common sense', and that is invariably dangerous because we tend not to look at the matter closely and just assume that our common sense will not let us down - a kind of intellectual arrogance that implicitly says 'I cannot be wrong but others are'. So beware. Finally, human beings have an inbuilt sense of logic, but in practice that logic does not always work the way we classically define logic. You might see this easily if you remember the last time you bought something on impulse and 'bent' the logic to justify it. Indeed, it is thought that human logic - your mind, you could say - works more like quantum mathematics, where things act in what seem to be weird and inexplicable ways. So again be aware that YOUR logic, your mind, may in fact be just that: 'YOURS'.

If anyone would like to explore this latter point, there is an article in New Scientist for September 3rd, 2011 (Volume 211, No 2828, page 34) called "Your Quantum Mind".
 
Social learning
It is sometimes better to learn from others than to muddle along on your own. No one quite knows how it works, but in essence we copy behaviours from what we see around us, or perhaps from what we read. However, if we only copy there is a catch, because we need innovation to help us cope with change - one cannot copy everything blindly, because the information may be wrong, outdated or unavailable. Possible models for social learning might be:

Conformist Transmission Model - where we copy what is common, not what is rare; to put it more simply, we follow the crowd.

Copy an expert - this is, or can be, an excellent strategy, because one hopes that by doing this you can feel confident you are learning best practice or the most relevant and current knowledge.

Copy the most successful - here you might look around and follow those who appear successful, on the grounds that whatever they do might well be good for you too. There are obvious dangers here: the fact that the latest 'celeb' endorses diet X implies nothing about their actual practices or knowledge.
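To make the first of these models concrete, here is a toy simulation of conformist transmission (my own illustrative sketch, not from any of the books mentioned): each agent samples a few others and copies whichever behaviour is in the majority. The effect is that the already-common behaviour gets amplified until the rarer one dies out.

```python
import random

def conformist_step(population, sample_size=5, rng=random):
    """One generation: each agent samples a few others and copies the
    majority behaviour in that sample ("follow the crowd")."""
    new_population = []
    for _ in population:
        sample = rng.sample(population, sample_size)
        # Adopt whichever behaviour is most common in the sample.
        majority = max(set(sample), key=sample.count)
        new_population.append(majority)
    return new_population

rng = random.Random(42)
pop = ["A"] * 60 + ["B"] * 40   # A starts as the more common behaviour
for _ in range(20):
    pop = conformist_step(pop, rng=rng)

# Conformist copying amplifies the majority: behaviour B tends to die out,
# illustrating why pure conformism suppresses rare (possibly better) ideas.
print(pop.count("A"), pop.count("B"))
```

Note the design consequence: because agents copy the majority rather than the best option, a conformist group can lock in whatever happened to be common at the start, which is exactly why the post stresses that copying alone is not enough and innovation is still needed.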
Social learning is in some ways underpinned by an implicit trust in others. However, this has its own difficulty, perhaps best illustrated by the famous Prisoner's Dilemma, which shows what happens when members of a group trust each other: they can choose a course of action that brings the best possible outcome for the group as a whole. But without trust, each individual may well aim for his or her best personal outcome - which can lead to the worst possible outcome for all.

In the Prisoner's Dilemma, two participants are prisoners who have been jointly charged with a crime (which they did commit) but are questioned separately. The police only have enough evidence to be sure of a conviction for a minor offence, but not enough for the more serious crime. The prisoners have made a pact that if they are caught they will not confess or turn witness on each other. If both prisoners hold true to their word they will only be convicted of the lesser offence. But the dilemma arises when the police offer each prisoner a reduced prison term if they confess to the serious offence and give evidence against the other prisoner. This sounds like a good deal: confess and you get the minimum possible term in jail, although your partner will get the maximum. But then you realise that if both you and your partner confess, both of you will be given long terms in prison - worse for each of you than if you had both kept quiet. So the dilemma is whether you trust your partner to keep quiet - and if you do, should you 'stitch them up' to get out of jail quicker?
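The structure of the dilemma above can be written down as a payoff matrix. This is a minimal sketch with illustrative jail terms of my own choosing (the post gives no numbers); lower is better for each prisoner:

```python
# Prisoner's Dilemma payoffs: (my years in jail, partner's years in jail).
# Illustrative numbers - lower is better.
SILENT, CONFESS = "silent", "confess"

payoffs = {
    (SILENT, SILENT): (2, 2),    # both keep the pact: minor offence only
    (SILENT, CONFESS): (10, 1),  # I stay quiet, partner turns witness
    (CONFESS, SILENT): (1, 10),  # I turn witness, partner stays quiet
    (CONFESS, CONFESS): (8, 8),  # both confess: both get long terms
}

def best_reply(partner_choice):
    """My jail-minimising choice, given what my partner does."""
    return min((SILENT, CONFESS),
               key=lambda my_choice: payoffs[(my_choice, partner_choice)][0])

# Confessing is my best reply whatever my partner does...
assert best_reply(SILENT) == CONFESS
assert best_reply(CONFESS) == CONFESS
# ...yet mutual confession (8, 8) is worse for both than mutual silence (2, 2).
```

This is the sense in which self-interested reasoning without trust leads each individual to confess, producing a group outcome worse than the one mutual trust would have delivered.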

It is easy to see from the above how in group learning you may sadly find individuals who sponge on the group, lurking in the background scooping up what others have done but adding little or nothing themselves. With the above dilemma in mind, we may state broadly three ways to learn:

Innovate by individual learning – this means you have to put in the hard graft and in so doing produce something new; not necessarily intrinsically new but you have uncovered the knowledge or worked hard to acquire the skill by your own dedicated and persistent effort - this perhaps is the most rewarding way to learn and has the most lasting benefits.

Observation - acquire new learning socially, implying there is a sense of trust from and toward you, and a sharing in some sense of the burden involved. It is worth saying here that this form of learning can easily become total exploitation, where you take but give nothing.

Formal Teaching – one must not forget the role of formal teaching where there is an intensive effort to pass on skills and knowledge in a defined setting.​

In all learning you must take time out to rest and reflect and just let the learning 'sink in'. Research suggests this might be up to a fifth of the available time, so space out your learning and think about the pay-offs and trade-offs. One final point is that in social learning there is a kind of parasitic dimension, because eventually you run out of things to copy and then someone has to do the hard graft of gaining new skills or knowledge that can then be copied. It follows that in the long term social learning only pays if there are some innovators around.

Finally, when you copy, other individuals have probably filtered the material for you, so you have to weigh up the relative costs and benefits of sticking with a behaviour you already have against inventing or copying a new one. As humans, of course, we are aware of how quickly information gets outdated or a skill is lost or no longer needed, but we can look to the future, talk about what might happen, and consider consequences.
 
Errors in Common Sense - There are three types of error (see Post 3 for Book ref)

1. Our mental model of individual behaviour is systematically flawed - When we think about why people do what they do, we invariably focus on factors like incentives, motivations, and beliefs, of which we are consciously aware. But this view of human behaviour is only the tip of the proverbial iceberg. For example, it may not occur to us that music playing in the background may influence our choice, or that something as simple as the font on a book's cover may dispose us to buy it.

Often, therefore, we don't factor in these apparently trivial or seemingly irrelevant influences, yet they do matter. The trouble is that it is probably impossible to anticipate everything that might be relevant to a given situation. The result is that we make mistakes, and we are likely to make even more serious mistakes when predicting how other people might behave anywhere outside of the immediate here and now.

2. Our mental model of collective behaviour is defective - the basic problem is that whenever people get together for whatever reason from dinner parties to sharing rumours and generally influencing one another’s perspectives about what is good and bad, cheap and expensive, right and wrong, these influences pile up in unexpected ways, generating collective behavior that is “emergent” in the sense that it cannot be understood solely in terms of its component parts.

Faced with such complexity, our explanations of collective behaviour paper over most of what is actually happening. Whenever something interesting, dramatic, or terrible happens - Hush Puppies become popular again, a book by an unknown author becomes an international best seller, the housing bubble bursts, or terrorists crash planes into the World Trade Center - we instinctively look for explanations.

3. In common sense reasoning we use less from history than we think we do - the misperception of the past skews our perceptions of, and predictions about, the future.

Key Thought
Moreover, because we only try to explain events that strike us as sufficiently interesting, our explanations account only for a tiny fraction even of the things that do happen. The result is that what appear to us to be causal explanations are in fact just stories - descriptions of what happened that tell us little, if anything, about the mechanisms at work. Nevertheless, because these stories have the form of causal explanations, we treat them as if they have predictive power. In this way, we deceive ourselves into believing that we can make predictions that are impossible, even in principle. They create an illusion of understanding where we have papered over events with a plausible-sounding story. Common sense is wonderful at making sense of the world but not necessarily at understanding it.

The cost, however, is that we think we have understood things that in fact we have simply papered over with a plausible-sounding story. And because this illusion of understanding in turn undercuts our motivation to treat social problems the way we treat problems in medicine, engineering, and science, the unfortunate result is that common sense actually inhibits our understanding of the world.

The main point, though, is that just as an unquestioning belief in the correspondence between natural events and godly affairs had to give way in order for “real” explanations to be developed, so too, real explanations of the social world will require us to examine what it is about our common sense that misleads us into thinking that we know more than we do.

Implications
People digest new information in ways that tend to reinforce what they already think. In part, we do this by noticing information that confirms our existing beliefs more readily than information that does not. In part, we do it by subjecting disconfirming information to greater scrutiny and skepticism than confirming information, or by tacitly dismissing it as obviously wrong. Together, these two closely related tendencies - known as confirmation bias and motivated reasoning respectively - greatly impede our ability to resolve disputes, from petty disagreements over domestic duties to long-running political conflicts.
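The mechanism described above can be made concrete with a toy model (my own illustration, not from Watts): two readers see exactly the same mixed evidence for a claim, but the biased reader scales down the items that contradict their prior belief.

```python
# Toy model of confirmation bias: the same balanced evidence, weighed
# by an unbiased reader and by a reader who discounts disconfirming items.
evidence = [+1, -1, +1, -1, +1, -1, +1, -1]  # perfectly balanced for/against

def weigh(evidence, discount_disconfirming=1.0):
    """Sum the evidence, scaling down items that contradict the claim."""
    total = 0.0
    for item in evidence:
        weight = 1.0 if item > 0 else discount_disconfirming
        total += weight * item
    return total

unbiased = weigh(evidence)                            # 0.0: no net support
biased = weigh(evidence, discount_disconfirming=0.2)  # 3.2: looks supportive

# Balanced evidence leaves the unbiased reader unmoved, while the biased
# reader comes away more convinced of the claim than before.
print(unbiased, biased)
```

The point of the sketch is that the biased reader never has to reject any evidence outright; merely applying extra "scrutiny" to the disconfirming half is enough to turn neutral evidence into apparent support.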

Even in science, confirmation bias and motivated reasoning play pernicious roles. Scientists are supposed to follow the evidence, even if it contradicts their own preexisting beliefs; and yet, more often than they should, they question the evidence instead. The result, as the physicist Max Planck famously acknowledged, is often that "A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die."
 
Shared Knowledge and Experiences
Learning is not private; there is a whole community that you can access, a community that values learning and where you can give and take. There is a vast amount of information literally at your fingertips and it's all more or less free, a kind of gift to you from Socrates, Newton, Hawking, Darwin, Churchill, Feynman, Turing, von Neumann, the Beatles, Mozart, Bobby Moore - the list is endless and awe-inspiring when we think of such giants gifting their thinking to you! Sharing is a vital aspect of learning. One might recall what the Dalai Lama said: "share your knowledge; it's a way to achieve immortality".

In some communities certain books or authors are regarded as off limits, but this I think is unwise: if you cannot listen to the arguments then you cannot understand them, refute them or even accept them. If you limit your learning in this way you may well let serious error in. In life, when we come across something new we can rationally do one of several things:

1. Modify/update in some way what we already know in the light of new information.
2. We can accept it as completely new information and just add it into our store of knowledge.
3. We can accept the new, and that may mean we have to throw out completely something we previously thought was sound.
4. Lastly, we can examine the new and reject it as unsound.
Of these, item 3 is probably the most emotionally difficult to come to terms with, because we may well have invested a lot of effort in learning something - but it must be done if we are to move forward. In technology this is a more or less everyday occurrence, although it happens in all branches of knowledge whenever a new bit of information 'forces' you to see a new interpretation or shows that your current understanding is faulty.

Although this is hard, there is nothing worse than showing your ignorance. For example, I was once at a conference where a speaker was presenting a paper on what is called SSM, and it was going quite well until one section, when almost everyone in the room knew he was way out of date - don't let that happen to you. Sadly, the medical profession is replete with examples where practitioners have become locked into a particular mindset and virtually refuse to move on.

You should discuss what you find with almost anyone who will listen, but obviously it is far better to discuss it with someone who actually knows what he or she is talking about - and always be wary of people who tell you what you should or should not study when they themselves are not experts.

If you examine the latest research on social groupings of various kinds, you will learn that we are all influenced by those around us and, more surprisingly, by those around them. It's as if a virus is let loose: good humour, kindness and a shared attitude can spread through a group just as much as grumpiness, hate, selfishness and unkindness. So be mindful of your social grouping and inject good things into it, and it is more or less certain you will be rewarded. Go where the knowledge is - it need not be in your current community - but above all don't feel nervous about this; instead feel excitement and wonder. Finally, two quotes, the first a warning and the second a shaft of light:

Francis Bacon - "The human understanding is not composed of dry light, but is influenced by the will and the emotions, a fact that creates fanciful knowledge; man prefers to believe what he wants to be true."

Jacob Bronowski - "Knowledge is an unending adventure at the edge of uncertainty."
 
Thank you for this post. I am a Chinese medical student, and from this post I learned new perspectives; it is very useful to me. Please keep posting....
 
Situation Factors and Relevance
It seems clear that what is relevant about a situation is just those features that it shares with other comparable situations - for example, how much something costs is relevant to a purchase decision because cost is something that mostly matters whenever people buy something.

However, determining which features of a situation are relevant requires us to associate it with some set of comparable situations; the trouble is that determining which situations are comparable depends on knowing which features are relevant. This inherent circularity poses what is called the frame problem (think of it as gathering all the relevant information inside a frame). The frame problem was first really noticed in artificial intelligence, when researchers tried to program robots to solve supposedly simple everyday tasks like cleaning a messy room - since humans do it every day, how hard could it be? Very hard indeed, as it turned out, because on closer inspection there are literally thousands of interacting factors, varying from day to day and room to room, which humans take in at a glance.

Having said that, when confronting a particular situation, our brains do not generate a long list of questions about all the possible details that might be relevant. Rather, we simply plumb automatically and usually unconsciously our extensive database of memories, conditions, symptoms, images, experiences, cultural norms, and imagined outcomes, and seamlessly insert whatever details are necessary in order to complete the picture.

In everyday situations this might be fine, but outside that frame we might get it completely and utterly wrong, because in this process we may insert details that are not true for the particular situation. For example, students asked about the colour of a classroom blackboard recalled it as being green (the usual colour) even though the board in question was blue. In cases like this, a careful person ought to respond that he can't answer the question accurately without being given more information. But because the "filling in" process happens instantaneously and effortlessly, we are typically unaware that it is even taking place; it doesn't occur to us that anything is missing. The frame problem should warn us that when we do this we are bound to make mistakes - and we do, all the time.

In fact, we make the same mistake again and again. Sadly, no matter how many times we fail to predict behaviour or events correctly, we can always explain away our mistakes in terms of things that we didn't know at the time, sweeping the frame problem under the carpet and convincing ourselves that this time we are going to get it right, without ever learning what it is we are doing wrong. Indeed, the more ingrained our belief system, the more disposed we are to be uncritical and to ignore or explain away everything that questions it. It is this difference between making sense of behaviour and predicting it that is responsible for many of the failures of common sense reasoning. And if this difference poses difficulties for dealing with individual behaviour, the problem only becomes more pronounced when dealing with the behaviour of groups or nations.

For those who want to go further I recommend the excellent book by Duncan Watts called "Everything is Obvious" (it's also available as an eBook).
 
You may find this interesting on the theme of this thread; it is copied from New Scientist, 15th May 2010. I'd be interested to hear what you think - or perhaps you have examples? The point of this list is that we all need to be aware of these essentially dishonest tactics, not get taken in by them, and instead vigorously oppose them. Indeed, if you look through this board or others you may spot some of these tactics being used; they will not be easy to see unless you are on your guard and are prepared to find out the truth. Be aware that we all tend to lean towards things we want to be true and might sweep our doubts under the carpet, but that is a destructive and disreputable route, so don't take it.

Martin McKee, an epidemiologist at the London School of Hygiene and Tropical Medicine who also studies denial, has identified six tactics that all denialist movements use. "I'm not suggesting there is a manual somewhere, but one can see these elements, to varying degrees, in many settings," he says (European Journal of Public Health, vol 19, p 2).

1. Allege that there's a conspiracy. Claim that scientific consensus has arisen through collusion rather than the accumulation of evidence.

2. Use fake experts to support your story. "Denial always starts with a cadre of pseudo-experts with some credentials that create a façade of credibility," says Seth Kalichman of the University of Connecticut.

3. Cherry-pick the evidence: trumpet whatever appears to support your case and ignore or rubbish the rest. Carry on trotting out supportive evidence even after it has been discredited.

4. Create impossible standards for your opponents. Claim that the existing evidence is not good enough and demand more. If your opponent comes up with evidence you have demanded, move the goalposts.

5. Use logical fallacies. Hitler opposed smoking, so anti-smoking measures are Nazi. Deliberately misrepresent the scientific consensus and then knock down your straw man.

6. Manufacture doubt. Falsely portray scientists as so divided that basing policy on their advice would be premature. Insist "both sides" must be heard and cry censorship when "dissenting" arguments or experts are rejected.
 
To have an open mind means being willing to consider or receive new and different ideas. It means being flexible and adaptive to new experiences and ideas. People who are open-minded are willing to change their views when presented with new facts and evidence. Those who are not, and who resist change, will find life less rewarding and satisfying, not to mention dull. If we limit ourselves to what we know and are comfortable with from the past, we will become more and more frustrated.

If we choose to approach life in the same way day after day, as well as becoming bored and uninspired, we will reduce our intellectual aptitude. If, on the other hand, we seek new ways of doing and looking at things, we will expand our intellectual capability, find life more exciting, and broaden our experiences.

Most people agree that open-mindedness is one of the fundamental aims of education, always elusive but eminently worth pursuing. It is the childlike attitude of wonder and interest in new ideas coupled with a determination to have one's beliefs properly grounded.

Bertrand Russell regarded open-mindedness as the virtue that prevents habit and desire from making us unable or unwilling to entertain the idea that earlier beliefs may have to be revised or abandoned; its main value lies in challenging the fanaticism that comes from a conviction that our views are absolutely certain, that WE are right.

'You are obstinate, he is pigheaded; I, needless to say, merely hold firm opinions.' This is Russell's memorable way of making the point that it is enormously difficult to recognise one's own tendencies towards closed-mindedness. We see ourselves as eminently reasonable, and our views as open to discussion, even though it may be perfectly clear to others that we are only going through the motions of giving a serious hearing to a rival view so that we can then pick holes in it.
 
A tip for those at College or University, particularly if part of your course means doing a presentation, project or dissertation, concerns asking questions and how important that activity is for the opening of one's mind and intellectual development. Some people are afraid of doubt, particularly people of a religious persuasion, preferring absolute certainty; but God has not made us like that. Doubt can be creative because it forces us to ask questions and seek the truth. Just think what the world would be like if no one had persisted, overcome their doubts and doggedly pursued them through questions: no penicillin, no electricity, no Internet and so on.

I preface all these remarks with the best advice I ever got as a student, from Prof Jacob Bronowski, who said "it is important that students bring a certain ragamuffin, barefoot irreverence to their studies; they are not here to hero worship what is known but to question it". This advice is essentially about the joy of finding out; it is not about being ignorant, insulting your teachers or just arguing for argument's sake.

Gregory Neal - I love questions. I love asking them and I love being asked ... if nothing else, they keep the brain juices flowing! No one should ever be afraid of questions; God certainly isn't afraid of questions, and neither should we be afraid of them. If the beginning of wisdom is fear of the Lord – and it is – then the next step on the path to wisdom is being willing and able to say: "I don’t know." Only God is omniscient, and until we are willing to admit that we don’t know something, we’ll never be able to learn anything new. Think about that!

I love being asked questions that stump me. Why? Because those kinds of questions make me dig, study, and learn something I didn't know before. Even if I don’t find a satisfactory answer, it’s not a loss; I’ve opened my mind and learned something new. And that’s the essence of wisdom: being open to learning something new. The instant people close their minds and refuse to learn something new – or, worse, say there is nothing new to learn or think they already know it all – is the instant they begin to die.

Lloyd Alexander - we learn more by looking for the answer to a question and not finding it than we do from learning the answer itself.

Charles Steinmetz - there are no foolish questions, and no one becomes a fool until they stop asking questions.

Albert Einstein - the important thing is not to stop questioning. Curiosity has its own reason for existing. One cannot help but be in awe when one contemplates the mysteries of eternity, of life, of the marvellous structures of reality.

One final point: asking questions also means seeking the answers yourself, from books, from experimentation, from research and from asking other people; but don't start by asking other people – begin by being persistent yourself. Often when you ask other people they will give you an answer, and some will become very annoyed if you do not accept it or want to explore it more deeply. Don't be put off by this, but at the same time don't deliberately upset people who are trying to help you. It is VITAL to your intellectual development to only ASK other people when you have exhausted your own resources. If you ask as soon as you get stuck, you will NEVER learn the persistence and struggle needed to own something you have found for yourself.

One final warning: be careful about using the Internet (but don't stop using it), as all sorts of people with all sorts of motives post all sorts of things, and usually there is no way for you to check that the sources are reliable. There are also a large number of discussion boards like this one, and often you will see answers to what are complex questions, but again in general you cannot be certain of their reliability - which means of course you have to question and consider what I have said here, not blindly accept it.
 
To give some context, think of yourself as writing a literature review and so you need to be aware of many things and that awareness should make you careful how you use other peoples work and how you construct and include thoughts of your own. With this in mind consider the following ideas.

Common Knowledge - If something is described as “common knowledge” it implies that many, if not most, people know it. Such information does not belong to any one person and it cannot normally be deduced; rather it has to be learned. It is probably talked about in several sources: the world is round; computers contain processors and memory; nothing goes faster than light – all examples of “common knowledge”. If it is common knowledge, you do not need to cite a source. Be careful, because some authors will write down in their own work things that are common knowledge. Quoting them in that instance amounts to saying that a bit of what is common knowledge actually belongs to that author, which of course is an absurdity.

It is possible to have common knowledge within a particular domain, such as might be found in medical practice, and the same ideas apply, though one needs a little more care than for what one might call "everyday" common knowledge.

Obvious Knowledge - If something is described as “obvious” it implies that most people know it. Such information does not belong to any one person, but it can be discovered rationally or empirically. It is probably talked about in several sources: companies tend to grow; when the sun goes down it gets dark – both examples of “obvious knowledge”. If it is obvious you do not need to cite a source. Be careful, because again some authors will write down the obvious in their own work. Quoting them in that instance amounts to saying that what is obvious to everyone actually belongs to that author, which of course is an absurdity.

It is possible to have obvious knowledge within a particular domain, such as might be found in medical practice, and the same ideas apply, though one needs a little more care than for what one might call "everyday" obvious knowledge.

Published Knowledge - Published knowledge refers to ideas and information found in a specific primary source which are not common or obvious knowledge but are nevertheless useful; in these cases you must always cite the source. By primary sources we usually mean journals, research reports, government papers, etc.; books are not classed as primary sources.

Original Knowledge - In any work you may freely include original ideas of your own. However, be aware that if a reader uncovers an idea that is not cited and is neither common knowledge nor obvious, then they are entitled to believe that it is a new idea from you. If in fact this is not the case, then you will have plagiarised it, which is a serious academic offence. It follows that if you are introducing an idea of your own you should make it clear, by the way it is presented, that this is indeed your own work.
Identifying Plagiarism
Scholarship is about showing your understanding and criticism of ideas. It should be obvious that simply copying, paraphrasing or summarising, although it can show a limited understanding, fails to show any ability to criticise. You must "add value", that is, make your own contribution to knowledge, and you can usually only do this by expressing published ideas in some way for yourself, mingling them with your own thoughts and ideas and, perhaps primarily, subjecting them to serious and sceptical inspection.

Plagiarism is stealing ideas; even if you express the idea in your own words (which is often good scholarship), you may still be guilty of plagiarism if you do not credit the source. Anyone can copy and paste a phrase, sentence or paragraph and cite its source, and technically this is not plagiarism, but it is often very poor scholarship, since such an activity tells us nothing about the learning, if any, which occurred. For example, and to exaggerate a little, suppose one copied in its entirety an essay from another student, gave it full attribution and then submitted it as one's own work to the tutor – it is obvious this is not acceptable, and it would be regarded as plagiarism because the perpetrator has done nothing that is unquestionably theirs. The same would apply if you copied a quotation into your work but then failed to introduce it, make any comments about it or show its validity in a given argument. Set against all that is a principle of usage of other people's work, and usage is the key to avoiding plagiarism – as Goethe wrote many years ago:

"What is there is mine, and whether I got it from books or life is of no consequence, the only point is, whether I have made a right use of it."​

If plagiarism occurs in your work, it will not matter if you say it was accidental, or that you were just careless, or that you did not know how to quote and cite correctly. It will still be regarded as a serious offence. It is hard to say exactly how to avoid plagiarism, but in general the best way is to read widely, especially scholarly books; you will then gradually learn how it is done, and in many ways that is the best teacher. Therefore, plagiarism is about two related things:

Deliberate - attempting to make readers believe that what is presented is the student’s own work; cheating.
Ethics - trying to pass off another author’s work as if it was your own is both cheating and highly disrespectful and implies that you as a student do not know right from wrong.
Plagiarism can be hard to prove even when it appears obvious. Most Universities and Journal editors have software that will take a copy of any written work you supply, search for its sources and then report what it finds, giving percentages and exact locations as well as highlighting any copying found; it is then up to the tutor to decide whether the usual scholarly conventions have been abused.

Because plagiarism is regarded as so serious, many if not most Universities define a simple test which then amounts to proof of wrongdoing. The test is usually something as simple as saying that if 10 or more words are copied or paraphrased without attribution, then plagiarism is proved; the relevant work is automatically referred to an investigative panel, and serious sanctions or punishments may well be the outcome.
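To see what a crude version of that "10 or more copied words" test looks like, here is a minimal sketch in Python. This is purely illustrative and not how any real detection tool (Turnitin and the like) actually works; the function names and the naive whitespace tokenisation are my own assumptions, and a real system would also handle punctuation, paraphrase and enormous source databases.

```python
# Hypothetical sketch of the "10 or more copied words" rule described
# above. It flags any run of n consecutive words that appears verbatim
# in both the submission and a source text.

def word_runs(text, n=10):
    """Yield every run of n consecutive lower-cased words in text."""
    words = text.lower().split()
    for i in range(len(words) - n + 1):
        yield tuple(words[i:i + n])

def flag_copying(submission, source, n=10):
    """Return the n-word runs that appear verbatim in both texts."""
    source_runs = set(word_runs(source, n))
    return [run for run in word_runs(submission, n) if run in source_runs]
```

Under the rule quoted above, any non-empty result would trigger referral; note that even this sketch shows why paraphrase without attribution is the harder case, since word-for-word matching cannot catch it.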
 
You may find this extract from something in New Scientist useful. It's not specifically medical, but it has huge significance for managing anything.

Explaining the curse of work (see http://www.newscientist.com/article/mg20126901.300)

Here is another perspective on leadership and working relationships; it's about the famous Parkinson's Law. It is 1944, and there is a war on. In a joint army and air force headquarters somewhere in England, Major Parkinson must oil the administrative wheels of the fight against Nazi Germany. The stream of vital paperwork from on high is more like a flood, perpetually threatening to engulf him. Then disaster strikes. The chief of the base, the air vice-marshal, goes on leave. His deputy, an army colonel, falls sick. The colonel's deputy, an air force wing commander, is called away on urgent business. Major Parkinson is left to soldier on alone. At that point, an odd thing happens - nothing at all. The paper flood ceases; the war goes on regardless. As Major Parkinson later mused: "There had never been anything to do. We'd just been making work for each other."

That feeling might be familiar to many working in large organisations, where decisions can seem to be bounced between layers of management in a whirl of consultation, circulation, deliberation and delegation. It led Major Parkinson - in civilian dress, C. Northcote Parkinson, naval historian, theorist of bureaucracy and humourist - to a seminal insight. This is "Parkinson's law", first published in an article of 1955, which states: work expands to fill the time available for its completion. Is there anything more to that "law" than just a cynical slogan? Physicists Peter Klimek, Rudolf Hanel and Stefan Thurner of the Medical University of Vienna think so. They have recreated mathematically the kind of bureaucratic dynamics that Parkinson described anecdotally 50 years ago. Their findings put Parkinson's observations on a scientific footing, but also make productive reading for anyone in charge of organising... well, anything.

Parkinson based his ideas not just on his war experience, but also his historical research. Between 1914 and 1928, he noted, the number of administrators in the British Admiralty increased by almost 80 per cent, while the number of sailors they had to administer fell by a third, and the number of ships by two-thirds. Parkinson suggested a reason: in any hierarchical management structure, people in positions of authority need subordinates, and those extra bodies have to be occupied - regardless of how much there actually is to do.

Parkinson was also interested in other aspects of management dynamics, in particular the workings of committees. How many members can a committee have and still be effective? Parkinson's own guess was based on the 700-year history of England's highest council of state - in its modern incarnation, the UK cabinet. Five times in succession between 1257 and 1955, this cabinet grew from small beginnings to a membership of just over 20. Each time it reached that point, it was replaced by a new, smaller body, which began growing again. This was no coincidence, Parkinson argued: beyond about 20 members, groups become structurally unable to come to consensus. A look around the globe today indicates that the highest executive bodies of most countries have between 13 and 20 members. "Cabinets are commonly constituted with memberships close to Parkinson's limit," says Thurner, "but not above it." And that is not all, says Klimek: the size of the executive is also inversely correlated to measures of life expectancy, adult literacy, economic purchasing power and political stability. "The more members there are, the more likely a country is to be less stable politically, and less developed," he says.

Why should this be? To find out, the researchers constructed a simple network model of a committee. They grouped the nodes of the network - the committee members - in tightly knit clusters, with a few further links between clusters tying the overall network together, reflecting the clumping tendencies of like-minded people known to exist in human interactions. To start off, each person in the network had one of two opposing opinions, represented as 0 or 1. At each time step in the model, each member would adopt the opinion held by the majority of their immediate neighbours. Such a process can have two outcomes: either the network will reach a consensus, with 0s or 1s throughout, or it will get stuck in an entrenched disagreement between two factions. A striking transition between these two possibilities emerged as the number of participants grew. Groups with fewer than 20 members tend to reach agreement, whereas those larger than 20 generally splinter into subgroups that agree within themselves but become frozen in permanent disagreement with each other.
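The dynamics just described can be sketched in miniature. This is an illustrative reconstruction, not the authors' actual model (their network construction and update details differ); the cluster layout, the number of bridging links and the tie-breaking rule below are my own assumptions.

```python
# Illustrative sketch of majority-rule opinion dynamics on a clustered
# network: tightly knit clusters, a few bridging links, synchronous
# majority updates, ending in either consensus or frozen disagreement.
import random

def make_clustered_graph(n_clusters, cluster_size, inter_links=2, rng=random):
    """Adjacency sets: complete clusters plus a few random bridges."""
    n = n_clusters * cluster_size
    adj = {i: set() for i in range(n)}
    for c in range(n_clusters):
        members = range(c * cluster_size, (c + 1) * cluster_size)
        for i in members:
            for j in members:
                if i != j:
                    adj[i].add(j)
    for _ in range(inter_links * n_clusters):   # sparse bridges between clusters
        a, b = rng.sample(range(n), 2)
        adj[a].add(b)
        adj[b].add(a)
    return adj

def run_majority(adj, steps=100, rng=random):
    """Synchronous majority updates; True if the network reaches consensus."""
    state = {i: rng.choice((0, 1)) for i in adj}
    for _ in range(steps):
        new = {}
        for i, nbrs in adj.items():
            ones = sum(state[j] for j in nbrs)
            if 2 * ones > len(nbrs):
                new[i] = 1
            elif 2 * ones < len(nbrs):
                new[i] = 0
            else:
                new[i] = state[i]               # tie: keep current opinion
        if new == state:                        # fixed point reached
            break
        state = new
    return len(set(state.values())) == 1
```

Run with small and large committees and you can watch the qualitative effect the article reports: small groups tend to converge, while larger, more clustered ones often freeze into factions that each agree internally but never with each other.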

One curious detail in the computer simulations was that there is a particular number of decision-makers that stands out from the trend as being truly, spectacularly bad, tending with alarmingly high probability to lead to deadlock: eight.

Where this effect comes from is unclear. But once again, Parkinson had anticipated it, noting in 1955 that no nation had a cabinet of eight members. Intriguingly, the same is true today, and other committees charged with making momentous decisions tend to fall either side of the bedevilled number: the Bank of England's monetary policy committee, for example, has nine; the US National Security Council has six. So perhaps we all subliminally know the kind of things that Parkinson highlighted and the computer simulations have confirmed. As Parkinson noted, we ignore them at our peril. Charles I was the only British monarch who favoured a council of state of eight members. His decision-making was so notoriously bad that he lost his head.

Bibliography
Parkinson's Law, or The Pursuit of Progress by C. Northcote Parkinson (Pub. Murray, 1958)
Parkinson's Law by Leo Gough, (Pub Infinite Ideas Ltd) ISBN 978-1-906821-34-0 (this is a modern day evaluation and guide)
 
Emotional Intelligence
This is an area of interest at present because it offers a way for individuals to become more aware of themselves and others and hence, in some sense, become more competent at their job. It is not entirely useful to try to define EI, but Eaton and Johnson suggested it might be summarised as “the ability to inform our decisions with an understanding of our own and others’ emotions so that we can take productive action”. Obviously, then, EI might be very useful for medical practitioners in dealing with patients and other staff.

EI literature talks about various emotional competences, but we must not take this too far or we end up as some sort of robotic stoic: someone who has no emotions at all, cannot feel another’s pain or sympathise with another’s predicament, feel love or hate, joy or misery; who wants to be around people like that? The essential point is that emotionally we are all different, and that is a strength; the fact is we all have emotional strengths and defects of one sort or another, and we cannot always (or do not always want to) get rid of them, but we can be AWARE of them, in some sense manage them, and recognise them in others. For what it’s worth, the literature usually defines five competences in this area, but they are all essentially premised on the idea of a deep sense of self-awareness. These competences can be regarded as variables in human behaviour.

Self-Regulation - the management and control of one’s impulses and resources, guarding against impulsive actions and delaying instant gratification in order to remain focused.

Self-Awareness - consciousness of, and sensitivity to, our own emotional states and intuitions, leading to recognition of their limitations and, paradoxically, the maximising of strengths.

Motivation – loosely, emotional tendencies that facilitate the achievement of goals; you might think of it as a way of focusing internal energies and impulses on a mission to achieve excellence through any presented opportunity, coupled with a considered inclination to exploit such opportunities.

Empathy – strictly, this is to feel another’s pain by attuning our emotions to those of others, so as to derive knowledge and understanding of how and why other people feel, act and react the way they do in given situations, particularly when significant stress is involved, such as might occur in many doctor/patient situations.

Social Skills – these enable the individual to “read” the intentions and actions of others and so adjust to or influence the operational ethos of a group, fitting into the mood, atmosphere and trust of the other team members.
 
1. Work on the bits you can understand and move outward to the other, more difficult bits

2. If you can work part of it out you can work it out everywhere

3. Try to find every possible flaw, get rid of the ego fear related to making mistakes. Learn from mistakes and critiques because they are like signposts to where to go next.

4. If you get off track, stop and think through how best to get back on track - take the initiative. Make a plan and carry it out to a fixed timetable - forget about how you feel, stick to the plan

5. Relish the things you struggle with, these are the things that once mastered move you onward and upward. Struggle means progress - you get smarter. Look for harder problems or make simpler ones harder. Never be willing to give up.

6. If you don't struggle to deal with problems but almost immediately ask someone you will never get beyond where you are now, the very act of struggling strengthens your intellectual powers.

7. Look for key ideas and express them in your own way then look for links between them.

8. Stop and think; trouble often comes because we do not stop and think things through.

9. Master the basics first.

10. Take breaks regularly and when you are stuck. The fact is your brain uses much more energy when you are just daydreaming than when you are problem-focused. It's as if the brain during daydreaming is sorting itself out. This may be the reason why, after taking a break, one often gets a breakthrough.
 
The Deficit Model of Communication
This model of communication assumes that mistrust of unwelcome scientific (or any other) findings stems from a lack of knowledge. Ergo, if you provide more facts, scepticism should melt away. This approach appeals to people who are trained to treat evidence as the ultimate arbiter of truth. The problem is that in many cases it does not work. Perversely, just giving people more information can sometimes simply polarise opinion and cause sceptics to harden their line. Better communication strategies are needed.

Be aware that everyone filters and interprets knowledge through their own cultural perspectives, and these perspectives can be more powerful than the facts. Cultural perspectives also explain why some people prefer anecdotal to hard evidence.

Education can strengthen cultural bias because it encourages mirroring of the views of one's own cultural group, now tinged with arrogance over what one knows - it's a kind of triangulation, so you end up with the 'correct' view or opinion. It follows that to communicate we must identify in some way with a target audience.

Frame a message in relation to a particular cultural bias; that is, go beyond the facts so that people see a context. For example, we might frame climate change data in terms of economics, the environment, public health benefits and so on. Look carefully at how the facts are presented - text, charts, graphs, etc. - against the audience.
 
An interesting article appeared this month in Scientific American (June 2012, Volume 306, Number 6, pages 56-69) called "The right way to get it wrong". Here is a summary.

Georges Clemenceau once said "A man's life is interesting primarily when he has failed - I well know. For it's a sign that he tried to surpass himself."

It is easily understood that science places a premium on being correct. Of course, most scientists - like most human beings - make plenty of mistakes along the way. But not all errors are equal. History tells us that there have been many instances when an incorrect idea proved far more potent than thousands of others that were trivially mistaken or narrowly correct.

Failures, if you let them, are like signposts that tell you where to go next. Unfortunately, we are creatures of ego who don't like to own up to failure, and so we don't follow the signpost; in so doing we may well lose significantly.

Niels Bohr, for example, created a model of the atom that was wrong in nearly every way, yet his work inspired the quantum-mechanical revolution. In the face of enormous scepticism and some abuse, Alfred Wegener argued that centrifugal forces made the continents drift along the surface of the Earth; he had the right phenomenon, albeit with the wrong mechanism. Enrico Fermi thought he had created nuclei heavier than uranium, when in fact (as we now know) he had stumbled on nuclear fission.

Another interesting case was that of Nick Herbert, who in 1981 designed a faster-than-light communications system, even though according to Einstein's theory of relativity such a device could not exist. What is interesting here is that at first no one could find anything wrong with it: several reviewers thought the system watertight, yet they knew it must be wrong and simply could not figure out why. In time, and after close study, it was realised that Herbert's error was to assume that elementary particles can be copied in a particular way, an assumption that proved to be false. Nevertheless, Herbert's work was exploited by physicists, because of the insight it brought, to make crucial advances in quantum information science.

A biological example is that of Delbrück, who did experiments on viruses and found that infection by one virus strain prevented infection by another. This was not expected, and the one-to-one correlation between viruses and genes that Delbrück had envisioned unravelled. He then devised what turned out to be a landmark study using what he called a 'fluctuation test'; this opened up the study of bacterial genetics, and he would go on to win a Nobel prize.

So don't get all depressed when a failure occurs but work it through, see what went wrong or what mistakes you made; there may just be a really important insight waiting for you. While striving to be correct, let us pause to admire the great art of being productively wrong.
 
Access the Learning Community
We are surrounded by a learning community: fellow students, teachers, family - and we may extend this to authors via books and other resources. So access it. Share what you have, and others will share and help you. The learning community is not for those who always want to take or copy the hard work of others; it is for genuine sharing and support. Learning can and should be a shared experience, and often in a group we find that people have slightly different perspectives or ideas. This can be a means of opening up a subject area, and your mind, to deep learning, because in essence the problem-solving power of the group is greater than that of a student working alone.

It goes almost without saying that sharing in a group is a great social experience, and as you share you grow and you also share in one another’s success. Julia Chrysostomides, who died in 2008, was an outstanding scholar in Byzantine history, and she (taught by no less a luminary than Iris Murdoch) called this learning community a “brotherhood of scholars” and so emphasised the family-like nature attached to fellow learners; like any family, it’s for life. If you really engage with learning you will not only grow intellectually and open up your mind but make friends for life in the process. Good fellowship of this nature is like a virus: it spreads rapidly through the group and beyond it.

So find people to talk to seriously about your subject but don't neglect to talk to those in other areas because often ideas can transfer. It really is amazing how vibrant a research community can be but it does take effort and time - everything has a cost.
 
Knowledge – what is it?
This is a difficult topic, and philosophers have wrestled with the idea for many centuries, but loosely knowledge comes about in two ways: by perception (when we see) and by reasoning (when we think). Plato suggested that in order for something to count as knowledge at least three criteria must apply: a statement (or action) must be justified, must be true, and must be believed, although we also require that the statement was not arrived at through a defect, flaw, or failure (Blackburn), in which case of course it would not count as knowledge. However, there is general agreement that the following ideas are useful.

Tacit – this is knowledge that more or less equates to experience, or in simple terms knowledge that you cannot write down in order to pass it on. It cannot be learned directly from a book; it must be practised and developed by use. For example, it might be possible to read a description of how to ride a bike, but you will never have real knowledge of that until you master it yourself by riding a bike.

Implicit – this is rather like tacit knowledge; it is something like instinct, intuition or vibes: we just somehow know something, and we did not explicitly learn it or practise it.

Explicit – this is knowledge that is (in principle) easily shared, as we have a common language medium by which to disseminate it. That is, you can pick up a book, for example, and gain knowledge about the Anglo-Saxons in England. Notice that one is not learning a skill here, as there is no practising of the knowledge as such.​

It is also possible to come to more or less the same conclusions by thinking about the notion of knowing. One can, for example, get a book or several books and learn all there is to know about cystitis. But there is a second kind of knowing that comes only through action - in this case, seeing hundreds of patients and treating them. To put it slightly more humorously, I don't think anyone would be happy visiting the surgery if the doctor told them they had read a book on their condition but had no practical experience of diagnosing and treating it.
 
Betrayal of Your Teachers
I wonder if you find this heading puzzling. One might also ask what you think the sign of a great teacher is supposed to be. Some years ago Peter Rollins, in his book ‘The Orthodox Heretic’, lucidly explained what great teachers want more than anything for their students; his answer is both startling and profound. All great teachers want to nurture their students so that they may surpass them, but for this to happen an uncomfortable separation must take place at some point between student and teacher. An authentic teacher is one who at some point asks students to prove their devotion by finding their own way: to stop following the teacher, to move beyond the lessons and wisdom learned, and finally to take responsibility for their own future path.

This is a strange paradox, for it is only by following the teacher that one will take any notice of the injunction not to follow. Yet these words, fully grasped, set the student free - not to forget the learning but to apply it in ever new and innovative ways. This is not a betrayal in the sense of rejection, nor is it the blind fidelity that seeks to live by the letter of the law laid down by the teacher. Rather, this enthusiastic move beyond the teacher, in response to the teacher, can be described as a faithful betrayal.

The great teacher is one who says, "You will do greater things than I". All teachers stay only for a season, so that their words act as a bridge to learning, not a blockage to it, and so that their iconic presence does not morph into an idolatrous one. Strangely, then, a total and complete fidelity to a teacher, an unthinking devotion to his teaching, will always end up being nothing but a betrayal of all he worked for.
 
Some research heresies!

I expect you are all familiar with the Scientific Method, which is powerful and well tried and has yielded notable successes down the years. But if one is not thoughtful it can lead to a kind of mental rut. The most obvious situation where this tramline thinking is your undoing is when you get stuck: you hit a brick wall and don't know what to do next. So here are some ideas to stop you thinking in a tramline manner and to help you see other possibilities. Indeed, one of the most powerful ways of solving a problem is to cultivate a habit of looking for new or other viewpoints.

For example, suppose the problem is how to get as many cars into the car park as possible, but you want a new way of looking at this problem, perhaps because you are stuck or you just feel the solution is not right or not the best. One easy way is to invert the question: instead of asking how to get more cars in, ask how to get fewer in. This might sound silly, but because it forces you into another viewpoint it just might generate a new idea. There are a few ways of changing your viewpoint, which can be done individually or in groups, rather like brainstorming though with a little more focus.

WARNING 1 - the following are unusual ideas, but that does not mean they are a licence to do anything you like. Just because you get what sounds like a good idea does not give you the authority to go and try it out on the nearest patient!

WARNING 2 - when you try the things below, have a pencil and paper handy to make notes as you go along - short notes, because you don't want to hamper the flow of ideas. If you don't do this you will lose 75% of what is generated.

You will almost certainly find in a group setting that there will be those who vigorously object to this unstructured approach. In many ways these objections are useful because they generate discussion. So if there are objectors, start by asking what exactly their objection is - if their answers are of the sort 'this will never work', 'it's silly', 'it's unscientific', 'it's dangerous' and so on, then they are saying essentially nothing. As my old teacher used to say to our class when anyone didn't want to work: 'here's the telephone, now ring your mother and tell her that you want to go home because there is nothing for you to learn'.

Invert the Question - this often brings a lot of new ideas because it's such a dramatic way of dealing with a problem.

Background - suppose you are an eye doctor: look at the problem as if you were a knee specialist, or a computer engineer, or your grandmother, or anyone. The whole idea is to nudge you out of your comfort zone and start a new train of thought. In fact, difficulties in research are often associated with people being too specialised: thinking they are a real expert, they simply end up recycling what they already know, and that is unlikely to get you a solution to most problems.

Gut feeling - when you get stuck, think about the blockage and then look around the room or lab or wherever you are and pick something, anything, that somehow attracts you - a TV set, a coat someone has on, a painting on the wall, the shape of someone's nose, anything. Once you have your image - let's say it is a yellow shirt - say to yourself, "Why is problem X like this yellow shirt?" Believe me, it is often amazing what comes out of such an exercise.

You can do the above in a group: ask everyone to think about the problem and link it with an object, real or imagined. Tell them to write it down, and then let everyone explain why they think the particular image they have cooked up is like the problem.

Tom Sawyer - this rather odd title comes from the book of the same name. If you remember the story, Tom had to paint the fence around Aunt Polly's house and managed to find novel ways of doing it, not only at no cost to himself but at a profit. So if your problem is about how to do something, think of new ways of doing it and don't dismiss anything, no matter how absurd it sounds. Just as an exercise, think about all the ways one could paint a fence. If you think there is only one way, then you are definitely stuck in your ways and, unless you open up, will stay there.

So try these and try any other way of changing your viewpoint, it won't always work but it will move you forward. Just as an aside, the whole world of comedy works on changing your viewpoint. Now some more general thoughts.

Churchill - was always on the lookout for what he called 'corkscrew' thinkers, those who could think the unthinkable, the silly, the impossible, the unusual. His reasoning was that Hitler thought down tramlines, and if you thought the same way he would be hard to defeat. As we now know, the British intelligence service outwitted the German war machine with ideas that were often outrageously bizarre but ended up working.

Andre Geim - won the Nobel Prize in Physics for 2010 (with Konstantin Novoselov) at the University of Manchester for groundbreaking experiments on the two-dimensional material graphene. Geim is also famous for the way he works and thinks - two particular elements are worth mentioning.

Magazine Approach - he advocates moving a little outside one's discipline and cultivating a general interest in, and awareness of, what others do. For example, one can read every week a magazine like New Scientist, because there are articles on all sorts of things, and every now and again an article outside your subject area will give you an idea. You can also make sure you interact with those in other areas, see what they do and talk to them about it - this is really useful, so don't stick too tightly to your own colleagues.

Saturday night experiments - this really means that if you think up some idea, you allow yourself to test it no matter how weird or bizarre it seems, but without committing huge and expensive resources to it. The philosophy here is that you allow your research team members some flexibility and do not try to control or manage their every thought and action. The term comes about because equipment and lab space are often unused at some point during the week, giving you space to try things out. Of course you can always construct thought experiments too, and it's good to do that to keep things turning over.
 
I thought it might be helpful to have a general note on statistics, so here is an overview prepared as a summary of a BBC TV programme called "The Joy of Statistics" presented by Professor Hans Rosling, which hopefully you can still find and enjoy on the Web somewhere.

Purpose - Statistics can tell us whether what we think and believe is actually true or not - are men better drivers than women, are Germans taller than Frenchmen, is drug A better than drug B, and so on. All we need is the possibility of defining and getting relevant data. However, this is often far from easy: for example, it is easy to define and get data on infant mortality, but what data could we define to show that God exists?

We mentioned purpose above, and this is not always clear to the researcher or to the consumers of the results. Professor Rosling joked that "a man remonstrated with his member of parliament telling him that unemployment was up 13%, pay is down 5%, suicides have increased by 7% - so why is the government wasting money collecting statistics?" It is in fact surprising how little accurate information any person has about the world they live in. One might think that highly educated people know more, but often what they know is no better, and sometimes worse, than what a simple guess might yield.

Statistics comes from the word 'state', and some scholars pinpoint the origin of statistics to 1663, with the publication of Natural and Political Observations upon the Bills of Mortality by John Graunt. However, its modern form began in Sweden in 1749 with the creation of the Office of Tables (Tabellverket) for the systematic collection of national statistics - the first time any government could get an accurate picture of its people. One interesting find by the Office of Tables was that the population was 2 million, not the 20 million everyone thought! After the shock of this finding the Swedish government took action and, for example, improved health care, because it now knew where mortality was occurring. So knowing the statistics one can exercise control - interestingly, in its early days statistics was also known as 'political arithmetic'.

The Average - In the early years huge amounts of data were collected and shown in tables, but it was soon realised that to make sense of it the data had to be analysed and presented in better ways than just long tables of numbers. One of the first tools was the average, by which a whole mass of data is reduced to a single number that characterises a whole population. For example, in the UK the number of people who die in traffic accidents is almost constant from year to year, so here we have a case where a statistic describes a social phenomenon. But averages don't tell you the whole story, and one can get weird results, such as the fact that Swedish people on average have slightly fewer than two legs (average number of legs 1.999) - so most Swedish people have slightly more than the average number of legs. This oddity arises because some people have one leg and some have none, but no one has more than two.
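The leg oddity is easy to reproduce for yourself. Here is a minimal sketch using an entirely hypothetical population of 1,000 people (the numbers are made up for illustration, not Swedish census data):

```python
# Hypothetical mini-population: almost everyone has 2 legs,
# a few people have 1 or 0, and no one has more than 2.
population = [2] * 997 + [1] * 2 + [0] * 1  # 1,000 people

mean_legs = sum(population) / len(population)
print(mean_legs)  # 1.996 - slightly fewer than two

# Because no value can exceed 2, almost everyone is above the mean:
above_average = sum(1 for legs in population if legs > mean_legs)
print(above_average)  # 997 of the 1,000 people
```

The point the example makes concrete: when a distribution is capped on one side, the mean gets dragged below the typical value, so "most people are above average" is not a contradiction at all.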

Variation - So we need to look at variation in the data, and to get a handle on that we need to look at shapes - shapes that typically show how the data varies around the average. As these shapes were explored, one turned up again and again, because it fitted so many sets of measurements, and the polymath Sir Francis Galton named it the 'normal curve'. In time other distributions were discovered, related to social or other processes:

Continuous Distributions - Normal Distribution, Uniform Distribution, Cauchy Distribution, t Distribution, F Distribution, Chi-Square Distribution, Exponential Distribution, Weibull Distribution, Lognormal Distribution, Gamma Distribution, Double Exponential Distribution, Tukey-Lambda Distribution, Beta Distribution etc

Discrete Distributions - Binomial Distribution and Poisson Distribution​

Data Patterns - So shapes show patterns in the data, but they are also a communication tool to a wider public, and these days it's almost an art form, for we can use different shapes, colours and animations to tell the story. If the story in the numbers can be told by a beautiful and clever image, then everyone can understand it. One of the pioneers of statistical graphics was Florence Nightingale, herself a passionate statistician, who once famously said that "to understand God's thoughts we must study statistics, for these are the measure of his purpose" - so for her statistics was a religious duty.

She went to the Crimean War, and for two years she recorded mortality in meticulous detail, gathering her data into a devastating report that forced the government to set up a committee of enquiry. What created her place in statistical history is the graphics she used, one in particular: the polar area graph, in which she showed deaths from wounds, deaths from accidents and deaths from preventable causes. Her graphics were so clear as to be unmistakable, and they led to a revolution in nursing care and hygiene in hospitals worldwide.

Correlation - After seeing in statistics what is happening, the next step is to try to find out why, using the powerful idea of correlation: how things vary together. A correlation may point to a link of cause and effect - crime correlates with poverty, infection correlates with poor sanitation - but correlation can be very, very tricky, because by itself it does not prove causation. For example, we might find a correlation between shoe size and intelligence, but it is extremely unlikely there is a causal link.
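To see how a spurious correlation can look perfectly convincing, here is a small sketch with invented numbers: children's shoe size against a reading score. Both rise with age (the hidden confounder), so the correlation coefficient comes out as a perfect 1.0 even though neither causes the other. The Pearson formula is written out by hand so nothing is hidden:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented data: both variables simply grow with the child's age.
shoe_size     = [1, 2, 3, 4, 5]
reading_score = [10, 20, 30, 40, 50]

r = pearson(shoe_size, reading_score)
print(round(r, 3))  # 1.0 - a perfect correlation, with no causation at all
```

The lesson matches the text: a strong correlation is an invitation to look for a mechanism (or a confounder such as age here), never a conclusion in itself.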

One of the most famous cases was the investigation by Dr. Richard Doll in the 1950s of a causal link between lung cancer and smoking. The work was difficult, of course, because Doll had to show there was no other factor involved, and that is much harder than you might think. Since smoking could not be randomised, what he ran was in effect a huge observational study rather than a true Randomised Controlled Trial (RCT): he compared those who smoked and got lung cancer with those who did not smoke but still got lung cancer. He also looked at those who stopped smoking and calculated their reduced risk of getting the disease.

Cautions - When one comes up with a correlation, therefore, we must not stop thinking but try as hard as we can to disprove it, by looking for other possible explanations of cause and effect, or by getting more data, or both - if it withstands all those efforts at refutation, then cautiously we might say we really have something. Data is the oxygen of science: the more we have, the more correlations we will find, and in today's world data is growing exponentially.

Example - consider language translation. The old way of doing it was to discover all a language's rules and program them. But language is flexible; it is not necessary to know the rules to understand what is being said, what is said may itself be ungrammatical or ambiguous, and in every language there are exceptions to any rule. Modern methods use correlation between words and between phrases, noticing that these words and phrases often correlate with words and phrases in another language - in a way treating everything as an exception - so a system in essence just needs large bodies of text to find the correlations and use them to translate (see Google Translate).
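The flavour of the phrase-based idea can be sketched in a few lines. This is a deliberately toy illustration, not how Google Translate actually works: the hypothetical `phrase_table` stands in for phrase pairs that a real system would mine from millions of parallel sentences, and translation becomes longest-match lookup rather than rule application:

```python
# Toy phrase table (hypothetical, accents dropped for simplicity).
# A real system would learn these pairings statistically from parallel texts.
phrase_table = {
    "good morning": "buenos dias",
    "thank you": "gracias",
    "good": "bueno",
}

def translate(sentence):
    """Greedy longest-match lookup against the phrase table."""
    words = sentence.split()
    out, i = [], 0
    while i < len(words):
        # Try the longest candidate phrase starting at position i first.
        for j in range(len(words), i, -1):
            phrase = " ".join(words[i:j])
            if phrase in phrase_table:
                out.append(phrase_table[phrase])
                i = j
                break
        else:
            out.append(words[i])  # unknown word passes through untranslated
            i += 1
    return " ".join(out)

print(translate("good morning thank you"))  # "buenos dias gracias"
```

Note how "good morning" is matched as a unit rather than word by word - exactly the point in the text that whole phrases, not grammar rules, carry the correlations.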

The Future - With such vast amounts of data it is possible to do what's called 'computational science', which might statistically change how science is done, or even say which science is possible. One might take whatever data you have and run it against a whole range of models, theories or hypotheses to see where a best fit occurs - perhaps tens of thousands of scenarios, with the system automatically discarding the poor cases. For example, we have huge databases of weather data but no predictive theory of what happens next. With computational science it would be possible to get the computer to 'create' tens of thousands of models and then use the vast data sets to test them one after the other, to see if any turn out to be good weather pattern predictors. This area of statistics is in its infancy and, as you can imagine, requires huge computational resources.
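The "run the data against many models and discard the poor fits" idea can be sketched in miniature. Everything here is invented for illustration: four hypothetical candidate models are scored against a tiny made-up data set by squared error, and the best fit wins automatically - the same loop a computational-science system would run over thousands of candidates:

```python
import math

# Tiny invented data set (secretly generated by y = 2x + 1).
xs = [0, 1, 2, 3, 4, 5]
ys = [1, 3, 5, 7, 9, 11]

# Hypothetical candidate models - in practice there could be thousands.
candidates = {
    "constant":    lambda x: 4.0,
    "linear":      lambda x: 2 * x + 1,
    "quadratic":   lambda x: x ** 2,
    "exponential": lambda x: math.exp(x) / 20,
}

def sq_error(model):
    """Total squared error of a model against the observed data."""
    return sum((model(x) - y) ** 2 for x, y in zip(xs, ys))

# Keep the best-fitting model; the poor fits are discarded automatically.
best = min(candidates, key=lambda name: sq_error(candidates[name]))
print(best)  # "linear" - the model that actually generated the data
```

Scaled up, the data sets become weather archives, the candidates become generated theories, and the scoring becomes more sophisticated than raw squared error - but the skeleton of the search is the same.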
 
In research it is important that you don't block out, explicitly or implicitly, things you may not like or don't agree with; otherwise you will miss major truths.

To have an open mind means to be willing to consider or receive new and different ideas. It means being flexible and adaptive to new experiences and ideas although it does not mean you have to accept everything you hear.

People who are open-minded are willing to change their views when presented with new facts and evidence. Those who are not, and who resist change, will find life less rewarding and satisfying, not to mention dull. If we limit ourselves to what we know and are comfortable with, we will become more and more frustrated.

If we choose to approach life in the same way day after day, as well as becoming bored and uninspired, we will reduce our intellectual aptitude. If, on the other hand, we seek new ways of doing and looking at things, we will expand our intellectual capability, find life more exciting, and broaden our experiences. Most people agree that open-mindedness is one of the fundamental aims of education, always elusive but eminently worth pursuing. It is the childlike attitude of wonder and interest in new ideas coupled with a determination to have one's beliefs properly grounded.

Bertrand Russell regarded open-mindedness as the virtue that prevents habit and desire from making us unable or unwilling to entertain the idea that earlier beliefs (of whatever kind) may have to be revised or abandoned; its main value lies in challenging the fanaticism that comes from a conviction that our views are absolutely certain, that WE are right.

'You are obstinate, he is pigheaded; I, needless to say, merely hold firm opinions.' This is Russell's memorable way of making the point that it is enormously difficult to recognise one's own tendencies towards closed-mindedness. We see ourselves as eminently reasonable, and our views as open to discussion, even though it may be perfectly clear to others that we are only going through the motions of giving a serious hearing to a rival view.
 
Become self-aware
Most of us think we know who we are, but often that knowing excludes, or pushes into the shadows, the uncomfortable things about our personalities and character, tending to make us look for excuses rather than facing up to and fixing any failings we have.

It is vital to know how you see and organise your world and generate meaning from your experiences. You may think of this as examining your Weltanschauung, often loosely translated as 'world-view', which carries the idea that our view of the world is shaped by who we are: our culture, our teachers, our religion, our family, our friends, our choices - indeed a whole host of things over which we have, in general, little control in our formative years. It means that two people will not act in an identical way even when confronted by identical circumstances. Indeed, we ourselves may not think, feel and act the same way from one more or less identical situation to another at a different time or place.

We cannot do much about who we are, but we can become more aware of ourselves - our thinking, our feelings, our actions. That will help us stop blaming everything and everyone but ourselves, and over time our learning will change who we are, because the meaning systems people adopt, usually unconsciously, are as important as, or perhaps more important than, logic in shaping their thinking. There is nothing really new here, for Socrates, some 2,400 years ago, said "The unexamined life is not worth living", because for him growing in knowledge and character was everything. So if something in our lives or experience cannot be questioned, no matter what it is, it may ultimately destroy our growth, because it means we have no-go areas where learning cannot take place.

This really is about managing ourselves: not running away or trying to hide flaws in our character and knowledge - and we all have them - but facing up to them and working to eradicate them. Bill Hybels, talking about himself, said "It's a terribly lonely feeling to have no one to blame, to look to no one to rescue you. It's rotten to realise that to find the bad guy, you just have to look in the mirror. The truth is that the only person who can put a sustainability programme together for us is us. Anything else is a self-leadership fumble, an illusion."
 
Is your life a muddle? I don't know if it helps, but one might think of marble solitaire as a parable of life - your life. In marble solitaire one has to build up a simple pattern, a discipline of moves, and if you do, you will always end up with just one marble right in the centre of the board. Without a pattern, and the discipline that goes with it, you will always be in the unsatisfying position of having marbles left over, scattered all over the place, and you never get to the centre. There are many, many different move patterns that will work, but trying to get to the centre without a pattern discipline will leave you continually frustrated and disappointed.

Now what is the centre for you? Interestingly, religious communities have solved this: believers will say their day is centred around God and spiritual devotions, and almost always they volunteer for other activities within their community and outside it - and of course they feel part of a wider supportive and loving community.

For those without faith there are nevertheless ways of centering one's days: it might be around music or sport or volunteering or drama - the list is endless - but in most cases the centre lies within some community or other. Some centre things around their normal work, but I don't think that is a good idea. What we are talking about is a pattern of daily living that brings you a sense of contentment.

Incidentally, there have been studies showing that many people, even though they have no particular faith allegiance, decide to join a faith community where they feel at home and cared for, and where their children grow up in the same safe environment, even though they do not necessarily commit to any creed.

Take some time to think about what has been said here, and find a discipline, a pattern that centres your daily living. Dietrich Bonhoeffer put it like this: "Spirituality [you put in your own word] without discipline is like a river without its banks." St Paul said "At the time, discipline isn't much fun. It always feels like it's going against the grain. Later, of course, it pays off handsomely."
 
This is a short note on writing a critique.

Firstly, it's never a good idea to leave it right to the end, because a good critique needs time and thought to consider all the various points. One way (after reading the paper) is to start by asking what problem the paper addresses and whether it arrives at any sort of conclusion. It might address several problems, but that would be a point of critique in itself, since the paper might then be considered poorly focused - or, if you like, anyone who tries to tell you he or she can solve many problems in one go is likely to be seriously overconfident.

That idea then gives you a theme to use as you work through the paper. It is also useful to have in mind some idea of your own about the subject of the paper (try to avoid several ideas - you need focus). Be careful with your own ideas, though, and make sure they have some foundation in the available literature.

Another possible line is to detect whether the writer has misunderstood something he has used, or modified it in some way. If so, look for the implications of that in the various conclusions, as they may be invalid or biased.

Conclusions need to be looked at with care - the writer may have drawn conclusions that cannot be substantiated, which means they gave up too early, or were just lazy, or both. Other writers simply cannot get to a conclusion because they feel they don't have all the information. This of course is familiar to almost every doctor, particularly in A&E, where it is easy to make a wrong diagnosis, or to decide nothing can be done, or, at the other end of the scale, to seek more and more information so that the patient is dying while the doctor is doing his thinking. This is why one needs constant practice, constant updating of knowledge, and colleagues who feel the same way.

Tutors (good ones anyway) always love to see you adding your own ideas, because that tells them you are thinking - but, as I said, make sure they have some support. By support I do not necessarily mean copying someone else's idea, although that is possible, but generating a new idea of your own out of ideas you find elsewhere. One can say, for example, "I notice Smith said X and that has led me to believe that Y might be a useful idea here... because .... but..
 
Negative stereotypes impose an intellectual burden on many who think that people around them perceive them as inferior in some way, causing them to feel they are not accepted as equals. In fact some cultures impose these stereotypes deliberately on those seen as 'outsiders' in a kind of spite because of their own sense of inferiority or superiority. Even subtle reminders of prejudice against one's sex, race or religion can hinder performance in learning, disempower you and generally stop you fitting in.

In such situations you can find most of your energy used up in fighting these attitudes, and it levies an emotional tax that is a form of intellectual emasculation. This emotional tax is known as stereotype threat. What is both sad and worrying is that these social sensitivity effects can seriously damage performance.

One effect of stereotype threat is that when anxiety arrives, motivation falls and expectations lower. Another is a depletion of working memory, which creates various stumbling blocks to success. Additionally, people tend to over-think actions that would otherwise be automatic, and become sensitive to cues that might indicate discrimination. For example, an ambiguous expression can be misread as a sneer, and even one's own anxiety can become a sign of imminent failure. Minds also wander, and self-control weakens.

There are antidotes to these imposed stereotype threats, one of which is disarmingly simple: ask yourself what is important to you - popularity, music, sport, or whatever - and just write for perhaps 15 minutes about why it matters. Doing this acts like a mental vaccine that boosts your self-confidence and helps you combat any future stereotype threat.
 