EXTRAS

SOCIAL NETWORKS

Social network analysis (SNA) is the analysis and measurement of the relationships and flows established between people, groups, organisations, computers, websites and other processes for exchanging information and knowledge. The resulting network of relationships (the social network) is made up of nodes, representing PEOPLE and GROUPS, and of lines representing the relationships established between them. According to the “rule of 150”, the size of a genuine social network is limited to about 150 members. This number was derived from studies in sociology and, above all, anthropology on the maximum size of a typical village.

SNA provides both a mathematical and a visual analysis of human relationships. Studies in this field are now extremely useful to management consultants handling networks of companies and business relationships (ONA, organizational network analysis), but also in Internet-related fields. Social networking sites are now extremely widespread and are built precisely on concepts drawn from SNA.

SNA researchers measure the activity and importance of the nodes in a network on the basis of their degree, that is, the number of direct connections each node has. Researchers also look at the centrality of each “social actor”, which reflects its importance within the network.
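
As a minimal sketch of this idea, the degree of each node can be read straight off an adjacency list; the five-person network and its names below are invented purely for illustration.

```python
# Minimal sketch: degree of each node in a small, hypothetical social network.
network = {
    "Anna":  ["Bruno", "Carla", "Dario"],
    "Bruno": ["Anna", "Carla"],
    "Carla": ["Anna", "Bruno", "Elisa"],
    "Dario": ["Anna"],
    "Elisa": ["Carla"],
}

# Degree = number of direct connections; a simple proxy for a node's centrality.
degrees = {person: len(friends) for person, friends in network.items()}

for person, degree in sorted(degrees.items(), key=lambda x: -x[1]):
    print(f"{person}: {degree} direct connections")
```

In this toy network Anna and Carla, with the most direct connections, would be flagged as the most central actors.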

THEORY OF SIX DEGREES OF SEPARATION

Everyone on the planet can be connected to anyone else through a chain with no more than five intermediaries.

The theory was first proposed in 1929 by a Hungarian writer, Frigyes Karinthy, in his short story “Chains”. In the 1950s Ithiel de Sola Pool (MIT) and Manfred Kochen (IBM) looked for a mathematical proof but, after twenty years, only managed to formulate the question in mathematically correct terms: “Given a set N of people, what is the probability that each member of N is connected to another member of the same set through k1, k2, k3, … kn links?”.

It was only in 1967 that the American social psychologist Stanley Milgram tackled what he called “the small-world problem”. He selected people at random, sent them packages by post and asked them to forward the packages to acquaintances, who in turn were asked to do the same. Chains of at least a hundred intermediaries were expected. However, Milgram found that the number of intermediaries needed before a package reached its destination ranged from two to ten, with five being the most common. His findings were eventually published in Psychology Today and inspired the phrase “six degrees of separation”.
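
Milgram's chains can be mimicked with a breadth-first search over a contact network: the length of the shortest chain between two people is their degree of separation. The network and names in the sketch below are hypothetical.

```python
from collections import deque

# Minimal sketch: breadth-first search counting the links that separate two
# people in a hypothetical acquaintance network (undirected adjacency list).
contacts = {
    "Alice": ["Bob", "Carol"],
    "Bob":   ["Alice", "Dave"],
    "Carol": ["Alice", "Dave", "Erin"],
    "Dave":  ["Bob", "Carol", "Frank"],
    "Erin":  ["Carol"],
    "Frank": ["Dave"],
}

def degrees_of_separation(start, target):
    """Return the number of links on the shortest chain from start to target."""
    queue = deque([(start, 0)])
    visited = {start}
    while queue:
        person, distance = queue.popleft()
        if person == target:
            return distance
        for friend in contacts[person]:
            if friend not in visited:
                visited.add(friend)
                queue.append((friend, distance + 1))
    return None  # no chain exists

print(degrees_of_separation("Alice", "Frank"))  # 3 links, i.e. 2 intermediaries
```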

His observations also explain why jokes, rumours and pieces of information travel so quickly. Then again, how many people do we send the same e-mail to?

Chiara Badia

 

An experiment carried out by researchers at Columbia University has confirmed the theory of six degrees of separation. The study involved more than 61,000 people in 166 countries, and new volunteers are now being sought for more in-depth studies.

WASHINGTON – The theory of the «six degrees of separation», the idea that an unknown person can be reached over the Internet through just six steps of word of mouth, has found practical confirmation. This theory, never proved until now, inspired a famous 1993 film called, precisely, «Six Degrees of Separation».

THE EXPERIMENT – In an experiment in which Internet users were asked to track down eighteen strangers in thirteen countries using only their online connections, an average of five to seven steps, asking friends and acquaintances, was enough to reach the target. The results of the experiment, which involved 61,168 Internet users from 166 countries, were published in the journal «Science» by a team of Columbia University researchers led by Peter Sheridan Dodds. Users were asked to track down one of 18 chosen people, including an inspector in Estonia, a technology consultant in India, a police officer in Australia and a veterinarian in Norway. On average, five to seven steps were needed to reach the target.
The researchers were able to reconstruct 24,163 message chains in detail, of which 384 had succeeded in reaching the designated target.
THE 1967 THEORY – The idea of six degrees of separation was born in 1967 with an experiment by the sociologist Stanley Milgram, who showed that a group of volunteers in Nebraska and Kansas were able to get in touch with strangers in Massachusetts using only their network of friends and acquaintances. A theory of this kind has important applications in the field of disease spread, since knowing the network of contacts makes it possible to identify the people who can carry a disease from one group to another.
Dodds and his colleagues intend to carry out further analyses of the theory and are looking for volunteers: anyone who wants to take part can sign up at http://smallworld.columbia.edu

Adapted from “IL CORRIERE DELLA SERA” and from BBC News, 31.10.2007

—————————————————————–


10 IMPOSSIBLE THINGS… CONQUERED BY SCIENCE!

03-04 2008 Michael Marshall

WHAT IS TRULY IMPOSSIBLE? We have rounded up 10 things that were once thought scientifically impossible. Some were disproved centuries ago, but others have only recently begun to enter the realm of possibility.
1. Analysing stars

In his 1842 book The Positive Philosophy, the French philosopher Auguste Comte wrote of the stars: “We can never learn their internal constitution, nor, in regard to some of them, how heat is absorbed by their atmosphere.” In a similar vein, he said of the planets: “We can never know anything of their chemical or mineralogical structure; and, much less, that of organized beings living on their surface.”

Comte’s argument was that the stars and planets are so far away as to be beyond the limits of our sense of sight and geometry. He reasoned that, while we could work out their distance, their motion and their mass, nothing more could realistically be discerned. There was certainly no way to chemically analyse them.

Ironically, the discovery that would prove Comte wrong had already been made. In the early 19th century, William Hyde Wollaston and Joseph von Fraunhofer independently discovered that the spectrum of the Sun contained a great many dark lines.

By 1859 these had been shown to be atomic absorption lines. Each chemical element present in the Sun could be identified by analysing this pattern of lines, making it possible to discover just what a star is made of.
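
As a rough illustration of the principle (not of the actual procedure used in 1859), observed line positions can be compared against laboratory wavelengths of a few well-known Fraunhofer lines; the “observed” values and the matching tolerance below are invented for this sketch.

```python
# Minimal sketch: identify elements in a stellar spectrum by matching observed
# absorption lines to laboratory wavelengths (in nanometres, classic Fraunhofer
# lines; the "observed" list is invented for illustration).
reference_lines = {
    656.3: "Hydrogen (H-alpha)",
    486.1: "Hydrogen (H-beta)",
    589.0: "Sodium (Na D2)",
    589.6: "Sodium (Na D1)",
    393.4: "Ionised calcium (Ca II K)",
}

observed = [656.2, 589.1, 393.5, 500.0]  # hypothetical measured line positions
tolerance = 0.3  # nm, an arbitrary choice for this sketch

for wavelength in observed:
    matches = [name for ref, name in reference_lines.items()
               if abs(ref - wavelength) <= tolerance]
    print(wavelength, "->", matches or "no match")
```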

2. Meteorites come from space

Astronomers look away now. Throughout the Renaissance and the early development of modern science, astronomers refused to accept the existence of meteorites. The idea that stones could fall from space was regarded as superstitious and possibly heretical – surely God would not have created such an untidy universe?

The French Academy of Sciences famously stated that “rocks don’t fall from the sky”. Reports of fireballs and stones crashing to the ground were dismissed as hearsay and folklore, and the stones were sometimes explained away as “thunderstones” – the result of lightning strikes.

It was not until 1794 that Ernst Chladni, a physicist known mostly for his work on vibration and acoustics, published a book in which he argued that meteorites came from outer space. Chladni’s work was driven by a “fall of stones” in 1790 at Barbotan, France, witnessed by three hundred people.

Chladni’s book, On the Origin of the Pallas Iron and Others Similar to it, and on Some Associated Natural Phenomena, earned him a great deal of ridicule at the time. He was only vindicated in 1803, when Jean-Baptiste Biot analysed another fall of stones at L’Aigle in France, and found conclusive evidence that they had fallen from the sky.

3. Heavier-than-air flight

The number of scientists and engineers who confidently stated that heavier-than-air flight was impossible in the run-up to the Wright brothers’ flight is too large to count. Lord Kelvin is probably the best-known. In 1895 he stated that “heavier-than-air flying machines are impossible”, only to be proved definitively wrong just eight years later.

Even when Kelvin made his infamous statement, scientists and engineers were closing rapidly on the goal of heavier-than-air flight. People had been flying in balloons since the late eighteenth century, and by the late 1800s these were controllable. Several designs, such as Félix du Temple’s Monoplane, had also taken to the skies, if only briefly. So why the scepticism about heavier-than-air flight?

The problem was set out in 1716 by the scientist and theologian Emanuel Swedenborg in an article describing a design for a flying machine. Swedenborg wrote: “It seems easier to talk of such a machine than to put it into actuality, for it requires greater force and less weight than exists in a human body.”

Swedenborg’s design, like so many, was based on a flapping-wing mechanism. Two things had to happen before heavier-than-air flight became possible. First, flapping wings had to be ditched and replaced by a gliding mechanism. And secondly, engineers had to be able to call on a better power supply – the internal combustion engine. Ironically, Nicolaus Otto had already patented this in 1877.

4. Space flight

From atmospheric flight, to space flight. The idea that we might one day send any object into space, let alone put men into orbit, was long regarded as preposterous.

The scepticism was well-founded, since the correct technologies were simply not available. To travel in space, a craft must reach escape velocity – for vehicles leaving Earth, this is 11.2 kilometres per second. To put this figure into perspective, the sound barrier is a mere 1,238 kilometres per hour, yet it was only broken in 1947.
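
The 11.2 kilometres per second quoted above follows from the standard escape-velocity formula v = √(2GM/R); the sketch below simply plugs in the Earth's mass and radius.

```python
import math

# Minimal sketch: Earth's escape velocity from v = sqrt(2 * G * M / R).
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_earth = 5.972e24   # mass of the Earth, kg
R_earth = 6.371e6    # mean radius of the Earth, m

v_escape = math.sqrt(2 * G * M_earth / R_earth)
print(f"Escape velocity: {v_escape / 1000:.1f} km/s")   # ~11.2 km/s

# For comparison, the sound barrier quoted in the text, converted to km/s:
print(f"Sound barrier:   {1238 / 3600:.2f} km/s")       # ~0.34 km/s
```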

Jules Verne proposed a giant cannon in his novel From the Earth to the Moon. However, such a sudden burst of acceleration would inevitably kill any passengers instantly, and calculations have shown no cannon could be powerful enough to achieve escape velocity.

The problem was effectively cracked in the early 20th century by two rocket researchers working independently – Konstantin Tsiolkovsky and Robert Goddard. Tsiolkovsky’s work was ignored outside the USSR, while Goddard withdrew from the public gaze after scathing criticism of his ideas. Nonetheless, the first artificial satellite, Sputnik, was eventually launched in 1957, and the first manned spaceflight followed four years later. Neither Tsiolkovsky nor Goddard lived to see it.

5. Harnessing nuclear energy

On 29 December 1934, Albert Einstein was quoted in the Pittsburgh Post-Gazette as saying, “There is not the slightest indication that [nuclear energy] will ever be obtainable. It would mean that the atom would have to be shattered at will.” This followed the discovery that year by Enrico Fermi that if you bombard uranium with neutrons, the uranium atoms split up into lighter elements, releasing energy.

Einstein’s scepticism was, however, overtaken by events. By 1939 nuclear fission was better understood, and researchers had realised that a chain reaction – one that, once started, would drive itself at increasing rates – could produce a huge explosion. In late 1942 such a chain reaction was produced experimentally, and on August 6 1945 the first atomic bomb used in war exploded over Hiroshima. Ironically, Fleet Admiral William Leahy had allegedly told President Truman: “This is the biggest fool thing we’ve ever done – the bomb will never go off – and I speak as an expert on explosives.”

Then, in 1954, the USSR became the first country to supply some of its electricity from nuclear power with its Obninsk nuclear power plant.
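
The runaway behaviour of a chain reaction mentioned above can be sketched with the effective multiplication factor k, the average number of new fissions triggered by each fission; the values used here are illustrative only, not reactor or weapon data.

```python
# Minimal sketch: fission events per generation, governed by the effective
# multiplication factor k (illustrative values only).
def fissions_per_generation(k, generations, start=1):
    counts = [start]
    for _ in range(generations):
        counts.append(counts[-1] * k)
    return counts

print(fissions_per_generation(k=0.9, generations=10))  # k < 1: the reaction dies out
print(fissions_per_generation(k=1.0, generations=10))  # k = 1: steady state, as in a reactor
print(fissions_per_generation(k=2.0, generations=10))  # k > 1: exponential runaway
```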

6. Warm superconductors

This is a strange case: a phenomenon that can be observed and measured, but that should not be happening. According to the best theories of superconductivity, the phenomenon should not be possible above 30 kelvin. And yet some superconductors work perfectly well at 77 K.

Superconductors – materials that conduct electricity with no resistance – were first discovered in 1911. To see the effect, a material normally has to be cooled to within a few degrees of absolute zero.

Over the next 50 years, many superconducting materials were discovered and studied, and in 1957 a complete theory describing them was put forward by John Bardeen, Leon Cooper and John Schrieffer. Known as “BCS theory”, it neatly explained the behaviour of standard superconductors.

The theory states that electrons within such materials move in so-called Cooper pairs. If a pair is held together strongly enough, it can withstand any impacts from the atoms of the material, and thus experiences zero electrical resistance. However, the theory suggested that this should only be true at extremely low temperatures, when the atoms only vibrate slightly.

Then, in a classic paper published in 1986, Johannes Georg Bednorz and Karl Alexander Müller turned the field upside down by discovering a material capable of superconducting at up to 35 K. Bednorz and Müller received the Nobel Prize in Physics the following year, and more high-temperature superconductors followed. The highest critical temperature yet observed (admittedly under pressure) is 164 K. Quite how all of this is possible remains a topic of intense research.

7. Black holes

People who think of black holes as a futuristic or modern idea may be surprised to learn that the basic concept was first mooted in 1783, in a letter to the Royal Society penned by the geologist John Michell. He argued that if a star were massive enough, “a body falling from an infinite height towards it would have acquired at its surface greater velocity than that of light… all light emitted from such a body would be made to return towards it by its own proper gravity.”
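
Michell's “dark star” condition, an escape velocity equal to the speed of light, happens to give the same radius as the Schwarzschild radius of general relativity, r = 2GM/c². The short calculation below evaluates it for one solar mass.

```python
# Minimal sketch: radius at which the escape velocity equals the speed of light,
# r = 2 * G * M / c^2, evaluated for one solar mass.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

r_s = 2 * G * M_sun / c**2
print(f"Schwarzschild radius of the Sun: {r_s / 1000:.1f} km")  # ~3 km
```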

However, throughout the 19th century the idea was rejected as outright ridiculous. This was because physicists thought of light as a wave in the ether – it was assumed to have no mass, and therefore to be immune to gravity.

It was not until Einstein published his theory of general relativity in 1915 that this view had to be seriously revised. One of the key predictions of Einstein’s theory was that light rays would indeed be deflected by gravity. Arthur Eddington’s measurements of star positions during a solar eclipse showed that their light rays were deflected by the Sun’s gravity – though actually the effect was too small for Eddington’s instruments to reliably observe, and it was not properly confirmed until later on.

But once relativity was established, black holes became a serious proposition, and their properties were worked out in detail by theoreticians such as Subrahmanyan Chandrasekhar. Astronomers then began searching for them and have accumulated evidence that black holes are common, with one at the centre of many galaxies (including our own) and the biggest ones thought to be responsible for high-energy cosmic rays.

Perhaps the debate has not been entirely settled, though. Some controversial calculations, published in 2007, suggested that as stars collapsed into black holes, they would release a great deal of radiation, reducing their mass so that they do not form “true” black holes after all.

8. Creating force fields

This classic of science fiction went from wild speculation to verifiable fact in 1995 with the invention of the “plasma window”.

Devised by Ady Hershcovitch from the Brookhaven National Laboratory, the plasma window uses a magnetic field to fill a small region of space with plasma or ionised gas. The devices, developed by Hershcovitch and the company Acceleron, are used to reduce the energy demands of electron beam welding.

The plasma window has most of the properties we associate with force fields. It blocks matter well enough to act as a barrier between a vacuum and the atmosphere. It also allows lasers and electron beams to pass through unimpeded and will even glow blue, if you make the plasma out of argon.

The only drawback is that it requires huge amounts of energy to produce plasma windows of any size, so current examples are very small. In theory, though, there is no reason they could not be made much bigger.

9. Invisibility

Invisibility is another staple of fantasy fiction, appearing in everything from Richard Wagner’s opera Das Rheingold to H. G. Wells’ The Invisible Man, and of course Harry Potter.

There is nothing in the laws of physics to say invisibility is impossible, and recent advances mean certain types of cloaking device are already feasible.

The last few years have seen a rash of reports concerning experimental invisibility cloaks, ever since a basic design for one was produced in 2006. These devices rely on metamaterials to guide light around objects. The first of these only worked on microscopic objects and with microwaves.

It was thought that modifying the design for visible light would prove very challenging, but in fact it was done just one year later – albeit only in two dimensions and on a micrometre scale. The engineering challenges involved with building a practical invisibility cloak remain formidable.

10. Teleportation

This is a word with a long and rather dubious history. It was coined by the paranormalist writer Charles Fort in his book Lo! and was subsequently seized on by legions of science fiction writers, most famously as the “transporter” in Star Trek.

Despite its fantastical origins, physicists have achieved a kind of teleportation thanks to a bizarre quantum phenomenon called entanglement. Particles that are entangled behave as if they are linked together no matter how wide the distance between them. If, for example, you change the “spin” of one entangled electron, the spin of its twin will change as well.
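
That perfect correlation can be sketched numerically: sampling measurement outcomes from the Bell state (|00⟩ + |11⟩)/√2 always gives matching results for the two particles. This is a plain NumPy simulation of the statistics, not an implementation of quantum teleportation itself.

```python
import numpy as np

# Minimal sketch: measurement correlations for the Bell state (|00> + |11>) / sqrt(2).
# The four basis states are |00>, |01>, |10>, |11>.
state = np.array([1, 0, 0, 1]) / np.sqrt(2)
probabilities = np.abs(state) ** 2          # Born rule: 50% |00>, 50% |11>

rng = np.random.default_rng(0)
outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probabilities)

# The two particles always agree, no matter how far apart they are measured.
print(outcomes)
print("all correlated:", all(o[0] == o[1] for o in outcomes))
```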

Entangled particles can therefore be used to “teleport” information. Performing the trick with anything larger than an atom was once thought impossible, but in 2002 researchers described a theoretical way to entangle even large molecules, provided they can be put into a quantum state known as a superposition.

More recently, an alternative idea, dubbed “classical teleportation”, has been proposed for making a beam of rubidium atoms effectively disappear in one place and reappear elsewhere. This method would not rely on entanglement but on transmitting all the information about the atoms through a fibre-optic cable so that they can be “reconstructed” somewhere else.

————————————————————————————————————————— 

GMO & SOCIETY

————————————————————————————————————————— 

CLIMATE CHANGE AND SOCIETY

HOW CLIMATE, SOCIETY AND SCIENTIFIC COMMUNICATION ARE CHANGING: A CASE STUDY –  OCTOBER 2008

ABSTRACT:

Although our lives are brief, we think of ourselves as immortal and feel that we own, and are responsible for, the place where we live. This way of thinking emerges whenever “global climate change” is discussed. Although ample data show that climate and temperatures have always been changing, we usually do not think of the climate as something dynamic.

In any case, the core of the climate question comes down to a few questions: are we responsible for it or not? And if so, are we still in time to remedy it? In what ways? Beyond these delicate issues, it is clear to almost everyone that the climate is changing, or at least that some climate parameters are changing. Above all temperature, which seems to keep reaching higher values. Over the last thirty years, simulations and models have shown that the temperature increase is related to the higher concentrations of the so-called greenhouse gases (GHG) in the atmosphere. These gases are becoming more and more important and visible in our everyday life, especially in the government policies and programmes designed to protect air quality and quality of life.

Predictions and prevention measures are made using climate models. It must always be kept in mind, however, that climate models are only an important instrument for climate change scientists; they are not climate science itself.

Still, when a model is well defined and approximates reality, it can be used to predict effects on particular geographical and social regions. This approach is known as “vulnerability” analysis and is useful for examining, region by region, the different responses of ecosystems and of agricultural, social and economic sectors to climate change.
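
As a hedged, back-of-the-envelope illustration of how such models link greenhouse-gas concentrations to temperature, a widely used simplified expression gives the radiative forcing from a CO₂ increase as ΔF ≈ 5.35 ln(C/C₀) W/m², and a climate sensitivity parameter λ converts that forcing into an equilibrium temperature change. The value of λ below is only a commonly quoted ballpark figure, not a result of this work.

```python
import math

# Minimal sketch: zero-dimensional estimate of warming from a CO2 increase.
# dF = 5.35 * ln(C / C0) W/m^2 is the standard simplified CO2 forcing expression;
# the sensitivity parameter (K per W/m^2) is a ballpark value, not a model result.
def warming_from_co2(c_now_ppm, c_ref_ppm=280.0, sensitivity=0.8):
    """Equilibrium temperature change (K) for a CO2 rise from c_ref_ppm to c_now_ppm."""
    forcing = 5.35 * math.log(c_now_ppm / c_ref_ppm)   # radiative forcing, W/m^2
    return sensitivity * forcing                        # temperature change, K

print(f"Doubling CO2 (280 -> 560 ppm): ~{warming_from_co2(560):.1f} K")
print(f"280 -> 385 ppm:                ~{warming_from_co2(385):.1f} K")
```

A single global number like this of course says nothing about regional effects, which is exactly why the vulnerability approach described above relies on far more detailed models.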

Furthermore, we have to consider that the Earth system is subject to many variations, of which climate is just one. Another important factor is the growth of the world population, which influences the way resources are distributed and used, as well as differing regional socio-economic development, including migration towards the coasts and the rise of megacities.

“Global climate change” has also become something ideological, political, social and even religious. This has sometimes led to views very far removed from the actual scientific situation and knowledge. In addition, the situation is complicated by the “global” nature of climate change: each state has its own climate change policy, reflecting different needs, priorities and ideologies.

When facing climate and global changes, many aspects are involved: from society to politics, from pure science to psychology, and so on.

There is growing interest in the climate change theme, as can be seen simply from the fact that a Nobel Prize was recently awarded for this issue and that at least once a day we hear about climate and global changes from the media.

This work offers an overall view of the different issues encountered when talking about climate change. Each of its three sections therefore presents a different approach to, and point of view on, the climate issue.

Section 1 is dedicated to what scientists and experts know and say about climate change, reporting the problems and effects that climate change causes or will probably produce in the future, together with models, theory and policies, accompanied by graphs and tables. In Section 2, instead, I report the cognitive and psychological factors that influence people’s understanding of climate change; moreover, I analyse the importance of the communication strategies and methods used to catch the public’s attention and to motivate social action to mitigate climate change. The precautionary principle, decision and policy making, the ecological footprint and the perception of risk are also discussed in this part, which ends with a proposal for new communication and educational strategies and solutions.

Section 3 contains two examples of a fresh approach to climate change communication and knowledge: a new EU project called “CIRCE – Climate Change and Impact Research: the Mediterranean Environment”, which represents a sort of European IPCC, and the case study of the International Summer School on Climate Change and Water Cycle, a new climate change educational initiative held at the United World College of the Adriatic.

The CIRCE project represents a new way of studying and analysing climate change effects and solutions from a regional and local point of view, such as the Mediterranean context, also considering communication, adaptation and mitigation. The Summer School initiative was organised this year as the first occasion on which climate and environmental experts spoke directly to a young audience to educate and inform them about climate change and hydrological systems. The project took place in a multicultural context with the specific intent of involving and preparing new generations for a society able to face complex situations such as global change.

 ITALIAN VERSION:                                   

Although our life is short compared to that of the Universe, we feel immortal and, in a sense, masters of the planet we live on. This way of thinking is also reflected in how we think and act with regard to climate change. A great deal of data shows that the climate is changing, even though in reality the climate has never been static.

When considering climate change, a few fundamental questions arise: are we directly responsible? And, if so, can we remedy it? In what way? Beyond these delicate issues, it is nonetheless evident to most of us that the climate, or at least some of its parameters, is changing; above all the temperature, which seems to rise progressively over the years. Many scientific models show the trend of ever higher temperatures, essentially correlated with the increase in greenhouse gas emissions into the atmosphere. These gases, responsible for the so-called “greenhouse effect”, are at the centre of attention because they are involved in almost all our daily activities and have become important factors to be considered in decision-making processes and government programmes.

Studies and observations of the climate are carried out using climate models, which should be regarded as excellent tools for the scientists who study the climate, but they are not climate science itself. Models are in fact useful when they are well defined and manage to approximate reality: they can then be used to predict effects and situations at the local level for particular social and geographical regions.

This is the vulnerability approach, and it can be used to analyse and observe regional dynamics and responses for every sector involved in climate change, for example agriculture, ecosystems and social aspects.

Moreover, climate change is not the only factor destabilising the Earth’s equilibrium. Constant demographic growth is also very important, as it affects the consumption and distribution of resources, the spread of people across the environment and the related socio-economic development (for example, the rise of large cities and increased coastal settlement).

 

Talking about climate change is not only this, however: the subject now embraces fields different from, and distant from, science, and the aspects involved are psychological, social, political, ideological and even religious. The changing climate has become a sort of umbrella that absorbs all global problems, almost a scapegoat, which is why it is very difficult to distinguish the scientific truth from the socio-political dynamics. It is also an even more delicate subject precisely because of its “global” nature: it involves the whole planet but is tackled differently according to the policies and needs of individual states.

The aspects involved when talking about climate change are many, and the growing interest in the subject is also demonstrated by the recent award of the Nobel Peace Prize to Al Gore for his support of climate science, and by the constant stream of news arriving from the mass media.

 

This work offers a general overview of the themes concerning climate change and is therefore divided into three parts, each dealing with different aspects involved.

Section 1 contains the scientific information on the climate, presenting models, data and graphs on the causes and effects of climate change.

Section 2, instead, covers the psychological and cognitive aspects that influence our approach to environmental and climate issues. The work also presents an analysis of the different forms and strategies of communication used to attract attention and motivate action on climate change. The precautionary principle, decision-making processes, the “ecological footprint”, and risk management and perception are discussed in this section, which concludes by proposing new alternatives to improve and encourage climate communication and education.

In Section 3 I present two examples of a new approach to the communication and study of climate change: a new European project, “CIRCE – Climate Change and Impact Research: the Mediterranean Environment”, which represents a sort of European IPCC, and the practical case study based on the “International Summer School on Climate Change and Water Cycle”, a new climate education strategy organised by the United World College of the Adriatic.

CIRCE represents a new method for studying and analysing the climate, focused on local and regional aspects, for example restricted to the Mediterranean area, also considering social aspects, mitigation and the vulnerability of systems.

The Summer School, instead, was organised this year as the first initiative in which high-level scientists speak to an audience without intermediaries, in particular to an audience of young people who will form the next generation of decision makers. Its central objective is in fact to educate and provide adequate tools for a future society that will have to face complex situations such as climate change.

CHIARA BADIA

All rights reserved

If you want to know more about this issue, please contact me at kiarabadia@libero.it
