IL RISVEGLIO DEL CADUCEO DORMIENTE: la vera genesi dell'Homo sapiens

With this book Marco La Rosa won the
PREMIO NAZIONALE CRONACHE DEL MISTERO
ALTIPIANI DI ARCINAZZO 2014
* MYSTERIES OF HISTORY *

under the patronage of: • Associazione socio-culturale ITALIA MIA of Rome, • Regione Lazio, • Provincia di Roma, • Comune di Arcinazzo Romano, in collaboration with • Associazione Promedia • PerlawebTV, and with the partnership of the websites • www.luoghimisteriosi.it • www.ilpuntosulmistero.it

LA NUOVA CONOSCENZA



Monday, 29 April 2013

THE ITALIAN “GENIUS”… DESPITE EVERYTHING… IS NOT DEAD!



Inside: an exclusive interview with Dr. Giuseppe Cotellessa of ENEA.

by Marco La Rosa

The “PALAZZO DELLA CIVILTÀ ITALIANA”, also known as the Palazzo “della Civiltà del Lavoro” and as the “Square Colosseum” (after the 54 arches on each facade), is a monumental building in Rome's EUR district. Designed in 1936-37, begun in 1938 and inaugurated, still unfinished, in 1940, it was completed after 1945. Today it is a building of cultural interest and is therefore restricted to exhibition and museum use.

In the arches of the ground floor stand 28 statues representing the virtues of the Italian people: HEROISM, MUSIC, CRAFTSMANSHIP, POLITICAL GENIUS, SOCIAL ORDER, LABOUR, AGRICULTURE, PHILOSOPHY, COMMERCE, INDUSTRY, ARCHAEOLOGY, ASTRONOMY, HISTORY, INVENTIVE GENIUS, ARCHITECTURE, LAW, PRIMACY OF NAVIGATION, SCULPTURE, MATHEMATICS, GENIUS OF THE THEATRE, CHEMISTRY, PRINTING, MEDICINE, GEOGRAPHY, PHYSICS, POETRY, PAINTING, MILITARY GENIUS.

WE HAVE FORGOTTEN ALL OF THIS. WE HAVE LOST OUR IDENTITY.

The “SCHOOL”, over time, has been “cannibalised” of every possible resource and is practically no longer able to pass on these values, which were first diluted and then lost. What remain are empty words carved into some monument, itself fallen into oblivion.
But DNA is not make-believe: without being aware of it, each of us carries within the “seed” of this immense heritage. So, despite everything, it keeps sprouting, and even if we do not notice, the plant grows and grows, reaching upward, toward the sun, even without water.

I found one of these “plants”, and not by chance. Because if you seek, you find… and how.

ENEA, the National Agency for New Technologies, Energy and Sustainable Economic Development, is an all-Italian flagship, a mirror of that heritage “carved” in travertine up there. A forge of ideas and discoveries that are the envy of the whole world: energy efficiency, renewable sources, nuclear, environment and climate, safety and health, new technologies… In these “dark” times it is like a half-asleep giant. Countless patents of epochal importance, not only for Italy but for every country in the world, lie gathering dust in drawers: companies are lacking, as are the resources to “study” and “develop” the prototypes that would then drive the relaunch of the economy… The exceptions are few, but they give the pulse of an incredible situation: it would really take very little to change things.

Read this:

“A significant step forward for the relaunch of the Italian economy is being made thanks to investments in research and in the high-technology industrial system which, despite the crisis, is one of the few sectors that has increased its competitiveness and created new jobs,” declared Giovanni Lelli, ENEA Commissioner, speaking at the ceremony marking the start of construction in Japan of an experimental machine for producing energy from nuclear fusion, with the assembly of the first components arrived from Europe.
It is an international programme between Europe and Japan, to whose realisation ENEA contributed with its design and by working in synergy with the Italian companies supplying some essential components. The collaboration between the public research system and the national industry of technologically advanced energy systems has led to the qualification of technological products that can now compete and win in all world markets. ENEA made available the technological infrastructures of its Frascati and Brasimone research centres, among the most advanced in Europe, and the technical and scientific expertise of its researchers to develop and qualify the technological components of our major industrial players, reaching the levels of excellence behind Italian success in all international nuclear fusion programmes.

In particular, Italian high-technology industries have seized the opportunity offered by the construction of the ITER reactor, the most important international nuclear fusion programme, winning contracts worth a total of about 750 million euros for major components, including the superconducting magnets.
This success owes much to the role ENEA has played for more than 20 years as national coordinator of the European nuclear fusion programmes, aimed at obtaining a green energy, the same that powers the stars, without the risks associated with fission.


Here is a practical example:

An exclusive interview with Dr. Giuseppe Cotellessa of ENEA, who honours us with his presence on our blog and whom we thank for the following simplified explanation of his discovery:

A method for analysing images acquired by nuclear investigation instruments
“Giuseppe Cotellessa of ENEA's National Institute of Ionizing Radiation Metrology (INMRI) is the inventor of a physical-mathematical procedure that allows a correct analysis of the “nature” and a reliable measurement of the size of the objects observed in images acquired by nuclear investigation instruments, and that can be extended to non-nuclear images, for example radar, sonar, CT, MRI, radiographic and ultrasound images, and images from electron microscopes, optical microscopes and telescopes. The procedure can also be applied to improve the reading precision of nuclear track detectors, such as those used to measure exposure to radon and neutrons in workplaces for the radiation protection of workers, as well as to guarantee the mechanical safety of components used in nuclear plants and to contribute significantly to research on nuclear fusion, and to nuclear research in general, since it can detect and eliminate the signals coming from pseudo-tracks, i.e. signals produced by fingerprints on the detector or by imperfections in the material.”
“The nuclear investigation systems used so far are based on the automatic analysis of objects in two-dimensional grey-scale images, reconstructed from the count of nuclear radiation damage sites produced on the detector surface and captured by a camera after a light beam is reflected off, or transmitted through, the detector. Interpretation of the resulting image is entrusted exclusively to software procedures that mostly analyse images converted to binary format, with a considerable loss of information useful for interpreting the objects. The objects in the images analysed in diagnostics and elsewhere (radar, sonar, CT, MRI, radiographic, ultrasound, electron and optical microscope and telescope images) are often assessed by the operator's eye, with considerable errors in interpreting the nature of the objects and in measuring their number and size.
The patented procedure makes it possible to reconstruct three-dimensional graphs easily interpreted by the human eye, obtained by performing several readings of the same detector at different light-intensity values. This makes it possible to distinguish the nuclear tracks emitted or transmitted by the object under investigation from real pseudo-tracks, due to fingerprints on the detector or material imperfections, and from virtual ones, due to incorrectly set working parameters. Eliminating the pseudo-track signals allows the working parameters to be optimised and the accuracy and reproducibility of the reading to be improved.”
The patent, owned by ENEA, was filed on 13 December 2012 under number RM2012A000637. It has been available in the ENEA patent database since 19 December 2012 and is available for licensing.
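To make the multi-reading idea above concrete, here is a minimal sketch of how one might stack readings of the same detector taken at several illumination levels and reject spots that do not respond like genuine etched tracks. The segmentation threshold and the monotonic-response criterion are illustrative assumptions, not the patented algorithm.

```python
import numpy as np
from scipy import ndimage

def track_profiles(readings):
    """Build per-spot intensity profiles from several readings of the
    SAME detector taken at increasing illumination levels (the
    multi-reading idea described above).

    readings: list of co-registered 2D grayscale numpy arrays.
    """
    # Segment candidate spots on the brightest reading (illustrative).
    ref = readings[-1]
    mask = ref > ref.mean() + 2 * ref.std()
    labels, n = ndimage.label(mask)
    profiles = np.zeros((n, len(readings)))
    for i, img in enumerate(readings):
        # Mean intensity of each candidate spot at this illumination level.
        profiles[:, i] = ndimage.mean(img, labels, index=np.arange(1, n + 1))
    return labels, profiles

def keep_real_tracks(profiles, min_slope=0.5):
    """Toy acceptance criterion: assume a genuine etched track brightens
    steadily with illumination, while fingerprints and material flaws
    respond weakly or erratically. Returns a boolean mask over spots."""
    slopes = np.diff(profiles, axis=1).mean(axis=1)
    return slopes > min_slope
```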

Who is Giuseppe Cotellessa:
Degree in physics from La Sapienza in 1982; researcher at ENEA since 1985.
Specialised in gas metrology, he has developed prototypes for measuring the radioactive gas radon (electrostatic cells) and has contributed to solving problems in calibrating radon measurement instruments and developing the related standards.
He has taken part in national and international contracts in which ENEA acted as the guarantee laboratory for instrument calibration: Italy, East Germany (Leipzig).
He took part in building two integrated systems for measuring radon and radon progeny, called Radotron. He took part in the project to build the “Seebeck” flowmeter prototypes of the German high-technology institute in Jena, handling environmental characterisation with miniaturised flowmeters in a controlled environment. This project was selected by the European Community as an example of “historic success”.
He took part in the first Italian intercomparison of passive measurement instruments.
He has taken part in several intercomparisons of passive radon measurement instruments in England at NRPB and in Berlin.
In the laboratory he took part in the design and refurbishment of the management system of the radon chamber with controlled microclimate.
He has carried out studies on the radiation protection of workers from radon gas in the laboratory's walk-in radon chamber.
He has taken part in building numerous circuits and apparatuses for different purposes.
His current task is to build the first absolute primary radon standard for INMRI (the National Institute of Ionizing Radiation Metrology) in Italy.

MLR: Dr. Cotellessa, can you give us some practical examples to help ordinary people understand that your discovery can really bring a tangible improvement to so many aspects of life in society?

G.C.: So far, the state of the art I have reached is to have experimentally verified the validity of the physical-mathematical procedure, through applied research lasting almost five years (since August 2007), using the means available in the nuclear track laboratory of INMRI, where I have had the opportunity to carry out research since 1985. In this period, still in the field of solid-state nuclear track detectors, I have filed two other patents for original inventions:

1) RM2008A000148
“Process for the Development of Nuclear Tracks Identifiable by Their Light Intensity with Respect to Other Agglomerated Tracks, and Device for Its Implementation”, filed 17 March 2008

2) RM92A000540
“Procedure for the Automatic Separation of Tracks with an Image Analyser Using the Original Image”, filed 15 July 1992

These lines of research and the related patents share the common goal of improving the metrological aspects of processes based on image analysis.

They have opened up prospects for applications in several fields, including industry, and ENEA therefore considered it appropriate to protect the results by filing a patent.
The next steps are to find adequate funding to continue the research; to promote applications in collaboration with interested companies, preferably Italian, granting them a licence to use the patent; and then to incorporate it into multidisciplinary prototypes, with the prospect of bringing the innovation to market.
All this could lead to new devices in the medical field, improving the quality of various diagnostic technologies based on image analysis.
For example, in organ transplantation, automatically recognising the percentage of live cells in a sample made up of live and dead cells is fundamental in tests for determining tissue compatibility between donor and recipient (see the sketch after this answer).
Proceeding by hypothesis, without having yet obtained the necessary experimental verification, I am thinking of new devices in the field of industrial production.
The procedure makes it possible to improve the performance of optical and electron microscopes, which have wide application.
Physically more complex problems could also be tackled, such as managing the sensors of solar thermodynamic power plants with a much leaner, simplified procedure.
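As a toy illustration of the cell-viability example mentioned above, here is a minimal sketch of counting the fraction of live cells in a dual-stain image. The stain channels, the threshold and the one-component-per-cell rule are all hypothetical simplifications, not the patented procedure.

```python
import numpy as np
from scipy import ndimage

def live_cell_fraction(green, red, thresh=0.5):
    """Toy viability count on a dual-stain image, where 'green' marks
    live cells and 'red' marks dead ones (2D arrays scaled to [0, 1]).
    Each connected component is counted as one cell; the stains,
    threshold and one-component-per-cell rule are assumptions."""
    _, n_live = ndimage.label(green > thresh)
    _, n_dead = ndimage.label(red > thresh)
    total = n_live + n_dead
    return n_live / total if total else 0.0
```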

MLR: Dr. Cotellessa, how important will it be to raise awareness in society and in schools at all levels (in this moment of deep general crisis), so that Italian “genius” and “research” regain their “splendour” before the world, but above all before that part of our country that is by now deeply disillusioned?

G.C.: Innovation is a significant factor in driving the country's socio-economic recovery. This process must be made operational at all levels, particularly in schools.

MLR: Dr. Cotellessa, do you agree that if Italian institutions stopped the “brain drain” from our country with concrete actions, the whole economy would benefit immediately?

G.C.: Italian research, both public and private, if properly valued, would provide a positive impulse to overcome the current stagnation in our country, and thus also help “keep” in Italy the researchers trained in our universities and research centres.
In conclusion, I would like to point out that the experimental work is the fruit of teamwork, with the contribution of other researchers, but also of the operating structure of ENEA's INMRI, which enabled and valued this line of research, leading also to the patenting of the results and of the innovation.
In particular I thank Dr. Pierino De Felice, current head of INMRI; Dr. Marco Capogni, head of the INMRI section for the development of primary standards (the section I belong to); Dr. Giuliano Sciocchetti, my former supervisor, now retired, who still follows my scientific adventures almost daily; Elvio Soldano (chemist); and Massimo Pagliari (technician). These exceptional people within ENEA created the conditions indispensable for achieving these important results.

MLR: Dr. Cotellessa, we thank you for your availability and clarity, in the lively hope that the “seed” I spoke of at the beginning of this article, kept safe in the new generations, may truly find in ITALY “fertile ground” in which to sprout and grow… this time with plenty of water.

IF YOU ENJOYED THIS POST, DON'T MISS:

IS MAN'S TRUE “GENESIS” AS WE HAVE ALWAYS BEEN TOLD? OR IS IT A COMPLETELY DIFFERENT STORY?

“L'UOMO KOSMICO: TEORIA DI UN'EVOLUZIONE NON RICONOSCIUTA”
“IL RISVEGLIO DEL CADUCEO DORMIENTE: LA VERA GENESI DELL'HOMO SAPIENS”
BY MARCO LA ROSA
PUBLISHED BY OmPhi Labs

AVAILABLE DIRECTLY FROM THE OmPhi Labs WEBSITE AND IN BOOKSHOPS



3,187 comments:

Marco La Rosa said...

FROM DR. COTELLESSA

Reprogramming the world for a new era of engineering.



The fourth industrial revolution is upon us.
We will all be involved in this new paradigm, which is based on the interaction of the physical world with emerging technologies, from the Internet of Things to M2M (machine-to-machine) to Big Data. The key element of this new vision is cyber-physical systems (CPS), i.e. systems that require computation, communication, control capability and intelligence. It is a very broad definition, describing applications in which the physical and digital worlds merge thanks to new technologies. Systems and networks will control physical processes, from machines to robots.
Entire industrial sectors, from energy to transport, from manufacturing to healthcare, will undergo change, and we will need to be ready to face the new technologies by developing design methods able to deliver adequate performance for the new requirements, integrating sensors, cloud and real-time control. Graphical System Design, National Instruments' platform-based hardware and software approach, is the most suitable tool for facing these changes and building the cyber-physical systems of the future.

Marco La Rosa said...

FROM DR. COTELLESSA

Wood biomass plants: a £48 million installation in Birmingham



MWH Global, a multidisciplinary company providing consulting and engineering services in the water, energy, environment and infrastructure sectors, has been selected by Birmingham Bio-Power Ltd for the design, construction and operation of a new 10.3 MW biomass gasification plant near Birmingham. Work will begin immediately and finish in early 2016.



The £47.7 million plant is the first of its kind to be built in the United Kingdom. It will be fed by around 67,000 tonnes of wood biomass, secured through a long-term contract with a local supplier. It is expected to generate enough renewable electricity to power more than 17,000 homes, creating around a hundred jobs during the plant's construction and 19 full-time jobs for its operation.
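As a rough plausibility check of the figures quoted above, the sketch below converts the plant's 10.3 MW capacity into homes served. The capacity factor and the average UK household consumption are assumptions, not numbers from the article.

```python
# Rough plausibility check of the quoted figures. The capacity factor
# (0.85) and the average UK household consumption (~4 MWh/year) are
# assumptions, not numbers from the article.
capacity_mw = 10.3
capacity_factor = 0.85
hours_per_year = 8760

annual_mwh = capacity_mw * hours_per_year * capacity_factor  # ~76,700 MWh
homes = annual_mwh / 4.0                                     # ~19,000 homes
print(f"{annual_mwh:,.0f} MWh/year -> roughly {homes:,.0f} homes")
```

Under these assumptions the result lands in the same range as the article's "more than 17,000 homes".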



Over its 20-year life, the plant is expected to cut greenhouse gas emissions by around 2.1 million tonnes and to put to energy use 1.3 million tonnes of wood biomass that would otherwise go to landfill.



The energy is produced through a wood biomass gasification process. The syngas produced by gasifying the wood is burned in a boiler to raise steam, which then drives a steam turbine generating electricity to be fed into the national grid. The gasification technology is supplied by the Canadian company Nexterra Systems.



MWH is responsible for the design and construction of the plant under an EPC (Engineer, Procure, Construct) contract. MWH will also handle the commissioning phase and the plant's operation through a five-year Operations and Maintenance contract.



The consortium financing the plant includes the Green Investment Bank, Gravis Capital Partners, Balfour Beatty plc, Eternity Capital Management, Foresight Group's UK Waste Resources and Energy Investments (UKWREI). “This is great recognition for MWH,” says Viviana Mariani, technical director and head of MWH's Energy and Infrastructure division in Southern Europe, “and further confirmation of the strong technical expertise we bring to renewable energy, biomass and power generation around the world. In Europe in particular, we stand alongside our clients to help them meet the targets of the 20-20-20 Directive, and for years we have supported investors and developers in building renewable energy plants and implementing energy efficiency programmes. To meet their needs we offer services ranging from EPC to consulting and design.”



Marco La Rosa said...

FROM DR. COTELLESSA

Microspheres Improve Imaging

Researchers have discovered they can increase resolution when viewing a nanoscale object by placing transparent, dielectric, barium titanate glass microspheres on top of the object, while still using a conventional water or oil immersion objective. Augmenting the microscope in this way allows the use of standard equipment for the "super-resolution" of not just controlled nanostructured objects, but complex biological structures as well.

Marco La Rosa said...

FROM DR. COTELLESSA

Nuclear Gets Nimble

Chemistry World details a new trend in nuclear magnetic resonance (NMR) spectroscopy: smaller, more nimble equipment. The shift is making NMR techniques available to more and more organizations, from pharmaceutical companies to food safety agencies.

Marco La Rosa said...

FROM DR. COTELLESSA

Measuring Critical Dimensions with Ellipsometry

The SEMI E141 standard covers the typical applications of ellipsometry, which are the determination of layer thickness and optical properties of fabricated layers and the critical dimensions (CD) of submicron structures. The purpose of this standard is to provide a guide for a unique specification of the most commonly applied ellipsometer equipment, the comprised modules and components, and their spatial arrangement. The notation for parameters required in data acquisition and modeling is also specified.

Marco La Rosa said...

FROM DR. COTELLESSA

Scientists get meaningful energy from laser-based nuclear fusion



Researchers have long sought to generate significant energy from laser-based nuclear fusion, and it appears that they're finally making some headway. Lawrence Livermore National Laboratory reports that laser blasts in September and November produced more energy from hydrogen fusion reactions than they'd put into the hydrogen -- the first time that's happened. The key was an extra dose of caution: the lab team altered the laser pulse so that it didn't break a shell used in the necessary fuel-compression process, improving the energy yield. We're still far from seeing laser fusion reactors, since just 1 percent of the laser's power reached the hydrogen in the first place. However, the output was much closer to what scientists have been expecting for years -- laser fusion is now more of a realistic possibility than a pipe dream.

Marco La Rosa said...

FROM DR. COTELLESSA

Surprise! A Self-healing Metal





Engineers have been reading about the development of self-healing polymers for years. Now they can be surprised, right alongside the MIT scientists who discovered, quite unexpectedly, a metal that heals itself. The discovery of nickel alloys that, when put under stress, mend cracks could lead to materials that repair incipient damage before it can spread.

Marco La Rosa said...

FROM DR. COTELLESSA

Special report: Oman goes solar



The need for more water and power is intrinsic to all the arid countries of the GCC, which are developing their economies and undergoing rapid industrialization with growing populations. The story of the Sultanate of Oman is no different: having become the latest entrant on the solar bandwagon, the country is gearing up to harness the region's most abundant resource.

Late last year, the Omani Rural Areas Electricity Company (RAECO) signed the first agreement to generate electricity from renewable energy resources, with US-based company Astonfield. The company, in cooperation with a local firm Multitech, will establish a pilot solar power plant in the state of Al Mazyunah in the Dhofar Governorate, which will begin commercial operation in mid-2014.

Marco La Rosa said...

FROM DR. COTELLESSA

“The project is being realised thanks to the leadership of the Government of Oman, and especially the Authority for Electricity Regulation (AER) and Rural Areas Electricity Company SAOC (RAECO). Astonfield has co-developed the project in close partnership with Bahwan Engineering Group through their investment arm Multitech LLC,” said Ameet Shah, co-founder and co-chairman of Astonfield.

The country of Oman is embracing solar power in a big way with a pilot solar plant in Al Mazyunah. Using both crystalline silicon and thin film technologies, the facility — set to begin operation in mid-2014 — will generate 558 MWh per year. Utilities-me.com says Oman is "uniquely positioned" at 23 degrees from the equator, making it ideal for testing solar generation technologies.

Marco La Rosa said...

FROM DR. COTELLESSA

Researchers harness sun's energy during day for use at night



Solar energy has long been used as a clean alternative to fossil fuels such as coal and oil, but it could only be harnessed during the day when the sun's rays were strongest. Now researchers led by Tom Meyer at the Energy Frontier Research Center at the University of North Carolina at Chapel Hill have built a system that converts the sun's energy not into electricity but hydrogen fuel and stores it for later use, allowing us to power our devices long after the sun goes down.

"So called 'solar fuels' like hydrogen offer a solution to how to store energy for nighttime use by taking a cue from natural photosynthesis," said Meyer, Arey Distinguished Professor of Chemistry at UNC's College of Arts and Sciences. "Our new findings may provide a last major piece of a puzzle for a new way to store the sun's energy -- it could be a tipping point for a solar energy future."

In one hour, the sun puts out enough energy to power every vehicle, factory and device on the planet for an entire year. Solar panels can harness that energy to generate electricity during the day. But the problem with the sun is that it goes down at night -- and with it the ability to power our homes and cars. If solar energy is going to have a shot at being a clean source for powering the planet, scientists had to figure out how to store it for night-time use.

The new system designed by Meyer and colleagues at UNC and with Greg Parsons' group at North Carolina State University does exactly that. It is known as a dye-sensitized photoelectrosynthesis cell, or DSPEC, and it generates hydrogen fuel by using the sun's energy to split water into its component parts. After the split, hydrogen is sequestered and stored, while the byproduct, oxygen, is released into the air.

"But splitting water is extremely difficult to do," said Meyer. "You need to take four electrons away from two water molecules, transfer them somewhere else, and make hydrogen, and, once you have done that, keep the hydrogen and oxygen separated. How to design molecules capable of doing that is a really big challenge that we've begun to overcome."

Meyer had been investigating DSPECs for years at the Energy Frontier Research Center at UNC and before. His design has two basic components: a molecule and a nanoparticle. The molecule, called a chromophore-catalyst assembly, absorbs sunlight and then kick starts the catalyst to rip electrons away from water. The nanoparticle, to which thousands of chromophore-catalyst assemblies are tethered, is part of a film of nanoparticles that shuttles the electrons away to make the hydrogen fuel.

PART TWO FOLLOWS

Marco La Rosa said...

FROM DR. COTELLESSA

PART TWO

However, even with the best of attempts, the system always crashed because either the chromophore-catalyst assembly kept breaking away from the nanoparticles or because the electrons couldn't be shuttled away quickly enough to make hydrogen.

To solve both of these problems, Meyer turned to the Parsons group to use a technique that coated the nanoparticle, atom by atom, with a thin layer of a material called titanium dioxide. By using ultra-thin layers, the researchers found that the nanoparticle could carry away electrons far more rapidly than before, with the freed electrons available to make hydrogen. They also figured out how to build a protective coating that keeps the chromophore-catalyst assembly tethered firmly to the nanoparticle, ensuring that the assembly stayed on the surface.

With electrons flowing freely through the nanoparticle and the tether stabilized, Meyer's new system can turn the sun's energy into fuel while needing almost no external power to operate and releasing no greenhouse gases. What's more, the infrastructure to install these sunlight-to-fuel converters is in sight based on existing technology. A next target is to use the same approach to reduce carbon dioxide, a greenhouse gas, to a carbon-based fuel such as formate or methanol.

"When you talk about powering a planet with energy stored in batteries, it's just not practical," said Meyer. "It turns out that the most energy dense way to store energy is in the chemical bonds of molecules. And that's what we did -- we found an answer through chemistry."



Marco La Rosa said...

FROM DR. COTELLESSA

This Graphene Nanoribbon Conducts Electricity Insanely Fast



You're looking at a ribbon of graphene that's just one atom thick and fifteen atoms wide—and it could help shift data thousands of times faster than anything else currently can.

The nanoribbons, produced by Felix Fischer, a chemist at Berkeley, are narrow—10,000 of them placed side by side would be about as wide as a human hair—and straight. That means that electrons can travel along them with no atoms to block their way, transporting current thousands of times faster than any other existing conductor—at least, over short distances. In turn, that means transistors can be switched on and off much faster, wildly increasing the speed of circuits.

Interestingly, the ribbons aren't sculpted out of graphene, but rather grown chemically, by heating rings of carbon and hydrogen so they slowly link, forming long daisy chains. Then, they're heated again to shift the hydrogen, leaving long ribbons of carbon-carbon bonds like those in the image above, which was captured using a scanning tunneling microscope.

Marco La Rosa said...

FROM DR. COTELLESSA

This is NASA's new giant crawler for its next-generation spaceship



Things keep moving at the Kennedy Space Center in preparation for the first mission of NASA's Space Launch System and its Orion spacecraft in 2017. The crawler-transporter just passed "the first phase of an important milestone test."

The Ground Systems Development and Operations Program completed testing of new traction roller bearings on crawler-transporter 2 (CT-2), on two of the massive vehicle's truck sections, A and C, in late January. The new roller bearing assemblies that were installed on one side of the crawler are visible in this Jan. 31, 2014 image. CT-2 returned to the Vehicle Assembly Building (VAB) at Kennedy Space Center, where work continues to install new roller bearing assemblies on the B and D truck sections.

These crawlers have been in use for 45 years, starting with the Saturn rockets for the Apollo missions to orbit and the Moon. Then they were retrofitted for the space shuttle. Now, Crawler-Transporter 2 has been upgraded "to increase the lifted-load capacity from 12 million to 18 million pounds to support the weight of the mobile launcher and future launch vehicles, including the SLS and Orion."

Marco La Rosa said...

FROM DR. COTELLESSA

More than 40 years have passed since 1971, when the first computed tomography scanner for studying brain pathologies was installed at the Atkinson Morley Hospital in London. At the time it was thought that the technique would remain confined to the study of the skull. Instead, its application rapidly spread to various branches of medical diagnostics and today, in continuous expansion, it is being transferred to industrial applications.

Today, complete and accurate digitisation of a part's entire surface, including internal, inaccessible areas, is possible in industry too thanks to computed tomography, which offers numerous advantages over more traditional dimensional measurement systems:
- three-dimensional, navigable measurement of the entire object,
- realistic, detailed visualisation of the internal and external structure,
- non-destructive testing,
- measurement of assembled parts and of contact surfaces between different components,
- analysis of composite and multi-material products,
- scanning times independent of the complexity of the geometry.

Applications are many and continuously developing, and metrological performance is coming ever closer to that of traditional coordinate metrology. At the same time, the ability to inspect parts of considerable size and thickness is made easier by the power of the new CT machines and by the application of refined compensation techniques.


Marco La Rosa said...

FROM DR. COTELLESSA

Vietnam: laser scanning “meets” the ancient emperors



CAM2 (FARO Technologies Group), a leading supplier of 3D measurement, imaging and realization technology, has announced that SI2G S.r.l. used a CAM2 Focus3D Laser Scanner, with excellent results, in the international project for the recovery and enhancement of the historic imperial citadel of Huế in Vietnam.

SI2G srl (Sistemi Informativi Intelligenti per la Geografia) studies land and the environment through informatics and photogrammetry. The company acquires, analyses, processes, archives and distributes “environmental data” in digital format with an integrated, multidisciplinary systems approach, providing services in remote sensing, photogrammetry, surveying, cartography and ICT.

Eva Savina Malinverni, associate professor of Surveying at the Università Politecnica, explains why SI2G recently invested in a Focus3D Laser Scanner, CAM2's innovative laser scanning instrument that enables high-precision 3D surveys with great simplicity: “After many years of cartographic experience working with various geodetic instruments, we decided to broaden our knowledge and our application possibilities in surveying by bringing a laser scanner into our organisation.”

The choice fell on the CAM2 Focus3D, an instrument that combines high survey precision with great ease of use. “We knew this instrument from previous collaborations and experience, and we believed it could make the difference in surveying architectural elements, particularly in international research projects. Today we can say we made the right choice: the Focus3D Laser Scanner can be operated really simply, like an ordinary digital camera. We can take it with us anywhere, even to areas that are hard to reach or to developing countries where it would be difficult to justify, even just bureaucratically, the use of conspicuous equipment.”

The CAM2 Focus3D Laser Scanner is indeed a compact instrument, very light (just 5 kg) and with a very small footprint of 24 x 20 x 10 cm, which the operator can carry along anywhere. Moreover, WLAN technology allows scans to be started, stopped, viewed or downloaded remotely.

Eva Savina Malinverni adds: “We purchased the device last February and within a few months we gained familiarity and the right experience. The CAM2 Focus3D was of fundamental help above all in two very interesting projects that SI2G carried out in Vietnam in collaboration with the Università Politecnica delle Marche.” These activities were undertaken on the initiative of the Italian Ministry of Foreign Affairs (university scientific lead: Prof. F. Pugnaloni) for the safeguarding of historic and architectural sites around the world. “In particular, we worked on the fortified prisons of Quảng Trị and on the imperial citadel of Huế.”

PART TWO FOLLOWS

Marco La Rosa said...

FROM DR. COTELLESSA

The imperial city of Huế, declared a UNESCO World Heritage Site in 1993, is probably the largest and most famous architectural site in all of Vietnam: from here the emperors of the Nguyen dynasty ruled between 1802 and 1945. It was built on the model of the Imperial Palace in Beijing, and its walls, moats, fortified gates, bridges and decorations make it an evocative place of great artistic and historical interest.

“The whole area,” the professor continues, “was completely destroyed during the Vietnam War and is now being restored thanks to funding from various international sponsors. We worked in particular on the so-called East Gate, a very complex architectural element featuring majolica decorations and friezes of every kind. The survey would have been a very long and complicated operation had we used standard photogrammetric techniques.”

The CAM2 Focus3D Laser Scanner, by contrast, allowed the SI2G team to complete the job in a few hours and to obtain truly surprising results with just 17 scans: “We surveyed the splendid East Gate of the imperial city of Huế in very little time, acquiring a 3D grid of millions of points just a few millimetres apart, from which we derived a 3D mesh with photorealistic texture in which every detail of shape and colour of the original structures can be read.”

Prof. Fangi and his collaborators, engineers Tassetti and Bozzi of the Università Politecnica delle Marche, also contributed to the data acquisition and processing.

Malinverni concludes: “It should not be forgotten that we worked in critical conditions, in an area with heavy tourist traffic. The instrument was not affected in the least by the temperature of almost 40 °C, nor by the very high humidity of about 85%. The CAM2 Focus3D Laser Scanner proved to be a very handy instrument, usable even in “awkward”, hard-to-reach places, simple to use and very versatile. Our foreign colleagues in Vietnam were impressed.”




Marco La Rosa said...

FROM DR. COTELLESSA
Inspection of circuit breakers with ultra-compact cameras

The Merlin Gerin factory in Alès uses high-speed inspection systems to verify the welds of the contact pads of circuit breakers, in a very confined space. The inspections are performed on the electrical contact part of the C60 circuit breakers produced by Schneider Electric, a modular solution for protecting circuits in industrial and service buildings, used in more than 100 countries.

“We check for the presence of the contact pads and also inspect for possible defects,” explains Alain Antonin of the Production Engineering Department at Merlin Gerin Alès. “The pads provide an electrical contact surface, and the inspection ensures that they allow the correct flow of current. The detection system must be precise, because we are looking for very small defects. Previously, it was not possible to install a dedicated lighting system due to the lack of space, so we knew we would have problems with light reflections. We decided to use a 2-megapixel colour detection system. We set a pixel threshold for pad detection so as to guarantee the perfect quality of our process. A 50 mm lens is placed 100 mm from the target. This allows us to detect defects measuring about one tenth of a millimetre.”

“We consulted numerous vision system suppliers, choosing Keyence for three main reasons: first, with the CV-5000 we can keep our process under control. The system is simple enough to let us reprogram it completely to manage a detection task without having to consult an integrator or the supplier. Second, Keyence offered us a more cost-effective solution and, not least, the miniature camera supplied was the only one that fitted into the available space,” adds Alain Antonin.



Marco La Rosa said...

FROM DR. COTELLESSA

Coaxial vision system for joint tracking and monitoring of laser welding processes



COAXIAL VISION SYSTEMS FOR JOINT TRACKING AND PROCESS MONITORING IN INDUSTRIAL LASER WELDING
We present an industrial measurement system developed through a collaboration between the Laboratory of Optoelectronics of the University of Brescia and Tube Tech Machinery srl, a Brescia-based company specialising in machines for the laser welding and cutting of tubes and sheet metal. We describe the prototype of an integrated system, based on artificial vision techniques, that provides feedback on weld quality and makes it possible to handle certain phases of the laser welding process automatically.

Closed-loop control of the welding process is a goal many are pursuing, precisely because it is a fundamental step that opens new and revolutionary scenarios in metalworking. The advantages are many, but among the most important needs is the ability to guarantee welds that are always optimal and compliant with current standards, carried out in complete safety and in reduced time; all of this, naturally, in a fully automatic way.

This goal translates into building control systems able to:
• detect the presence of a weld joint and follow its trajectory, where “joint” means the junction zone between the two surfaces to be welded [1];
• control the welding parameters based on real-time feedback from the process. These parameters are specific to the type of welding; for laser welding they concern the power of the emitted beam and its focusing.

PART TWO FOLLOWS

Marco La Rosa said...

FROM DR. COTELLESSA

PART TWO

The techniques normally considered for this purpose rely on different operating principles. Traditionally, for joint tracking, the literature proposes contact systems, or systems based on acoustic-wave sensors or magnetic-field sensors. For controlling the welding parameters, techniques are proposed based on analysing the emissions of the welding plasma in the ultraviolet [5] or infrared [6], on analysing acoustic emissions [7], or on analysing the reflected laser radiation [8]. All these systems are simple and robust, but also not very flexible, since they can monitor only specific aspects of the process.
To overcome this, vision systems, both 2D and 3D [9, 10, 11], are increasingly being used to obtain closed-loop systems based on processing images acquired directly during welding operations. Vision systems enable highly flexible applications, since they can easily be reconfigured by leaving the hardware unchanged and modifying only the image-processing software algorithms.

We present here two vision systems with different functions that share the same hardware. The first is a system for tracking butt weld joints using a vision setup coaxial with the welding head. The goal is to measure the characteristic parameters of the weld joint, namely its position and its width (gap), and to communicate them to a robot manipulator, which can thus follow the welding trajectory optimally. The second is an industrial vision system for monitoring the process.

The setup consists of a Basler GigE ACE640-100GM camera, a supervision PC, an NI CompactRIO model 9075 acquisition device, an ABB IRC5 robot controller and an ABB anthropomorphic manipulator, model IRB 4600-60/2.05.
To allow optimal framing of the welding area, the camera was installed on board a Precitec YW-52 optical head mounted on the robot. The geometry of this device provides an optical path for the welding laser beam while allowing the camera to acquire images along a direction coaxial with the beam.
To make the acquired images immune to noise and to disturbance from the radiation emitted during welding, a Cavilux HF illuminator was used. In this application the welding laser is an Nd:YAG laser at a wavelength of 1064 nm, while the illuminator operates at 690 nm; by applying a 960 nm filter to the camera optics, any influence from the welding laser light can thus be eliminated.
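To illustrate the joint-measurement step, here is a minimal sketch of extracting a butt joint's position and gap from one row of a coaxial camera image, under the assumption that the joint appears as a run of dark pixels. The threshold and the pixels-to-millimetres calibration are assumptions, not the system described in the article.

```python
import numpy as np

MM_PER_PIXEL = 0.05  # assumed camera calibration, not from the article

def joint_position_and_gap(row, dark_thresh=60):
    """Toy butt-joint measurement on one image row (1D grayscale array):
    the joint is assumed to show up as a run of dark pixels. Returns the
    joint centre and gap width in millimetres, or (None, 0.0) if no
    joint is visible. Threshold and calibration are assumptions."""
    dark = np.flatnonzero(row < dark_thresh)
    if dark.size == 0:
        return None, 0.0
    centre = dark.mean() * MM_PER_PIXEL
    gap = (dark[-1] - dark[0] + 1) * MM_PER_PIXEL
    return centre, gap
```

In a real setup the per-row estimates would be averaged over a region of interest and streamed to the robot controller as the trajectory feedback the article describes.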


Marco La Rosa said...

FROM DR. COTELLESSA

Solar energy pushed to the max

For anyone still sceptical about the real large-scale usability of renewable energy sources, it is time to think again. The Ivanpah mega solar plant in California has officially begun producing electricity.

Built over 3,500 acres, almost 2,000 football pitches to give an idea, it will supply electricity to more than 140,000 homes. 2,100 workers contributed to its construction, helping to create many jobs in a period of crisis that has not spared even the United States; not by chance, Google, always attentive to promising and alternative investments, is among the main partners: the overall cost of this technological marvel comes to about 2 billion euros.

The 350,000 mirrors making up the plant are continuously oriented by computer to capture the greatest possible amount of sunlight; this enormous energy is concentrated on a central tower where water, brought to a very high temperature, drives a turbine able to produce almost 400 megawatts of electric power. Archimedes would be very proud of this evolution of his burning mirrors!
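A few figures derived from the numbers quoted above, as a sketch; the nameplate value of 392 MW is an assumption standing in for "almost 400 megawatts".

```python
# Derived figures from the numbers quoted above (illustrative only;
# 392 MW stands in for "almost 400 megawatts").
acres = 3500
km2 = acres * 0.00404686      # ~14.2 km^2
mw = 392
mirrors = 350_000
homes = 140_000

print(f"Power density: {mw / km2:.0f} MW/km^2")            # ~28 MW/km^2
print(f"Mirrors per home served: {mirrors / homes:.1f}")   # ~2.5
print(f"Capacity per home: {mw * 1000 / homes:.1f} kW")    # ~2.8 kW
```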

Although the cost per kilowatt of this plant is still decidedly higher than that of coal or nuclear plants, its immensely lower environmental impact should not be underestimated, not to mention the absence of risks from accidents which, however rare, can be so catastrophic as to remain in the collective memory for decades. Remember also that, as an example, the average decommissioning time of a nuclear plant is AT LEAST 100 years. Clearly the time is not yet ripe to abandon fossil fuels, but seeing large companies invest in alternative energy bodes well for the future: only through substantial investment in research can we hope to free ourselves from dependence on polluting energy sources that are in any case destined to run out.

You may wonder how many decades of work such a titanic undertaking required: three, yes three, years. Indeed, in other countries things really work. I can already picture the same thing done here: five years of design, as many of tendering, some twenty trials for corruption and assorted fraud, four different inaugurations, all strictly BEFORE the work is finished, an initial estimated cost that at least quadruples by the end, and so on… Is anyone still surprised that no serious company wants to invest in Italy?

Let us at least rejoice at the continuous development of alternative energy elsewhere, and hope that we too may soon enjoy its fruits.






Marco La Rosa said...

FROM DR. COTELLESSA

Successful nuclear fusion test in the USA



The energy of the stars, that is, nuclear fusion reproducing the reactions that occur in stars, could become the backbone of our planet's sustainable development, being a safe, environmentally compatible energy source.

According to a paper published in the journal Nature, at the National Ignition Facility (NIF) in the USA, at the Lawrence Livermore National Laboratory, researchers have for the first time managed to produce more energy than was needed to trigger the reaction, an important step towards demonstrating the scientific feasibility of inertial confinement fusion.

It should be stressed, however, that these results, although important for the development of inertial fusion, are below those already obtained with magnetic confinement fusion. In particular, if all the energy in play is considered, the ratio between energy obtained and energy spent in the inertial fusion experiment is of the order of 1%. By comparison, with magnetic confinement, the JET experiment achieved a significantly higher ratio. Magnetic confinement remains, for Italy and Europe, the main road to fusion energy, a field in which Italy, thanks to ENEA's coordination, is a recognised leader scientifically, technologically and industrially, as shown by the contracts won for the construction of ITER, amounting to 53% of the value assigned so far.

At ENEA's Frascati Research Centre an experimental facility called ABC is in operation: two lasers of 100 J each, concentrated on a target, allow studies of beam focusing and of the development of theoretical models for inertial confinement. ENEA also participates in the European HiPER project, which aims to carry out an experiment producing far more energy than the American one, using smaller lasers and at much lower cost.

Even if the goal of fusion energy cannot yet be called close, this experimental result shows how research in this field is steadily progressing over time.


Marco La Rosa said...

FROM DR. COTELLESSA

The effects of wind farms on the climate: a study with ENEA participation finds minimal impact at the European level



The rapid growth of wind energy has raised concerns about its environmental impact. Several recent studies had shown that the presence of large wind farms could modify atmospheric circulation, along with temperature and precipitation. Moreover, a significant temperature increase has been observed near wind farms, particularly at night, when the turbulence produced by the farms prevents the formation of layers of cold air near the ground.

In reality such effects are very limited, finds a study published today in Nature Communications and conducted by researchers from the CEA (the French alternative energies and atomic energy commission), the CNRS (the French national centre for scientific research, the largest public organisation of its kind in France) and the University of Versailles, in collaboration with ENEA and INERIS (the French national institute for industrial environmental impact and risks).

This conclusion was reached using regional climate models for Europe that include the effects of the wind farms currently in service and of those planned over the next 20 years.

This was the first study of its kind at the European level to quantify, in a realistic scenario, the climate effects of wind energy, whose production on our continent is expected to double by 2020. The study compares climate simulations run with and without wind farms on the ground and shows very small mean temperature differences, around 0.3 °C, with significant differences only in winter. The study shows that these differences are due partly to the overlap of local effects in the region most affected by wind farms and to a slight rotation of the westerly wind.


Marco La Rosa said...

FROM DR. COTELLESSA

Using LabVIEW in a Critical Laser Application for the National Ignition Facility at Lawrence Livermore National Laboratory



"A team of three people prototyped, developed, and deployed the final version of the application in about 15 months, which is roughly one-third of the estimated time required to develop the application using Java or C++. "



The challenge:
Creating an automated maintenance process in the Optic Mitigation Facility (OMF) of the National Ignition Facility (NIF) that uses lasers, vision systems, motorized positioners, and diagnostic instruments to repair damaged sites on the surface of precision optical lenses, thereby extending the life span of the optic and ensuring cost-efficient operation of the multibillion dollar machine. The high-energy lasers of the NIF can initiate tiny damage sites on the surface of precision optical lenses, which could eventually render the optic unserviceable if left unattended. The OMF is critical to the success of NIF because the laser cannot operate without readily available lenses.

The solution:
Using readily available instrument drivers and control interfaces in NI LabVIEW software to control the inspection and laser-based repair processes for the lenses. Due to the critical nature of the OMF application, software engineering practices and extensive testing were applied to ensure reliable and safe operation of the software and the equipment.


Marco La Rosa said...

FROM DR. COTELLESSA

Using CompactRIO and LabVIEW to Monitor and Control a Compact Spherical Tokamak for Plasma Research



"Overall, NI products simplified the whole setup. We quickly brought together hardware and software from many manufacturers into an incredibly compact, powerful, and cost-effective tokamak."

The challenge:
Creating a small, cost-effective tokamak for the approximately 300 plasma research centers in the world to use for exploring magnetic confinement fusion.

The solution:
Using NI CompactRIO hardware and NI LabVIEW software to develop a powerful DAQ and control system for a small tokamak, providing plasma physics research centers easier access to the technology they need to accelerate their research, with less expense.


Marco La Rosa said...

FROM DR. COTELLESSA

Controlling the World’s Most Powerful Laser with NI LabVIEW and PXI Instruments



"Because of its power, flexibility, and ease of use, we use LabVIEW with PXI controllers to manage several hundred control points and different instruments within the laser system."



The challenge:
Building and accurately controlling a laser that delivers pulses of extremely high power in the petawatt range to conduct high-energy density physics research such as the study of particle fusion and the behavior of matter in extreme conditions.



The solution:
Using NI LabVIEW software and PXI instruments to precisely control the charging, firing, amplification, and targeting of the world’s most powerful operating laser.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Laser Brings Broadband to Space



High-speed communications to the moon, Mars, and beyond could be as easy as turning on a laser, according to the results of NASA's 30-day Lunar Laser Communication Demonstration (LLCD). Carried to lunar orbit aboard the Lunar Atmosphere and Dust Environment Explorer (LADEE), the LLCD demonstrated upload speeds of 22 Mb/s and download speeds of 622 Mb/s. Equally important, the system achieved error-free operation even under adverse conditions.
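
Those rates are easy to put in human terms; a minimal sketch using only the 622 Mb/s and 22 Mb/s figures quoted above:

```python
# Time to move 1 GB over the LLCD links quoted above.
GB_BITS = 8e9  # 1 gigabyte in bits

for name, mbps in (("download", 622), ("upload", 22)):
    seconds = GB_BITS / (mbps * 1e6)
    print(f"{name}: {seconds:.0f} s")  # ≈ 13 s down, ≈ 364 s up
```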

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Job Site Surveillance by Drone

Surveying construction project progress has not only entered the digital age, it has also taken flight in the high-tech world of drone technology. New drones are said to be engineered with custom-designed cameras for mapping, plus easy-to-use joystick controls, not unlike a video game. Plus, they've got to be a real blast to operate. Hop on board this video demo.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Wind power: turbines now flash only when necessary



Thanks to a new sensor system devised by the Fraunhofer Institute, it will be possible to eliminate one of the main causes of complaint among wind turbines' "next-door neighbours".

With the exponential growth of wind power in Germany, complaints from "neighbours" about turbine noise have increased, and not only that: the turbines carry flashing lights, switched on at night and in fog to warn the pilots of low-flying aircraft. This red light has reportedly drawn numerous complaints from residents living near wind farms.

For this reason, the Fraunhofer Institute for High Frequency Physics and Radar Techniques has devised a sensor system called Parasol that switches the lights on only when aircraft are near the installations, reducing the disturbance to a minimum while still guaranteeing aircraft safety.

The lights rely on a passive radar system that emits no beam of its own. The sensors exploit local radio frequencies to spot aircraft: radio transmitters send signals that are reflected by objects in the air, and a mathematical algorithm then determines the aircraft's distance, position and speed. The sensors are mounted on the turbine towers, while a centralized CPU inside the plant processes all the collected data.

The warning lights therefore switch on only if an aircraft is flying within four kilometres of the turbines and below an altitude of 700 metres. Parasol has a further advantage: the system has no transmitter module, for which a licence fee would be payable, so it is also more economical.
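
The activation rule itself is simple enough to sketch in code. Below is a minimal Python illustration of the decision described above; the 4 km and 700 m thresholds come from the article, while the function name and data layout are hypothetical.

```python
# Minimal sketch of Parasol's activation rule as described above.
# The 4 km range and 700 m altitude thresholds come from the article;
# the function and the data layout are hypothetical illustrations.
import math

RANGE_KM = 4.0   # horizontal distance threshold
ALT_M = 700.0    # altitude threshold

def lights_on(aircraft, turbine):
    """aircraft/turbine: dicts with x_km, y_km (ground plane) and alt_m."""
    dist_km = math.hypot(aircraft["x_km"] - turbine["x_km"],
                         aircraft["y_km"] - turbine["y_km"])
    return dist_km <= RANGE_KM and aircraft["alt_m"] <= ALT_M

# Example: a light aircraft 3 km away at 500 m altitude triggers the lights.
print(lights_on({"x_km": 3.0, "y_km": 0.0, "alt_m": 500.0},
                {"x_km": 0.0, "y_km": 0.0, "alt_m": 0.0}))  # True
```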

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Chemical Imaging Unmasks Cancer Tissue



For years, scientists have proposed using mass spectrometry imaging (MSI) to identify tissue types, but no effective method has been devised. Now, researchers at Imperial College London have developed a technique for processing MSI data and building a database of tissue types. Learn how a single test taking a few hours can provide much more detailed information than standard histological tests, revealing if a tissue is cancerous and the specific type.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Plastic Bottles Tackle Skin Disease



Here's another interesting application for recycled PET bottles: a potentially powerful agent to prevent and treat fungus-induced diseases like skin infections. Researchers from IBM and Singapore's Institute of Bioengineering and Nanotechnology (IBN) have successfully converted PET into a non-toxic, biocompatible material. Made up of small molecule compounds that self-assemble in water into nanofibers, the material targets and kills fungal cells via electrostatic interaction.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Probing the Secrets of a Childhood Virus



A new imaging technology could give scientists a better understanding of respiratory syncytial virus (RSV), which can lead to pneumonia and bronchitis in children. Researchers from Vanderbilt and Emory universities used a probe technology with multiple fluorophores that indicate the presence of viral RNA and trace the spread of infection. The technique, which featured an optical microscope and dark-field illumination, could lead to new antiviral drugs and even a vaccine to prevent RSV infections.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

New Tool Charts RNA Changes

In his research on genetic diseases, University of North Carolina chemist Qi Zhang has come up with an important new technique to visualize the shape and motion of RNA at the atomic level. Using nuclear magnetic resonance (NMR) spectroscopy, Zhang can follow structural changes in RNA in real time, on timescales from hundredths to tenths of a second. That's a key factor in designing drugs that will bind to the nucleic acid at exactly the right moment.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Artificial leaf jumps developmental hurdle



In a recent early online edition of Nature Chemistry, ASU scientists, along with colleagues at Argonne National Laboratory, have reported advances toward perfecting a functional artificial leaf.

Designing an artificial leaf that uses solar energy to convert water cheaply and efficiently into hydrogen and oxygen is one of the goals of BISfuel – the Energy Frontier Research Center, funded by the Department of Energy, in the Department of Chemistry and Biochemistry at Arizona State University.

Hydrogen is an important fuel in itself and serves as an indispensable reagent for the production of light hydrocarbon fuels from heavy petroleum feedstocks. Society requires a renewable source of fuel that is widely distributed, abundant, inexpensive and environmentally clean.

Society needs cheap hydrogen.

“Initially, our artificial leaf did not work very well, and our diagnostic studies on why indicated that a step where a fast chemical reaction had to interact with a slow chemical reaction was not efficient,” said ASU chemistry professor Thomas Moore. “The fast one is the step where light energy is converted to chemical energy, and the slow one is the step where the chemical energy is used to convert water into its elements viz. hydrogen and oxygen.”

The researchers took a closer look at how nature had overcome a related problem in the part of the photosynthetic process where water is oxidized to yield oxygen.

“We looked in detail and found that nature had used an intermediate step,” said Moore. “This intermediate step involved a relay for electrons in which one half of the relay interacted with the fast step in an optimal way to satisfy it, and the other half of the relay then had time to do the slow step of water oxidation in an efficient way.”

They then designed an artificial relay based on the natural one and were rewarded with a major improvement.

Seeking to understand what they had achieved, the team then looked in detail at the atomic level to figure out how this might work. They used X-ray crystallography and optical and magnetic resonance spectroscopy techniques to determine the local electromagnetic environment of the electrons and protons participating in the relay, and with the help of theory (proton coupled electron transfer mechanism), identified a unique structural feature of the relay. This was an unusually short bond between a hydrogen atom and a nitrogen atom that facilitates the correct working of the relay.

They also found subtle magnetic features of the electronic structure of the artificial relay that mirrored those found in the natural system.

Not only has the artificial system been improved, but the team understands better how the natural system works. This will be important as scientists develop the artificial leaf approach to sustainably harnessing the solar energy needed to provide the food, fuel and fiber that humanity increasingly demands.

ASU chemistry professors involved in this specific project include Thomas Moore, Devens Gust, Ana Moore and Vladimiro Mujica. The department is a unit of the College of Liberal Arts and Sciences. Key collaborators in this work are Oleg Poluektov and Tijana Rajh from Argonne National Laboratory.

This work would not have been possible without the participation of many scientists driven by a common goal and coordinated by a program such as the Energy Frontier Research Center to bring the right combination of high-level skills to the research table.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Digital Radiology Obsoletes X-ray Films



The venerable light box, used for decades for examining X-ray films, is on its way to becoming obsolete as digital radiology becomes prevalent. Thanks to advances in high resolution, high-speed analog-to-digital converters (ADCs), digital radiology is making strong inroads into the X-ray market, according to a report by IHS. Key benefits include elimination of films and chemicals, nearly instantaneous availability for review, lower X-ray exposure, enhanced images, easy storage and distribution, and reduction in space used.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

R&D incentives: a missed opportunity to foster growth and development



A recent study conducted jointly by the European Patent Office and the Office for Harmonization in the Internal Market, entitled "Intellectual Property Rights intensive industries: contribution to economic performance and employment in Europe" (summarised in the briefing "Intellectual property rights: about 35% of jobs in the EU are linked to IP-intensive industries"), calculated that roughly 39% of the EU's total economic activity (around 4,700 billion euros per year) revolves around industries that rely heavily on intellectual property rights. These industries directly generate about 26% of all jobs in the EU, with a further 9% arising in related industries. In total, then, more than 75 million jobs in the EU are tied, directly or indirectly, to companies that invest heavily in research and in protecting their intellectual property rights.
Even a cursory look at these figures highlights the central role that investment in research and development plays, both in the overall macroeconomic picture and in employment growth.
In the recent past, many countries, in Europe and beyond, have adopted legislation designed to encourage R&D investment and to attract to their jurisdictions foreign investors active in research-intensive industries.
Canada allows an immediate deduction of all capital costs and a tax credit of 15% of R&D expenditure, usable against any tax liability and carried forward 20 years or back 3 years; in many Canadian jurisdictions, provincial R&D credits (between 4.5% and 37.5%) also apply. France offers deductibility of expenses in the year they are incurred and a tax credit (CIR) of 30% on the first 100 million euros spent, plus a further 5% above 100 million; the credit is refundable if not used within three years. The United Kingdom provides a super-deduction of R&D costs of 130% for large companies and 225% for SMEs, or alternatively a 10% tax credit for large companies, rising to 24.5% for loss-making SMEs and carried forward without time limit (in addition to favourable taxation of royalties). The United States, which had a tax-credit programme through 31 December 2013, has not yet renewed it, but there is strong pressure to reverse course and reintroduce the benefits retroactively to 1 January 2014. And the list of countries, including emerging ones, that offer substantial tax advantages to those who invest in research is very long: from China to Brazil beyond Europe's borders, to Spain, the Netherlands, Portugal and Croatia among euro-area countries very close to Italy.
In Italy, R&D investment has been subsidised since 2007 through tax credits proportional to the investment itself, calculated differently as successive incentive packages followed one another. For R&D investments in the 2014-2016 period, Decree-Law no. 145 of 23 December 2013 currently grants a tax credit of 50% of the annual increase in spending over the previous year (with a cap of 5 million euros of eligible increase per year), usable exclusively as an offset against any tax or contribution payable via the F24 form.
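
As a worked example of that rule, here is a minimal sketch; the 50% rate and the 5 million euro cap on the eligible increase come from the text above, while the spending figures are hypothetical.

```python
# Worked example of the 50% incremental R&D tax credit (DL 145/2013).
# The 50% rate and the 5 M EUR cap on the eligible annual increase come
# from the text above; the spending figures are hypothetical.
def rd_credit(prev_year_spend, this_year_spend, cap=5_000_000, rate=0.50):
    increase = max(0.0, this_year_spend - prev_year_spend)
    return rate * min(increase, cap)

print(rd_credit(2_000_000, 9_000_000))  # increase 7 M, capped at 5 M -> 2,500,000.0
```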

PART TWO FOLLOWS

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

PART TWO

Starting in 2012, further incentives were introduced to encourage the creation of technologically advanced companies, the so-called "innovative start-ups". These can claim a tax credit of 35%, up to a cap of 200 thousand euros per year, on the company cost of hiring highly qualified staff on permanent or apprenticeship contracts, and this credit is granted to start-ups with priority over other companies. In addition, deductions and allowances have been provided for those who invest in these businesses. In particular, individuals can deduct from their gross tax an amount equal to 19% of the sum invested in share capital, up to 500 thousand euros, in each tax period covered by the decree (2013-2014-2015), while IRES payers can deduct from income an amount equal to 20% of the same sums, up to 1.8 million euros, in each tax period. Any excess, whether of the credit or of the deduction, can be carried forward to subsequent tax periods, but no later than the third. The benefits rise (to 25% for individuals and 27% for companies) for investments in start-ups with a "social vocation" or that exclusively develop and market innovative products.
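
A worked example of the investor-side benefits may help; the 19% and 20% rates and the caps come from the text above, while the invested amounts are hypothetical.

```python
# Worked example of the start-up investor incentives described above.
# The 19% personal tax credit (cap 500k EUR) and the 20% IRES deduction
# (cap 1.8 M EUR) come from the text; the invested amounts are hypothetical.
def personal_tax_credit(invested, rate=0.19, cap=500_000):
    return rate * min(invested, cap)

def ires_deduction(invested, rate=0.20, cap=1_800_000):
    return rate * min(invested, cap)

print(personal_tax_credit(600_000))  # capped at 500k -> 95,000 EUR credit
print(ires_deduction(1_000_000))     # 200,000 EUR deducted from income
```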
Another incentive concerns IRAP deductibility: costs incurred for R&D personnel are deductible when determining the IRAP taxable base, provided that a "certification of effectiveness" of those costs is issued by the chairman of the board of statutory auditors or, failing that, by an auditor or a professional enrolled in the registers of auditors, chartered accountants, bookkeepers and commercial experts, or employment consultants.
However much the incentives granted to new innovative start-ups may stimulate young entrepreneurship and investment in new projects, the overall Italian framework remains weakly incentivising and uncompetitive compared with what other states offer. Perhaps for this reason, on 6 February 2014 the Council of Ministers approved a plan ("Research and innovation in companies – Measures of immediate support for companies' innovative research activities") providing new incentives, including non-repayable grants of 60% for research and development, an increase in the tax credit from 35% to 75% for hiring qualified staff in the South, and "vouchers" for internationalisation and for consultancy connected with R&D, for a total of 250 million euros available for 2014. But what guarantee is there, in these times of regulatory uncertainty, that this plan will be confirmed, or even that the existing measures will not change over time?
In a context of weak incentives and regulatory uncertainty, companies that have already internationalised will probably tend to keep their R&D activities abroad, while the prospect of attracting foreign investment to our country seems remote. Likewise, Italian companies that have not yet internationalised may be driven to invest abroad rather than in Italy. It therefore seems clear that Italy is missing the opportunity for growth in research and development, both economically and in terms of employment, worsening the haemorrhage of Italian R&D investment and the trend toward offshoring.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Forget Wheels, This Robot Walks Like a Human



Walking seems like the easiest thing in the world, right? However, it’s a very complicated process, involving hundreds of factors that must work together seamlessly. Researchers at the Texas A&M Bipedal Experimental Robotics (AMBER) Lab are studying human walking mechanisms in order to develop the next generation of robotic systems, from prosthetic devices to legged robots for space exploration.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

UV Luminescence and Glue Application Handbook



The UVX 300 luminescence sensor was designed to effectively detect UV luminescent materials and markers. The UV light source in the UVX 300 is directed at a target, and the visible light given off by the target is returned to the UVX 300. Since UV luminescent materials and inks are invisible under the normal light spectrum, it is possible to mark products and parts without affecting their appearance.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

European clinical trial of "heart-repair" stem cells gets under way



A clinical trial using stem cells to treat patients who have suffered a heart attack has begun. The study, funded by the European Commission, is called BAMI (Bone Marrow Cells in Acute Myocardial Infarction) and aims to find out whether stem cells can cut deaths after the event by a quarter and reduce patients' heart failure.

The procedure - The stem cells are harvested from the patients' own bone marrow and then injected into the heart. Three thousand patients in eleven European countries, including Italy, will take part, and the results of the clinical trial will be made public in five years. It is not the first clinical trial of this kind, but the trials run so far have involved only small numbers of volunteers.

The trial will enrol patients a few days after a heart attack. All will receive a stent to widen the arteries that oxygenate the heart, but only half of the sample will have bone-marrow stem cells harvested and injected into the heart. The underlying idea is that these stem cells support the infarcted heart's self-repair process, helping to minimise cardiac damage and to reduce the portion of the organ that is irreparably damaged.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Fewer X-rays for children thanks to 3D heart mapping



Fewer X-rays for children with heart disease, thanks to a new three-dimensional mapping technique. The method, trialled at the Bambino Gesù hospital's Palidoro and Santa Marinella sites, spares young patients the risks associated with fluoroscopy radiation.

The pilot procedures - The first successful tests came in two transcatheter ablation procedures completed by the team led by Fabrizio Drago. The two children were affected by paroxysmal supraventricular tachycardia, a form of arrhythmia in which the heart suddenly accelerates, causing palpitations and general malaise. Both procedures were fully successful.

Reduced risks - Drago comments: "This new technology will have a huge therapeutic impact, because it will make mapping the cardiac chambers easier even for less experienced operators, minimising the child's exposure to potentially harmful X-rays."

Instant map - Images of the heart, acquired in under a second at the start of the interventional procedure, are integrated with a three-dimensional reconstruction of the heart built up from the mapping catheter's contact with its inner wall. On the monitor of the hospital's new 3D device, the mapping catheter is shown as an animation and navigates within the previously recorded fluoroscopic image of the heart, tracking the operator's movements with millimetre precision.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Super artificial muscles created: becoming Robocop is no longer a dream



A fishing line and some sewing thread: two ordinary materials from which American researchers at the University of Texas at Dallas have managed to create "super-powerful" artificial muscles. Mounted on exoskeletons, these could let us lift loads one hundred times heavier than a human muscle of the same length and weight can manage. The research was carried out by the team coordinated by Carter Haines and published in Science.

These muscles could be used to give humanoid robots greater dexterity, strength and expressive capability, or to make smart windows that open and close by themselves in response to changes in ambient temperature. The muscles are actuated by temperature changes and are made by twisting polymer threads thinner than a human hair. Whereas natural muscles contract by only about 20% of their length, these muscles can contract by up to about 50%.

The application opportunities are vast, as one of the authors, Ray Baughman of the University of Texas, points out: "Today's most advanced humanoid robots, and wearable exoskeletons too," says Baughman, "are limited by bulky, heavy motors and by hydraulic systems that restrict dexterity, strength and work capacity."

Composed of multiple spiral bundles, the artificial muscles could give more realistic facial expressions to companion humanoid robots for the elderly, or greater dexterity to systems for minimally invasive robotic microsurgery. They could also be used in devices that convey a sense of touch to robotic hands.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Honeywell to test GX Aviation on Air China aircraft



Air China has become the first airline to partner with Honeywell Aerospace and Inmarsat for the testing of GX Aviation on its aircraft.

As part of the move, Honeywell has entered into a memorandum of understanding (MoU) with the airline to test the GX Ka-band connectivity on the A330 aircraft.

Joint testing is expected to begin in the second quarter of 2015.

"Passengers are expected to experience a 60% improvement in download speed for their in-flight connectivity, compared with current solutions in the market."

GX Aviation is the latest technology in cabin connectivity for commercial aircraft, and will provide Air China passengers with home and office-like wireless connectivity service.

With the new service, passengers are expected to experience a 60% improvement in download speed for their in-flight connectivity, compared with current solutions in the market.

Honeywell Aerospace Airlines Asia Pacific vice-president Brian Davis said: "GX Aviation will undoubtedly establish new global benchmarks for passenger standards for the next two decades. True worldwide coverage and highest speeds are just some of the benefits that Air China can look forward to enjoying once the service comes into play in 2015."

Honeywell and Inmarsat signed an agreement in 2012 to deliver global in-flight connectivity services to business, commercial and government aviation customers across the world.

Under the agreement, Honeywell will develop, produce and distribute the on-board hardware that will allow users to connect to Inmarsat's Global Xpress constellation network.

Inmarsat Aviation vice-president Bill Peltola said: "This testing programme with Air China is further confirmation that everything is on track for GX Aviation to be available globally for all types of aircraft from the get-go."

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

New 3D holographic technology





A hologram is the three-dimensional representation of an object, containing information about its shape, size, brightness and contrast. This information is held in the recording of the interference pattern of two coherent light waves, known technically as the reference and the object. The interference of the two waves is made possible by the coherence of the laser light emitted during recording.

Illuminating a hologram with a laser decodes the information contained in the interference pattern, recreating the object's original wavefront. The essential difference between a photograph and a hologram is that a photograph of an object records only a single point of view, whereas a hologram records an infinity of them.

Until now only static holograms have been possible, but researchers at the University of Arizona have developed a new system that allows three-dimensional moving images to be viewed without 3D glasses. The heart of the system is a screen made of a new photorefractive polymer on which the hologram can be refreshed every two seconds, giving a near-real-time projection. Recording uses a network of ordinary video cameras, each framing the subject from a different perspective: the more cameras there are, the more accurate the final holographic representation.
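
As a toy illustration of the recording step described above, the sketch below computes the intensity pattern |R + O|² produced by two coherent plane waves meeting at a small angle; the wavelength and angle are arbitrary illustrative values, not parameters of the Arizona system.

```python
# Toy sketch: interference pattern of a reference and an object plane wave.
# I(x) = |R(x) + O(x)|^2 gives the fringes a hologram records.
# Wavelength and angle are arbitrary illustrative values.
import numpy as np

wavelength = 633e-9             # 633 nm (typical HeNe laser line)
k = 2 * np.pi / wavelength      # wavenumber
theta = np.deg2rad(1.0)         # small angle between the two beams

x = np.linspace(0, 50e-6, 1000)           # 50 µm strip of the recording plane
reference = np.exp(1j * k * x * 0.0)      # reference wave, normal incidence
obj = np.exp(1j * k * x * np.sin(theta))  # object wave, tilted by theta

intensity = np.abs(reference + obj) ** 2
print(f"fringe contrast: max {intensity.max():.1f}, min {intensity.min():.2f}")
print(f"fringe spacing ≈ {wavelength / np.sin(theta) * 1e6:.1f} µm")  # ≈ 36.3 µm
```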

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

HP's smartphone will produce holograms like the ones in Star Wars

HP researchers have created a screen that can produce three-dimensional holographic images, made possible by a nanometre-scale structure that can deflect the light emitted by the screen itself. It could be added to smartphones in the near future.

HP has built a screen capable of creating three-dimensional holograms by modifying an ordinary LCD panel. The company's researchers have obtained images and videos that you can "walk around" as if they were physical objects. The effect is similar to the one seen in the film Star Wars (1977), in which Princess Leia appears to young Luke Skywalker as a 3D holographic image.

The technique used is relatively simple and could be integrated into the screens of smartphones, tablets and wristwatches. It therefore promises very significant changes in how we use technology every day, and within a few years the costs could become affordable for the general public.

From a technical point of view, the HP Labs researchers modified an LCD screen, using a nanoscale pattern to create points called "directional pixels", each of which steers light in a different direction. Each directional pixel is in turn made up of three dots - red, blue and green - each of which can send its light in a specific direction. The light then passes through a normal liquid-crystal structure, and this is how the final image is formed.

This system, conceived by David Fattal, can create three-dimensional images visible from 200 different points of view; for video that number drops to 64, at 30 frames per second. According to Fattal, the limit for video depends only on how the backlight and the directional pixels are combined: industrial-scale mass production should easily achieve better results.

The technology can also be improved in other respects: at the moment, for example, the nanoscale structure can deflect light in only 14 different directions, and more would be needed for high-quality images. The system also needs compatible imagery, so it works with the output of 3D applications but not with an ordinary photograph, which inevitably does not capture every angle of the subject.
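
To make the "directional pixel" idea concrete, here is a hypothetical sketch of how a viewer's angle might be quantized onto the 14 discrete output directions mentioned above; the viewing cone and all names are illustrative assumptions, not HP's implementation.

```python
# Hypothetical sketch: mapping a viewing angle onto a directional pixel's
# discrete output directions (the article cites 14 today, 200 for stills).
import numpy as np

N_DIRECTIONS = 14                                # per the article
directions = np.linspace(-40, 40, N_DIRECTIONS)  # assumed ±40° viewing cone

def nearest_direction(view_angle_deg: float) -> int:
    """Return the index of the discrete direction closest to the viewer."""
    return int(np.argmin(np.abs(directions - view_angle_deg)))

# A viewer at 12° off-axis would be served by this pre-rendered view:
idx = nearest_direction(12.0)
print(idx, f"{directions[idx]:.1f}°")
```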

Even allowing for these limits, it is a science-fiction dream coming true, not least because practical applications for these screens are already quite concrete. Just think of smartphones, which with such a system could offer a new mode of interaction, give life to a whole generation of video games, or feed a stream of applications we can only imagine for now.

Last but not least, wristwatches are coming back into fashion in a technology-rich form. In a few years we may be using a wristwatch to examine a three-dimensional version of Google Maps, or to show someone one of our 3D designs. And for anyone getting into 3D printing, this development can only be welcome.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Self-charging battery gets boost from nanocomposite film



In 2012, a research team at the Georgia Institute of Technology led by Professor Zhong Lin Wang fabricated the first self-charging power pack, or battery, that can be charged without being plugged into a wall socket or other source of electricity. Instead, the battery is charged by applying a mechanical stress, which causes lithium ions to migrate from the cathode to the anode due to the piezoelectric effect. Now the researchers have improved the battery by adding nanoparticles to the battery's piezoelectric material, resulting in a higher charging efficiency and storage capacity.

Along with Wang and Yan Zhang, coauthors from Lanzhou University, Northeastern University in Shenyang, the Chinese Academy of Sciences in Beijing (all in China), and the Georgia Institute of Technology have published a paper on the improved self-charging battery in a recent issue of Nanotechnology.

The self-charging battery is several hundred micrometers thick and fits inside a stainless steel coin-type cell. By placing it underneath the buttons of a calculator, for instance, the mechanical energy generated by pressing a button can be simultaneously converted from mechanical to chemical energy and stored in the battery. The researchers envision that the battery could one day power a variety of small, portable electronic devices.

"Self-charging power cells charged up by mechanical deformation and vibration from the environment have possible applications for self-powered sensor systems, as well as flexible and portable electronics, such as self-charging flexible mobile phones and human health monitoring systems," Zhang told Phys.org.

The self-charging battery's ability to both convert and store energy is what sets it apart from conventional batteries, whose sole purpose is to store energy. In conventional batteries, the first step of energy conversion (such as mechanical to electrical) is almost always performed by a separate device. The self-charging battery completely bypasses the intermediate step of converting to electricity, resulting in a more efficient conversion and storage process than if two separate devices—and two steps—were used.

To transform a conventional Li-ion battery into a self-charging one, the researchers replaced the polyethylene separator that normally separates the two electrodes in a Li-ion battery with a piezoelectric material that generates a charge when under an applied stress. In the 2012 version, this material was a PVDF film. In the new study, the researchers added lead zirconate titanate (PZT) nanoparticles to the PVDF film to create a nanocomposite.

The addition of the PZT results in significant performance improvements, namely increasing the battery's efficiency and storage capacity by 2.5 times over the earlier version. Specifically, the storage capacity improved from 0.004 to 0.010 µA h.
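
The quoted factor of 2.5 follows directly from those capacity figures; a one-line check:

```python
# Quick check of the reported improvement: 0.010 µAh vs 0.004 µAh.
old_capacity, new_capacity = 0.004, 0.010  # µAh, from the article
print(new_capacity / old_capacity)         # 2.5
```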

The researchers explained that these improvements occur due to two mechanisms: first, the PZT induces a geometrical strain confinement effect that increases the piezoelectric potential; and second, the PZT has a porous structure that increases the number of pores in the nanocomposite, resulting in a smaller interpore distance that increases the number of ionic conduction paths on which lithium ions can travel. Both mechanisms allow more lithium ions to migrate from the cathode to the anode, increasing the total charge.

The improvements demonstrate that a nanocomposite film can enhance the performance of self-charging batteries, and the researchers plan to make further improvements in the future.

"We need to deeply understand the exact progress of charging electrochemical reactions at the two electrodes, for improving the performance of the self-charging power cells," Zhang said.



Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Auto Industry Steers to Ultrasonic Splicing



While many engineers remain reluctant to switch from older resistance welding or soldering methods, the auto industry is recognizing the advantages of ultrasonic welding, fast becoming a popular alternative for assembling wire harnesses, especially for lightweight aluminum varieties. The process provides the lowest resistance weld available, allowing for the use of smaller cables, which saves the auto makers money and space. But ultrasonic welding may not be ideal for all wire processing applications.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Microalgae-derived biogas a promising alternative to fossil fuels



Could microalgae fuel the future? Researchers are fine-tuning a technology that transforms wet algal biomass into a biogas that is compatible with today's natural gas infrastructure.

Microalgae-derived biogas is becoming an increasingly promising alternative to fossil fuels. Over the past years, researchers at the Paul Scherrer Institute (PSI) and EPFL have been developing SunCHem, a resource- and energy-efficient process, to cultivate microalgae and convert them into synthetic natural gas, a biofuel that is fully compatible with today's expanding gas grid. In an article published in late January 2014, they present one of the first continuous biomass-to-biogas conversion technologies. The article appeared online in the journal Catalysis Today.

While it takes nature millions of years to transform biomass into biogas, it takes the SunCHem process less than an hour. The secret behind this feat is a process called hydrothermal gasification. First, algae-rich water is heated under pressure to a supercritical state, at almost 400 degrees Celsius. In this supercritical state, the water effectively dissolves the organic matter contained in the biomass, while inorganic salts become less soluble and can be recovered as a nutrient concentrate. By gasifying the remaining solution in the presence of a catalyst, it is then split into water, CO2, and the methane-rich biogas.

Although the approach is still about five to seven times too expensive to compete with natural gas, microalgae evade much of the criticism that other biofuel sources face. They can be grown in raceway ponds built on non-arable land, without competing with agricultural food production. And although the algae need water to grow in, they are not picky. Depending on the species, they can grow in freshwater or saltwater, and in the future, they could potentially even be used to treat wastewater. A study published last year estimated that, for each unit of energy spent to produce the biogas, between 1.8 and most optimistically 5.8 units of energy could be produced.
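
Those energy-return figures are easy to restate as net energy; a minimal sketch using only the numbers quoted above:

```python
# Energy return on energy invested (EROI) from the study cited above:
# 1.8 to 5.8 units of energy out per unit in.
for eroi in (1.8, 5.8):
    net = eroi - 1.0  # net energy gained per unit of energy spent
    print(f"EROI {eroi}: {net:.1f} units of net energy per unit invested")
```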

To save resources, cut costs, and increase the overall efficiency of the process, the entire system can be run in a closed loop. "Some nutrients such as phosphate are limited resources, which we can recover when we gasify the biomass. Feeding them back into the water that we grow the algae in has a spectacular effect on their growth," says Mariluz Bagnoud, one of the two lead authors of the publication.

For the publication, the researchers proved the feasibility of running the system as a continuous process. But they also found that feeding back water and nutrients over long durations leads to a degradation of the system's performance. "We detected the deactivation of the catalyst used in the gasification process and we expect the accumulation of trace amounts of aluminum," says Bagnoud. "The toxicity of the aluminum on the microalgae depends on the pH. By cultivating the algae at a neutral pH, these toxic effects can essentially be eliminated," she says. "Now, the next steps will involve fine-tuning the process to increase the longevity of the catalyst, which is deactivated by the sulfur contained in the microalgae," she concludes.


More information: Mariluz Bagnoud-Velásquez, Martin Brandenberger, Frédéric Vogel and Christian Ludwig, "Continuous catalytic hydrothermal gasification of algal biomass and case study on toxicity of aluminum as a step toward effluents recycling," Catalysis Today.



Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Dutch scientists flap to the future with 'insect' drone



Dutch scientists have developed the world's smallest autonomous flapping drone, a dragonfly-like beast with 3-D vision that could revolutionise our experience of everything from pop concerts to farming.

"This is the DelFly Explorer, the world's smallest drone with flapping wings that's able to fly around by itself and avoid obstacles," its proud developer Guido de Croon of the Delft Technical University told AFP.

Weighing just 20 grammes (less than an ounce), around the same as four sheets of printer paper, the robot dragonfly could be used in situations where much heavier quadcopters with spinning blades would be hazardous, such as flying over the audience to film a concert or sport event.

The Explorer looks like a large dragonfly or grasshopper as it flitters about the room, using two tiny low-resolution video cameras—reproducing the 3-D vision of human eyes—and an on-board computer to take in its surroundings and avoid crashing into things.

And like an insect, the drone, which has a wingspan of 28 centimetres (11 inches), would feel at home flying around plants.

"It can for instance also be used to fly around and detect ripe fruit in greenhouses," De Croon said, with an eye on the Netherlands' vast indoor fruit-growing business.

"Or imagine, for the first time there could be an autonomous flying fairy in a theme park," he said.

'Real small insects'

Unlike other drones that use rotor blades and can weigh hundreds of times as much, the Explorer has two wings on each side that flap rapidly to create lift.

"We got our inspiration from real small insects," De Croon said.



[Photo: Chief developer Guido de Croon releases the DelFly Explorer, the world's lightest autonomous flapping drone, during a demonstration at the Delft Technical University, on January 29, 2014.]

While smaller "flapping" drones exist, such as the RoboBee developed by students at Harvard University in the United States, they are tethered for power, control and processing, and thus far from autonomous.

The Explorer has its own small lithium polymer battery that allows it to fly for around nine minutes, while it "sees" with its onboard processor and a specially-developed algorithm to make instant decisions.

It has wireless analog video, gyroscopes and a barometer to calculate its height.

Different algorithms would allow it to perform different tasks, and because it is autonomous it could be sent into enclosed spaces such as concrete buildings or mine shafts, where radio control would be impossible, to search for casualties or hazards.

"The DelFly knows precisely where obstacles are located," said De Croon as the aircraft, built from composite materials including carbon fibre, fluttered towards a wall during a demonstration flight before veering elegantly away in search of another route.

The idea of building a flapping-winged drone began around nine years ago when a group of students at Delft Technical University's prestigious aerospace faculty first designed the DelFly I.

Over the next few years, research continued and the machine became smaller and smaller, said Sjoerd Tijmons, 28, who helped write the algorithm for the latest DelFly Explorer's "brain".

An earlier incarnation, the DelFly Micro with a wingspan of 10 centimetres, was in 2008 declared the "smallest camera equipped aircraft in the world" by the Guinness Book of Records.

But De Croon admits that humans are not quite able to produce swarms of autonomous robotic insects the size of bees or flies, mainly because of restrictions on battery life.

"Still there are some major challenges... and if I have to put a number on it, I think we are still a few decades away," he laughed.



Marco La Rosa ha detto...

DA DOTT. COTELLESSA

An end in sight in the long search for gravity waves



Our unfolding understanding of the universe is marked by epic searches and we are now on the brink of discovering something that has escaped detection for many years.

The search for gravity waves has been a century-long epic. They are a prediction of Einstein's General Theory of Relativity, but for years physicists argued about their theoretical existence.

By 1957 physicists had proved that they must carry energy and cause vibrations. But it was also apparent that waves carrying a million times more energy than sunlight would make vibrations smaller than an atomic nucleus.

Building detectors seemed a daunting task, but in the 1960s a maverick physicist, Joseph Weber of the University of Maryland, began to design the first detectors. By 1969 he claimed success!

There was excitement and consternation. How could such vast amounts of energy be reconciled with our understanding of stars and galaxies? A scientific gold rush began.

Within two years, ten new detectors had been built in major labs across the planet. But nothing was detected.

Going to need a better detector

Some physicists gave up on the field but for the next 40 years a growing group of physicists set about trying to build vastly better detectors.

By the 1980s a worldwide collaboration to build five detectors, called cryogenic resonant bars, was underway, with one detector called NIOBE located at the University of Western Australia.

These were huge metal bars cooled to near absolute zero. They used superconducting sensors that could detect a million times smaller vibration energy than those of Weber.

They operated throughout much of the 1990s. If a pair of black holes had collided in our galaxy, or a new black hole had formed, it would have been heard as a gentle ping in the cold bars… but all remained quiet.

What the cryogenic detectors did achieve was an understanding of how quantum physics affects measurement, even of tonne-scale objects. The detectors forced us to come to grips with a new approach to measurement. Today this has grown into a major research field called macroscopic quantum mechanics.

But the null results did not mean the end. It meant that we had to look further into the universe. A black hole collision may be rare in one galaxy but it could be a frequent occurrence if you could listen in to a million galaxies.

Laser beams will help



A new technology was needed to stretch the sensitivity enormously, and by the year 2000 this was available: a method called laser interferometry.

The idea was to use laser beams to measure tiny vibrations in the distance between widely spaced mirrors. The bigger the distance the bigger the vibration! And an L-shape could double the signal and cancel out the noise from the laser.
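
The scale of the measurement is worth a back-of-the-envelope check. For the 4 km arms mentioned below, and assuming a strain of order 10⁻²¹ (a typical figure for such sources, not given in the article), the arm-length change is far smaller than a proton:

```python
# Back-of-the-envelope: arm-length change for a gravitational-wave strain.
# delta_L = h * L. The 4 km arm length appears below; the strain value
# h ~ 1e-21 is a typical order of magnitude, assumed for illustration.
arm_length_m = 4000.0
strain = 1e-21
delta_L = strain * arm_length_m
print(f"{delta_L:.1e} m")  # 4.0e-18 m, about 1/400 of a proton's diameter
```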

PART TWO FOLLOWS

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

PART TWO

Several teams of physicists including a team at the Australian National University had spent many years researching the technology. Laser beam measurements allowed very large spacing and so new detectors up to 4km in size were designed and constructed in the US, Europe and Japan.

The Australian Consortium for Gravitational Astronomy built a research centre on a huge site at Gingin, just north of Perth, in Western Australia, that was reserved for the future southern hemisphere gravitational wave detector.

The world would need this so that triangulation could be used to locate signals.



Latest detectors



The new detectors were proposed in two stages. Because they involved formidable technological challenges, the first detectors would have the modest aim of proving that the laser technology could be implemented on a 4km scale, but using relatively low intensity laser light that would mean only a few per cent chance of detecting any signals.

The detectors were housed inside the world's largest vacuum system, the mirrors had to be 100 times more perfect than a telescope mirror, seismic vibrations had to be largely eliminated, and the laser light had to be the purest light ever created.

A second stage would be a complete rebuild with bigger mirrors, much more laser power and even better vibration control. The second stage would have a sensitivity at which coalescing pairs of neutron stars, merging to form black holes, would be detectable about 20 to 40 times per year.

Australia has been closely involved with both stages of the US project. CSIRO was commissioned to polish the enormously precise mirrors that were the heart of the first stage detectors.



A gathering of minds



The Australian Consortium gathered at Gingin earlier this year to plan a new national project.



Part of that project focuses on an 80-metre-scale laser research facility – a sort of mini gravity-wave detector – that the consortium has developed at the site. Experiments are looking at the physics of the new detectors, and especially the forces exerted by laser light.

The team has discovered several new phenomena including one that involves laser photons bouncing off particles of sound called phonons. This phenomenon turns out to be very useful as it allows new diagnostic tools to prevent instabilities in the new detectors.

The light forces can also be used to make "optical rods" – think of a Star Wars light sabre! These devices can capture more gravitational wave energy – opening up a whole range of future possibilities from useful gadgets to new gravitational wave detectors.

PART THREE FOLLOWS

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

PART THREE

Final stages of discovery

The first stage detectors achieved their target sensitivity in 2006 and, as expected, they detected no signals. You would know if they had!

The second stage detectors are expected to begin operating next year. The Australian team is readying itself because the new detectors change the whole game.

For the first time we have firm predictions: both the strength and the number of signals. No longer are we hoping for rare and unknown events.

We will be monitoring a significant volume of the universe and for the first time we can be confident that we will "listen" to the coalescence of binary neutron star systems and the formation of black holes.

Once these detectors reach full sensitivity we should hear signals almost once a week. Exactly when we will reach this point, no one knows. We have to learn how to operate the vast and complex machines.

If you want to place bets on the date of first detection of some gravity wave then some physicists would bet on 2016, probably the majority would bet 2017. A few pessimists would say that we will discover unexpected problems that might take a few years to solve.



Marco La Rosa ha detto...

DA DOTT. COTELLESSA

EV Performance Gains on Combustion Competition



Sure, electric vehicles are environmentally friendly, but you're not going to win any races in one, right? Wrong. In this article, National Geographic tells us about some races electric vehicles will win: those that are part of the FIA Formula E Championship, described as "the world's first fully electric Grand Prix racing series." Besides these races, which pit vehicles with electric motors against each other, the article looks at improvements in EV technology that deliver surprising performance. Thanks to these improvements and continuing R&D, "performance electric cars" could be on the road toward true competition with internal combustion vehicles.



Marco La Rosa ha detto...

DA DOTT. COTELLESSA
Cheap Gasoline from Natural Gas — Finally?



Oxidative coupling converts natural gas to ethylene. Unfortunately, stopping the reaction before the ethylene becomes carbon dioxide has eluded researchers until now. According to the MIT Technology Review, a new nanowire catalyst prevents the runaway reaction, and can produce large quantities of gasoline at half the cost of an oil-based product.


Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Orbital Origami



Designing a solar array for deployment in space requires compact thinking, so Brigham Young University researchers merged mechanical engineering with origami mathematics. The prototype 25 m diameter, 1 cm thick silicon solar array folds to a compact 2.7 m wide size and generates 150 kW. This artful enterprise should lead to a fiberglass composite producing 250 kW.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Wind Tunnel on Your Desktop



Ever have a project go without wind tunnel analysis because of cost or timeline constraints? The release of Autodesk Flow Design should dramatically alter such budget scenarios. The desktop tool simulates airflow around any object in a virtual wind tunnel, delivering quick trace lines, pressure maps, or cut-planes for understanding and tweaking design performance.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Comeback for Nuclear Power



The nuclear power industry is growing bigger by getting smaller. Generation III power technology, along with advances in small modular reactors, is driving developments in the nuclear power generation market worldwide. Engineer Live looks at major projects on the horizon, and explains why nuclear power is on the way up worldwide.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Exciting Advances Push the Limits of Visualization



Cell signaling and clathrin-mediated endocytosis. Lipid rafts and molecular motion within the membrane. Movement of organelles within a cell and cell division. Protein organization and co-localization. Virus entry.



The question



Many of cell and molecular biology's most critical events occur at the mid- to low-nanometer scale, which has hampered direct study through microscopy. But recent technological advances are shattering the 200 nm light diffraction limit, providing unprecedented and highly detailed views into the life of a cell. Processes that were hidden behind a veil of unresolved features are suddenly being brought into crystal-sharp clarity.

And we're still in the exciting early stages of the technology's development. Papers announcing new ways to gain insight into biological questions are appearing rapidly, such as the recent report from Huang et al. [1], from Joerg Bewersdorf's lab at Yale University and an international team of collaborators.

By taking advantage of new detector technology and developing algorithms that capitalize on the strengths of this new technology, they are able to acquire single-molecule localization super-resolution images that reveal features at 22 nm precision and capture clathrin-coated structures moving at 13 nm/sec.



The barriers



Past studies using single-molecule switching nanoscopy (SMSN) have been limited by the number of photons emitted by a single molecule per frame, which places high demands on the detector. With such low signal, most studies are done using back-illuminated electron-multiplying charge-coupled devices (EM-CCDs) because of the technology's low noise and high quantum efficiency (QE).



But the power of EM-CCDs is somewhat limited in these types of experiments:

• The amplification step lowers the overall signal-to-noise ratio and halves the effective quantum efficiency to <48%.

• Image acquisition is slow, typically minutes to hours for SMSN approaches.

• They force a tradeoff between image-acquisition speed and field of view: faster acquisition is only possible over smaller fields of view.



These limitations are especially problematic when trying to use SMSN for live-cell imaging or high content screening, where fast speeds are essential for achieving meaningful throughputs.

A new generation of sCMOS detectors has a higher quantum efficiency, reaching up to 73% at 600 nm. But wide and non-uniform pixel-to-pixel variability in noise has prevented early implementations of this technology from optimal use in SMSN.

PART TWO FOLLOWS

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

PART TWO

The solution



Huang et al. [1] turned to the newer second-generation sCMOS sensors (Gen II sCMOS) to achieve low noise and wide fields of view with fast image-acquisition times in both live and fixed cells.

The key advance Huang et al. [1] made in implementing Gen II sCMOS was the recognition that this novel technology is inherently different from EM-CCD, and that, particularly in SMSN conditions where noise is a major limitation and precise quantitative data analysis is crucial, different image-processing algorithms are needed to yield optimal results.

By thoroughly characterizing their Gen II sCMOS camera, they were able to develop algorithms that account for the observed noise more effectively than the Poisson model typically used for EM-CCD detectors.
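
To see why this matters, here is a hedged sketch of the kind of per-pixel noise weighting such an algorithm implies: photon shot noise (Poisson) plus each pixel's own calibrated read-noise variance enter the fit, instead of a uniform Poisson assumption. The function and numbers are illustrative, not the authors' algorithm.

```python
# Illustrative per-pixel noise weighting for sCMOS localization fits.
# Each pixel i has Poisson shot noise (variance ≈ expected counts mu_i)
# plus its own read-noise variance var_i from a camera calibration map.
# A weighted least-squares cost then downweights the noisy pixels.
import numpy as np

def weighted_cost(counts, model, read_var):
    """counts: measured pixels; model: expected photons mu_i;
    read_var: per-pixel read-noise variance (calibrated per camera)."""
    total_var = model + read_var  # Poisson + Gaussian read noise
    return np.sum((counts - model) ** 2 / total_var)

# Toy example: the noisy pixel (read_var 40) contributes less to the fit.
counts   = np.array([105.0,  98.0, 130.0])
model    = np.array([100.0, 100.0, 100.0])
read_var = np.array([  2.0,   2.0,  40.0])
print(weighted_cost(counts, model, read_var))
```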

Implications for high content screening

• Imaged the focal adhesion protein paxillin labeled with Alexa Fluor 647 in a 26 x 26 μm field of view at 800 frames per second.
• Potential to record 1,000 different cells per hour with an average precision of 22 nm.


Implications for live-cell imaging

Imaged clathrin-coated structures at an average precision of 22 nm, using super-resolution images based on 34,800 camera frames acquired over 58 seconds. These structures often moved in a directed fashion at a speed of ~13 nm/sec.



Implications for using Gen II sCMOS cameras

Through their characterization, Huang et al. [1] show the distinct regimes in which sCMOS or EM-CCD is the more effective detector.


Understanding the math

The key to the success of Huang et al. [1] with Hamamatsu's sCMOS camera lies in careful consideration of noise: read more about how they dealt with noise in sCMOS and EM-CCDs.

Accounting for imperfection

No camera is perfect. As Huang et al. [1] powerfully illustrate, understanding your camera's imperfections can lead to better performance and better science. Read more about how to correct for noise in computational microscopy in Bridging the Gap.



We are just now at the beginning of a visualization revolution, as teams of biologists, chemists, physicists and engineers labor to develop the best methods and instruments to extract as much insight as possible into the underlying nature of biology.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

A new alliance in the photovoltaic sector: SMA Solar Technology AG and Danfoss A/S create the industry's largest inverter partnership



SMA Solar Technology AG (SMA/FWB: S92) and Danfoss A/S have signed a strategic partnership. The two leading photovoltaic-technology companies have joined forces to pursue a significant reduction in costs, exploiting economies of scale and shared experience. Danfoss has acquired a 20% stake in SMA and plans to transfer all of its photovoltaic-inverter activities to SMA.



"The strategic collaboration with Danfoss will strengthen SMA's leadership in the global photovoltaic market. The market today is extremely competitive and we face strong price pressure. Thanks to this new strategic alliance, we will be able to draw on Danfoss's many years of experience in industrial automation devices, a sector that has long faced fierce competition. The Danfoss group has consequently oriented its strategy toward continuous improvement through cost reduction and international sourcing. Both companies will therefore benefit from this strategic alliance for sustainable cost reduction," explains Pierre Pascal Urbon, CEO of SMA.



Through this partnership, SMA will also be able to broaden its product range by acquiring Danfoss's photovoltaic-inverter business. Once the transaction is approved, SMA will introduce new products for the medium-sized photovoltaic plant segment, which is growing steadily in Europe, the United States and China.



“Through this partnership, the two leading inverter manufacturers will form one of the largest alliances in the sector worldwide. The acquisition of a 20% stake in SMA is a strong signal confirming our continued commitment to, and confidence in, the photovoltaic market. We will transfer our many years of experience in drive technology to the photovoltaic inverter sector, thereby accelerating innovation. Danfoss will benefit from economies of scale and from the rapid growth of the photovoltaic industry in the coming years,” explains Niels B. Christiansen, President and CEO of Danfoss.



Danfoss will acquire 6.94 million shares of SMA Solar Technology AG at a price of 43.57 euros from the company's founders, their families, and their foundations. The purchase price corresponds to a 50% premium over the volume-weighted average price of the last 60 days. The transaction volume amounts to 302.38 million euros. SMA's free float will stand at 25.05% once the transaction is complete. SMA's founders, together with their families and foundations, will retain 54.95% of SMA's shares. For the next two years, Danfoss will neither buy nor sell further SMA shares (lock-up period). The transaction is subject to approval by the relevant authorities. The share sale and the cooperation agreement will be finalized during the third quarter of 2014.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Molded Carbon-fiber Competes with Metal



It took time, but some thermoplastic compounds can now compete with metals in terms of stiffness and strength. Ultra Performance, a series of carbon-fiber compounds for injection molding, is touted as a replacement for magnesium, zinc, and aluminum. The compounds are made of PEEK, PPA, PPS, or PEI resins reinforced with 20% to 40% short carbon fibers.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Robot arms to help knit replacement human body parts



You might be able to knit a pair of socks, but robots could one day knit you a pair of kidneys. Bioprinters with many arms could coordinate their limbs to knit together different types of human tissue and mass-produce replacement organs, cartilage or muscle. Bioengineer Ibrahim Ozbolat and his team at the University of Iowa have successfully used a 3D printer fitted with two robot arms to create tissue of two key types.

Until now, bioprinting has largely involved depositing a single type of human cell – like skin cells – onto a scaffold that is later dissolved. But the scaffold approach cannot produce 3D analogues of cartilage and muscle, for instance, which incorporate strong, fibrous cellular material as well as individual cells of other types.

So Ozbolat's team equipped a 3D printer with two robot arms so that it can simultaneously deposit filaments and cells. They fitted a nozzle to one arm and used it to create multilayer patterns with filaments of sodium alginate. The other arm populated the gaps between the filaments with cells that grow into cartilage.

The result was a 20-layer stack of tissue, 20 millimetres square, made of filaments and living cells, that did not require a scaffold.

Tissue issue

"A third or fourth arm could now be added, as on a robotics assembly line, with each depositing different components of tissue – whether it is blood vessels, connective tissue or organ specific tissue," Ozbolat says.

Sheila MacNeil, a tissue engineer at the University of Sheffield's Centre for Biomaterials and Tissue Engineering in the UK, says the idea has promise, but wonders about the strength of the tissue. "To me, filaments mean fibre with some mechanical integrity and, as these are gels of alginate, I suspect the mechanical integrity is not very high," she says.

"However I could see the current system being a good experimental model for drug development, where having lots of replicated cells within a structure allows multiple dose responses to be tested."

The centre's engineering specialist, Patrick Smith, says multiple robot arms are key to this kind of technology. "Systems that allow multi-material deposition are the next step forward for additive manufacturing. And systems that can deposit combinations of functional materials in particular."

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Rolls Royce said to be developing drone cargo ships



Within a decade, unmanned freighters could be operating in regions like the Baltic Sea, and some think they could be safer, cheaper, and cleaner than manned cargo vessels. But regulatory and labor concerns remain.



Who needs maritime crews to run cargo ships?

That seems to be the question Rolls-Royce is trying to answer as it works to develop what amount to drone freighters: ships that could one day radically disrupt the massive global shipping industry.

According to Bloomberg, Rolls-Royce's Blue Ocean development team has been working on the drone freighters in a bid to make shipping cheaper, cleaner, and safer. The organization is running a virtual-reality prototype at an office in Norway that mimics a 360-degree view from the ship's bridge. Bloomberg reported that the ships could be deployed in regions like the Baltic Sea within a decade, though regulatory and labor concerns could delay adoption elsewhere. Bloomberg said crew costs on freighters run about $3,300 per day, amounting to 44 percent of total operating expenses.
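
Bloomberg's two figures imply the overall operating cost; a quick back-of-the-envelope check (the derivation below is mine, not a reported number):

crew_cost_per_day = 3300.0   # USD/day, per Bloomberg
crew_share = 0.44            # crew's fraction of total operating expenses

# Implied total operating cost: 3300 / 0.44 = 7500 USD/day, so
# eliminating the crew addresses a bit under half of daily opex.
print(crew_cost_per_day / crew_share)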

In the meantime, Bloomberg reported that the European Union has put $4.8 million into a project known as Maritime Unmanned Navigation through Intelligence in Networks. That effort is aimed at developing and verifying "a concept for an autonomous ship, which is defined as a vessel primarily guided by automated on-board decision systems but controlled by a remote operator in a shore side control station."

There are, of course, other approaches to trying to reduce shipping costs. One is to outfit the vessels with a group of metal sails that is thought to reduce fuel costs by as much as 30 percent.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Kenya Doubles Down on Geothermal



Kenya's looking to bring down power costs by using its expansive geothermal reserves. Not only will the country increase power generation from existing plants, it is also adding a new 100 MW plant, all to help lower use of diesel-fired power plants. Considering Kenya's capacity for producing over 7000 MW from geothermal, could even more be on the way?

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

UAE All in with Nukes



Even with two plants already under construction, the Emirates Nuclear Energy Corporation plans to break ground on a third nuclear power plant in the coming months. By 2020, the United Arab Emirates will have four nuclear power plants up and running, producing 5600 MW of electricity. The four plants are being built by Korea Electric Power.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

X-ray the Wafer



The first rule of semiconductor inspection and test states that the test must not compromise device performance. Capitalizing on the capabilities of its existing X-ray equipment, this company has introduced a wafer inspection system that can find and evaluate defects in both visible and hidden circuit features, including through-silicon vias (TSVs), 2.5D and 3D device packages, MEMS, and wafer bumps.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Build Your Own 'Bot

Imagine producing custom grippers or robotic arms cheaply, in-house, on 3D printers. That is already happening in the prosthetics world: the cost of Robohand is about 1% of that of a conventional prosthesis. Five plastic fingers, constructed on a 3D printer and sized for individual users, are held together with cables and screws. Construction cost is $100. In a heart-warming step, the hands can be constructed in small facilities in war-torn areas, and the design is open-source.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Optical Sensors for Distance Measurement

Featuring a high-speed charge transfer structure in each pixel, Hamamatsu's image sensors enable high-precision TOF (time-of-flight) distance measurements. These image sensors can be used in various applications, including shape detection by industrial robots, object detection in semiconductor wafer transfer systems, and people/obstacle detection in automobiles.
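
As background on the measurement itself, here is a minimal sketch of the two standard TOF relations. Whether a given sensor uses direct (pulsed) or indirect (phase-shift) TOF is not stated above, and the example values are illustrative only.

import math

C = 299_792_458.0  # speed of light, m/s

def distance_direct(round_trip_time_s):
    """Direct TOF: the pulse travels out and back, so halve the path."""
    return C * round_trip_time_s / 2.0

def distance_indirect(phase_shift_rad, mod_freq_hz):
    """Indirect TOF (lock-in pixels): the phase shift of an
    amplitude-modulated light wave encodes the round-trip delay.
    Range is unambiguous only up to C / (2 * mod_freq_hz)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# A 10 ns round trip corresponds to about 1.5 m:
print(distance_direct(10e-9))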

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Seeing Can Be Tough Work



Robots work in harsh environments that can cause machine vision systems to struggle. Software that recognizes and eliminates effects from bright light, specular surfaces, and deep shadows significantly improves output from vision systems. Image processing software can ignore irrelevant objects by comparing what it sees to what it expects and discarding items that don't match. Other ruggedizing steps include carefully selecting jacketing material and conductor size.
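
A minimal sketch of the compare-to-expectation idea in Python/NumPy follows; the simple absolute-difference test and the threshold value are illustrative assumptions, and real systems would add lighting normalization and morphological cleanup.

import numpy as np

def unexpected_regions(observed, expected, threshold=30.0):
    """Flag pixels that differ from what the system expects.

    observed, expected: grayscale images as equal-shape arrays.
    Pixels within `threshold` gray levels of the expectation are
    treated as irrelevant and ignored; the rest are flagged."""
    diff = np.abs(observed.astype(float) - expected.astype(float))
    return diff > threshold

# Example: a blank reference scene versus one with a bright glare patch
ref = np.zeros((4, 4))
obs = ref.copy()
obs[1, 1] = 255.0
print(unexpected_regions(obs, ref))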

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Whiskered Robots



At night, cats use their whiskers to hunt. Now robots have whiskers too. E-whiskers — tactile sensors — can determine the texture and shape of objects or detect just a whiff of a breeze, a useful skill for locating gas leaks. Made of lightweight carbon nanotubes, e-whiskers use tiny silver particles to tune their acuteness, achieving a sensitivity of 1 pascal of pressure.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Relying on Roaches



Swarms of cockroach cyborgs, called biobots, can be used to create maps of collapsed buildings and rubble, critical tools for first responders. The biobots enter collapsed buildings, disperse randomly, then sidle along any encountered walls. When they get close to each other, biobots send signals to handlers. Using this data and a handy algorithm, operators create maps of the rubble, marking sources of radioactive or chemical signatures.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

World's tallest hotel opens second tower in Dubai



The second tower at JW Marriott Marquis Hotel Dubai, the world’s tallest hotel, is now open, with 294 new rooms bringing the total room count in the hotel to 1098.

The hotel, which opened 804 rooms in Tower 1 in November 2012, will offer 1608 rooms across the two towers when complete.

On June 1, 1350 rooms will be operational and on September 1, the hotel will reach the final count of 1608 rooms, a spokesperson told Hotelier Middle East.

There will also be two new F&B outlets in Tower 2, which is identical in design to Tower 1.

One is rumoured to be a Latin bar, while at the top of the hotel, the 72nd floor may be operated as a function space for hire.

“We are imagining amazing parties and weddings and perhaps even pop-up restaurants,” said the spokesperson, although the use of the space is yet to be confirmed.

JW Marriott Marquis Dubai already operates 13 bars and restaurants, including Prime 68, GQ Bar Dubai, Tong Thai, Positano and Rang Mahal by Atul Kochhar.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

A Tin Superconductor



Scientists have identified a material that could behave much like a superconductor at and above room temperature, potentially leading to faster, more efficient microchips. Called stanene and composed of a single-atom-thick sheet of tin, this topological insulator promises to let electrons flow without resistance. Used to connect the various components of microchips, stanene could eliminate interference, boost processing speed, and reduce the power consumption of electronics.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

When Sunny Gets Blue



To squeeze more energy out of the blue end of the light spectrum, U.S. researchers turned to copper indium selenide. Depositing thin films of the nanocrystalline material by rapidly heating and cooling the top layer, a process known as photonic curing, supports multiple exciton generation. The process enhances the capacity of cells to harvest the excess energy of blue photons.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Solar Thermal Technology Poses Challenges for Drought-Stricken California



Reducing water consumption at solar thermal plants raises costs and decreases power production.



California’s ambitious goal of getting a third of its electricity from renewable energy sources by 2030 is being tested by its driest year on record, part of a multiyear drought that’s seriously straining water supplies. The state plan relies heavily on solar thermal technology, but this type of solar power also typically consumes huge quantities of water.

The drought is already forcing solar thermal power plant developers to use alternative cooling approaches to reduce water consumption. This will both raise costs and decrease electricity production, especially in the summer months when demand for electricity is high. Several research groups across the country are developing ways to reduce those costs and avoid reductions in power output.

Solar thermal power plants use large fields of mirrors to concentrate sunlight and heat water, producing steam that spins power-plant turbines. Utilities like them because their power output is much less variable than power from banks of solar panels.

The drawbacks are that solar thermal plants generate large amounts of waste heat, and they consume a lot of water for cooling, which is usually done by evaporating water. Solar thermal plants can consume twice as much water as fossil fuel power plants, and one recently proposed solar thermal project would have consumed about 500 million gallons of water a year.

A technology called dry cooling, which has started appearing in power plants in the last 10 years or so, can cut that water consumption by 90 percent. Instead of evaporating water to cool the plant, the technology keeps the water contained in a closed system. As it cools the power plant, the water heats up and is then circulated through huge, eight-story cooling towers that work much like the radiator in a car.

Dry cooling technology costs from two and a half to five times more than conventional evaporative cooling systems. And it doesn’t work well on hot days, sometimes forcing power plant operators to cut back on power production. In the summer, this can decrease power production by 10 to 15 percent, says Jessica Shi, a technical program manager at the Electric Power Research Institute. On extremely hot days, power production might be reduced even more than that.

One approach to solving this problem is to oversize the cooling system so that it can deliver enough cooling even on hot days. That's the approach taken by the developers of California's new Ivanpah solar thermal plant, which is about to start operating. But it adds to the cost of an already expensive system.

More than a dozen research groups funded by the Electric Power Research Institute and the National Science Foundation are developing ways to avoid the current problems with dry cooling technology. One project uses a conventional evaporative cooling system but captures the water vapor to reuse it. Others are working to improve the efficiency of dry cooling towers so that they can be made smaller and cheaper. A third approach is to use nanoparticles in the cooling fluid to improve its ability to absorb heat. And new designs that improve air circulation could reduce the size and cost of cooling towers.

The drought and water shortage that California is undergoing will increase the costs associated with solar thermal power, but they aren’t likely to bring the spread of the technology screeching to a halt. While dry cooling costs far more than conventional water cooling, it accounts for a relatively small part of the total cost of a plant—about five percent of around $2 billion.
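
Those two figures can be combined into a rough estimate (the arithmetic below is mine, not a reported number):

plant_cost = 2.0e9   # USD, total plant cost cited above
dry_share = 0.05     # dry cooling at about five percent of total

dry_cooling_cost = plant_cost * dry_share   # ~100 million USD
# At 2.5x to 5x the cost of evaporative cooling, the conventional
# system it replaces would have cost roughly 20-40 million USD:
print(dry_cooling_cost / 5.0, dry_cooling_cost / 2.5)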



Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Agilent: atomic force microscope at the Cambridge Graphene Centre



Agilent Technologies has installed an Agilent 5600LS atomic force microscope at the Cambridge Graphene Centre (CGC) in the United Kingdom.

The Cambridge Graphene Centre is one of the main partners in the 'Future and Emerging Technologies Graphene Flagship' project led by Andrea Ferrari, professor of nanotechnology at the University of Cambridge.

The Agilent 5600LS AFM, introduced in October 2013, will be used for research on graphene and other two-dimensional materials.

Much of the interest in graphene stems from its unique electronic properties, which could help build ultra-high-speed devices.

In particular, Agilent has developed SMM, an AFM-based characterization technique that makes it possible to explore the properties of graphene (such as capacitance, impedance, and dielectric properties) at the nanometer scale.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Congo using robocops to ease traffic congestion



Kinshasa, in the Democratic Republic of Congo, has two robocops on patrol. Well, they are technically stationary, but they're still keeping pedestrians in the country's capital safer. The city of about 10 million people suffers from choking traffic, and the eight-foot-tall aluminum and steel robots are installed at two high-traffic intersections to regulate traffic flow.

The $15,000 solar-powered bots were installed in June 2013 and were engineered by a team of local engineers to withstand the country's sweltering heat. So far they have been deemed a complete success. Their arms act as traffic signals, while their chests display whether it is safe for walkers to cross the street. A speaker also announces whether it is safe to cross. Surveillance cameras are mounted in the shoulders in case anyone attempts to disobey the traffic automaton's will. "With the robots' policemen intelligence, the road safety in Kinshasa becomes very easy," Vale Manga Wilma, president of the DRC's National Commission for Road Safety, told CNN.

While giant humanoid traffic-signal robots sound like something more likely to come out of Japan than the Democratic Republic of Congo, they merge the functions of human traffic officers and signal lights, freeing more human officers to patrol the streets.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Schoolboy, 13, creates nuclear fusion in Penwortham



A 13-year-old Lancashire schoolboy has become one of the youngest people in the world to carry out nuclear fusion.

Jamie Edwards, a pupil at Penwortham Priory Academy, created the project from scratch with help from his school.

"I can't quite believe it - even though all my friends think I am mad," he said.

The last record holder was US student Taylor Wilson, who was 14 when he created nuclear fusion in 2008.

Jamie, who started work in October in an under-used school science laboratory, recreated a process known as 'inertial electrostatic confinement' which dates back to the 1960s.

'Star in a jar': a kind of nuclear fusion

This type of fusion has been known about since the 1960s.

A high voltage is put through a confined gas creating tiny pockets hotter than the surface of the Sun.

Some charged hydrogen atoms can fuse together to produce a helium nucleus and a few neutron particles.

Care needs to be taken because of the high voltage and the production of a small amount of radiation, including some X-rays.

The process is called inertial electrostatic confinement.



"One day, I was looking on the internet for radiation or other aspects of nuclear energy and I came across Taylor Wilson," said the junior scientist who faced a race against time to complete the project before his 14th birthday on Sunday.

"I looked at it, thought 'that looks cool' and decided to have a go."

"You see this purple ball of plasma - basically it's like a star in a jar," he added.

Jamie, along with friend George Barker, set about trying to create nuclear fusion by consulting an open source website for amateur physicists.

His application for funds was rejected by various nuclear laboratories and universities.



School funding




"They didn't seem to take me seriously as it was hard to believe a 13-year-old would do something like that so I went to my head teacher Mr Hourigan in October," he said.

"I was a bit stunned and I have to say a little nervous when Jamie suggested this but he reassured me he wouldn't blow the school up," said Priory head Jim Hourigan, who agreed to give £2,000 to the project.

Jamie ordered parts and equipment from Lithuania, the US and UK, working on the project every break and lunchtime as well as after school.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Waste Heat Recovery Using Kalina Cycle Systems



There is an increasing emphasis on finding clean and renewable electricity using innovative waste heat-to-power conversion systems. The use of ammonia/water mixtures as the working fluid in a power generation cycle, known as the Kalina Cycle, is one method gaining increasing acceptance for meeting this challenge.

By using two fluids with different boiling points, the Kalina cycle mixture evaporates over a range of temperatures based on the mix ratio, rather than at a single, fixed temperature. This enables the boiling point to be adjusted to suit the heat source temperature of the site and makes the Kalina cycle well-suited for applications with industrial process waste heat, as well as naturally occurring geothermal and solar energy sites, to recover more energy than what may be possible with a steam Rankine cycle waste heat recovery system.
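
The temperature glide can be illustrated with a toy model. The sketch below treats the ammonia/water mixture as an ideal solution with Clausius-Clapeyron saturation pressures; the real mixture is strongly non-ideal, so the numbers only illustrate the effect, not actual Kalina operating points.

import math

R = 8.314  # gas constant, J/(mol*K)

def p_sat(T, Tb, dHvap):
    """Clausius-Clapeyron saturation pressure (Pa), referenced to the
    normal boiling point Tb (K) at 101325 Pa."""
    return 101325.0 * math.exp(-dHvap / R * (1.0 / T - 1.0 / Tb))

# Approximate pure-component data: (normal boiling point K, heat of vaporization J/mol)
NH3 = (239.8, 23.4e3)
H2O = (373.15, 40.7e3)

def bubble_T(x_nh3, P=101325.0):
    """Temperature at which an ideal NH3/H2O liquid starts to boil."""
    lo, hi = 200.0, 400.0
    for _ in range(60):  # bisection
        T = 0.5 * (lo + hi)
        p = x_nh3 * p_sat(T, *NH3) + (1.0 - x_nh3) * p_sat(T, *H2O)
        if p > P:
            hi = T   # too hot: mixture already boiling
        else:
            lo = T
    return 0.5 * (lo + hi)

def dew_T(y_nh3, P=101325.0):
    """Temperature at which the last drop of liquid evaporates."""
    lo, hi = 200.0, 400.0
    for _ in range(60):
        T = 0.5 * (lo + hi)
        s = y_nh3 * P / p_sat(T, *NH3) + (1.0 - y_nh3) * P / p_sat(T, *H2O)
        if s > 1.0:
            lo = T   # too cold: vapor would condense
        else:
            hi = T
    return 0.5 * (lo + hi)

x = 0.3  # 30 mol% ammonia
print(f"boiling begins near {bubble_T(x):.0f} K and ends near {dew_T(x):.0f} K")
# A pure fluid boils at one temperature; the mixture boils over a range,
# which lets the Kalina cycle track a sliding heat-source temperature.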

In this Webinar, Frank Di Bella, CN's Program Manager, Large Product Development and Corporate Fellow, will review the ammonia-water thermodynamics that are associated with each of the principal Kalina components. He will also help identify the advantages and design considerations involved in the specification of these principal components with the primary objective of making more power from the same waste heat source.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

ORC Heat-to-Power Systems



In recent years, the worldwide market for clean, renewable electricity has skyrocketed. Organic Rankine Cycle (ORC) heat-to-power systems are part of this surge in clean electricity generation. Unlike the intermittent power production resulting from wind and solar photovoltaic plants, ORC systems can continuously produce renewable electricity day and night. Rather than using nuclear or fossil fuels, ORC systems are powered by industrial waste heat, geothermal, biomass or other clean heat sources.

In this Webinar, Keith D. Patch, CN's ORC Product Manager, will cover the basics of ORC technology and the application of ORC systems to various renewable heat sources and markets. He will then discuss the broad applicability of CN's new patent-pending CN300 ORC System, which ranges from high-temperature industrial waste heat recovery to low-temperature binary geothermal sites.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

ENEA's participation in the European project to develop new generations of heat pumps with low environmental impact



ENEA is one of the partners in the European project NxtHPG, "Next Heat Pump Generation working with natural fluids", which aims to build new heat pump prototypes that use natural refrigerants for a lower environmental impact. The project partners include universities and research centers as well as leading heat pump manufacturers from several European countries. The project aims to develop a safe, reliable, high-efficiency system based on propane and CO2 heat pumps. The project's annual meeting was recently held at the ENEA Casaccia Research Centre to discuss the various phases of designing and building five new heat pump prototypes.



ENEA is directly involved in the design and construction of the two CO2 prototypes: the first, rated 30 kW, for producing domestic hot water, and the second, rated 50 kW, intended for space heating as a replacement for traditional gas boilers in existing radiator-based systems.

The entire experimental campaign for the two CO2 prototypes will take place inside the new calorimeter built by ENEA and just inaugurated at the Casaccia Research Centre, whose internal dimensions allow testing of heat pumps with a thermal output of up to about 50 kW.
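
For a rough sense of why the 50 kW radiator-replacement prototype is the harder target, consider the ideal (Carnot) heating COP, which falls as the delivery temperature rises. The supply temperatures below are typical illustrative values, not ENEA data.

def carnot_heating_cop(t_hot_c, t_cold_c):
    """Ideal coefficient of performance for heating:
    COP = T_hot / (T_hot - T_cold), temperatures in kelvin."""
    t_hot = t_hot_c + 273.15
    t_cold = t_cold_c + 273.15
    return t_hot / (t_hot - t_cold)

# Radiators need hot water (~60 C); underfloor heating runs cooler (~35 C).
for supply in (60.0, 35.0):
    print(f"{supply:.0f} C supply from 5 C outdoor air: "
          f"ideal COP = {carnot_heating_cop(supply, 5.0):.1f}")
# Real CO2 heat pumps reach only a fraction of the ideal figure, but the
# trend holds: the higher the delivery temperature, the harder the job.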







Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Big Step for Next-Generation Fuel Cells and Electrolyzers



Researchers at Berkeley and Argonne National Labs Discover Highly Promising New Class of Nanocatalyst



A big step in the development of next-generation fuel cells and water-alkali electrolyzers has been achieved with the discovery of a new class of bimetallic nanocatalysts that are an order of magnitude higher in activity than the target set by the U.S. Department of Energy (DOE) for 2017. The new catalysts, hollow polyhedral nanoframes of platinum and nickel, feature a three-dimensional catalytic surface activity that makes them significantly more efficient and far less expensive than the best platinum catalysts used in today’s fuel cells and alkaline electrolyzers. This research was a collaborative effort between DOE’s Lawrence Berkeley National Laboratory (Berkeley Lab) and Argonne National Laboratory (ANL).



“We report the synthesis of a highly active and durable class of electrocatalysts by exploiting the structural evolution of platinum/nickel bimetallic nanocrystals,” says Peidong Yang, a chemist with Berkeley Lab’s Materials Sciences Division, who led the discovery of these new catalysts. “Our catalysts feature a unique hollow nanoframe structure with three-dimensional platinum-rich surfaces accessible for catalytic reactions. By greatly reducing the amount of platinum needed for oxygen reduction and hydrogen evolution reactions, our new class of nanocatalysts should lead to the design of next-generation catalysts with greatly reduced cost but significantly enhanced activities.”



Peidong Yang is a chemist and leading authority on nanomaterials who holds joint appointments with Berkeley Lab, UC Berkeley and the Kavli Energy NanoSciences Institute at Berkeley.

Yang, who also holds appointments with the University of California (UC) Berkeley and the Kavli Energy NanoSciences Institute at Berkeley, is one of the corresponding authors of a paper in Science that describes this research. The paper is titled “Highly Crystalline Multimetallic Nanoframes with Three-Dimensional Electrocatalytic Surfaces.” The other corresponding author is Vojislav Stamenkovic, a chemist with ANL’s Materials Science Division, who led the testing of this new class of electrocatalysts.

Fuel cells and electrolyzers can help meet the ever-increasing demand for electrical power while substantially reducing the emission of carbon and other atmospheric pollutants. These technologies are based on either the oxygen reduction reaction (fuel cells) or the hydrogen evolution reaction (electrolyzers). Currently, the best electrocatalyst for both reactions consists of platinum nanoparticles dispersed on carbon. Though quite effective, the high cost and limited availability of platinum make large-scale use of this approach a major challenge for both stationary and portable electrochemical applications.

SEGUE SECONDA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

SECONDA PARTE

“Intense research efforts have been focused on developing high-performance electrocatalysts with minimal precious metal content and cost,” Yang says. “In an earlier study, the ANL scientists showed that forming a nano-segregated platinum skin over a bulk single-crystal platinum/nickel alloy enhances catalytic activity but the materials cannot be easily integrated into electrochemical devices. We needed to be able to reproduce the outstanding catalytic performance of these materials in nanoparticulates that offered high surface areas.”

Yang and his colleagues at Berkeley accomplished this by transforming solid polyhedral bimetallic nanoparticles of platinum and nickel into hollow nanoframes. The solid polyhedral nanoparticles are synthesized in the reagent oleylamine, then soaked in a solvent, such as hexane or chloroform, for either two weeks at room temperature, or for 12 hours at 120 degrees Celsius. The solvent, with its dissolved oxygen, causes a natural interior erosion to take place that results in a hollow dodecahedron nanoframe. Annealing these dodecahedron nanoframes in argon gas creates a platinum skin on the nanoframe surfaces.

“In contrast to other synthesis procedures for hollow nanostructures that involve corrosion induced by harsh oxidizing agents or applied potential, our method proceeds spontaneously in air,” Yang says. “The open structure of our platinum/nickel nanoframes addresses some of the major design criteria for advanced nanoscale electrocatalysts, including high surface-to-volume ratio, 3-D surface molecular accessibility, and significantly reduced precious metal utilization.”



In electrocatalytic performance tests at ANL, the platinum/nickel nanoframes when encapsulated in an ionic liquid exhibited a 36-fold enhancement in mass activity and 22-fold enhancement in specific activity compared with platinum nanoparticles dispersed on carbon for the oxygen reduction reaction. These nanoframe electrocatalysts, modified by electrochemically deposited nickel hydroxide, were also tested for the hydrogen evolution reaction and showed that catalytic activity was enhanced by an order-of-magnitude over platinum/carbon catalysts.

“Our results demonstrate the beneficial effects of the hollow nanoframe’s open architecture and surface compositional profile,” Yang says. “Our technique for making these hollow nanoframes can be readily applied to other multimetallic electrocatalysts or gas phase catalysts. I am quite optimistic about its commercial viability.”

SEGUE TERZA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

TERZA PARTE

Other co-authors of the Science paper in addition to Yang and Stamenkovic are Chen Chen, Yijin Kang, Ziyang Huo, Zhongwei Zhu, Wenyu Huang, Huolin Xin, Joshua Snyder, Dongguo Li, Jeffrey Herron, Manos Mavrikakis, Miaofang Chi, Karren More, Yadong Li, Nenad Markovic and Gabor Somorjai.

This research was funded by the DOE Office of Science.

Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science. For more, visit www.lbl.gov.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more, visit www.anl.gov.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit the Office of Science website at science.energy.gov/.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Robot Plays Table Tennis



Industrial mechanic Ulf Hoffmann built a robot called UHTTR-1 that plays table tennis with the aid of its own camera. It doesn’t miss a shot; it's just not as aggressive as some players I’ve seen, at least not yet. But what a great practice partner!

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

3D Printed Tissue Integrated with Human Blood Vessel Cells



Artificially printed tissue may one day revolutionize how diseased organs are treated, fractured bones are fixed, and drugs are tested without human subjects. A great deal of progress has already been made toward this goal, but a major hurdle, that of integrating vasculature and bringing together different cell types into a functional whole, has significantly limited that progress. Now researchers at Harvard University report in the journal Advanced Materials on a new method of 3D printing constructs made of three different cell types, including ones that line blood vessel walls.



Multimaterial 3D printing can be achieved using four independently addressable printheads; a fluorescence image from the team shows a 4-layer lattice printed by sequentially depositing four PDMS inks, each dyed with a different fluorophore. Because any substantially large chunk of tissue requires oxygen to penetrate into its interior, the vessels within the construct allow much larger pieces of printed tissue to be created.

Some details from a Harvard news release:

To print 3D tissue constructs with a predefined pattern, the researchers needed functional inks with useful biological properties, so they developed several “bio-inks”—tissue-friendly inks containing key ingredients of living tissues. One ink contained extracellular matrix, the biological material that knits cells into tissues. A second ink contained both extracellular matrix and living cells.

To create blood vessels, they developed a third ink with an unusual property: it melts as it cools, rather than as it warms. This allowed the scientists to first print an interconnected network of filaments, then melt them by chilling the material and suction the liquid out to create a network of hollow tubes, or vessels.

The Harvard team then road-tested the method to assess its power and versatility. They printed 3D tissue constructs with a variety of architectures, culminating in an intricately patterned construct containing blood vessels and three different types of cells—a structure approaching the complexity of solid tissues.

Moreover, when they injected human endothelial cells into the vascular network, those cells regrew the blood-vessel lining.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

There's No Fooling This "Electric Eye"



Unlike many photoelectric sensors that don't always "see" transparent objects, this photoelectric device's "clear object detection" model spots transparent glass or plastic bottles, jars, and other objects. It also comes in reflective, polarized reflex, and through-beam sensing models. The IP66-rated sensor features a compact 38 mm x 13 mm package size. A potentiometer adjusts sensitivity and a visible red LED beam simplifies setup and alignment.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

The New F1 Cars Sound Like Living, Breathing Machines



One of the most visceral things about auto racing is the sound race cars make. It’s not the noise itself, though it is hard to deny the appeal of a V12 at full throttle, but the feeling it gives you. The rhythmic pulse of controlled explosions within an internal combustion engine reverberates through the body, giving you a sense of the power, precision, and risk involved in motorsports.

Different cars produce different sensations. Stand behind a top-fuel dragster when it launches and your entire body literally vibrates. A pack of Nascar racers sounds and feels like 1 million lawnmowers blasting past at 200 mph. And then there are Formula 1 cars, which sound like jets turned up to 11. There’s nothing quite like the sensory overload of an F1 car running flat-out.

Which is why so many F1 fans are so bent out of shape over the new technical regulations. The new cars simply don’t sound, or feel, like the old ones.

Beyond the change from naturally aspirated 2.4-liter V8s to turbocharged 1.6-liter V6s, the rules limit engines to “only” 15,000 RPM, down from 18,000 last year. Blown diffusers, which used exhaust gases to increase downforce and gave some cars a distinctive bark upon deceleration, are also banned. As a result, the cars are significantly quieter, so much so that many spectators will no longer need ear protection.

Though the new engines don’t scream quite so loud as before, they have their own charm. We suspect race fans will grow to love them with their turbo whine and rumbling exhaust.

We’ve only had the pre-season tests in Jerez and Bahrain to hear the new powerplants, so we’ll have to wait for next weekend’s race in Melbourne to hear what they sound like at race pace. But so far, we like what we hear. Yes, they aren’t as loud as the old engines, but there is a lovely burble and rumble that makes the engines seem like living things. And when drivers lift off the throttle, you can hear the high-pitched whine of the turbo screaming as the car takes a corner. There’s some personality behind that mechanical fury.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

New Stealth Spy Drone Already Flying



The latest top secret unmanned spy plane to be uncovered isn’t just a design idea; it’s already flying at the Air Force’s famed Area 51. Unlike the recently announced SR-72, the new RQ-180 from Northrop Grumman is believed to be in flight testing now, according to Aviation Week and Space Technology.

The RQ-180 is a new design aimed at intelligence, surveillance and reconnaissance (ISR, a.k.a. spying) and incorporates stealth technology, in addition to an efficient new design that’s tailored to flights over countries where the red carpet isn’t being rolled out for current U.S. spy drones.

It’s the successor to the Lockheed Martin RQ-170 Sentinel, known as the “Beast of Kandahar” for its countless missions out of Afghanistan since 2007. It is assumed the RQ-170 has flown missions over Iran and Pakistan, but the aircraft lacks the endurance of other unmanned aircraft, somewhat limiting its capabilities. Iran displayed what is claimed to be a captured RQ-170 in December 2011. The U.S. Air Force would only acknowledge that it lost control of an RQ-170 over western Afghanistan at the same time.

The new RQ-180 is thought to largely address the problem of flying in hostile airspace through improved stealth design and better aerodynamics. According to Aviation Week, the unmanned spy plane would allow the Air Force to expand ISR capabilities beyond the “permissive environments — such as Iraq and Afghanistan” where current drones such as the Global Hawk and Predator/Reaper operate. Instead, the RQ-180 would be able to fly undetected in airspace where the U.S. does not have permission and/or the protection needed to fly.

This denied airspace capability has been missing from the Air Force’s inventory since the speedy SR-71 retired in 1998. The Blackbird mainly relied on its speed and altitude — along with some stealth-like qualities — to fly over countries and gather intelligence where the U.S. was not welcome.

Aviation Week points to financial reports from Northrop Grumman that suggest the possibility of the new airplane, as well as satellite images of the company’s facility in Palmdale, California and Area 51 that show new hangars capable of holding aircraft with a wingspan of at least 130 feet — larger than a Boeing 737. When asked about the existence of the RQ-180, the Air Force told the trade publication that it “does not discuss this program.”

The use of unmanned aircraft for spying continues to rise year after year. But most of the work is done by slow-flying aircraft such as the Global Hawk and Predators. In addition to flying relatively slowly, these airplanes are also far from invisible to radar. Most of their use has been limited to flying over areas where manned fighter aircraft are able to control the skies, providing protection for the vulnerable drones.

The RQ-180 on the other hand is expected to have a stealth design with greatly improved aerodynamics giving it greater efficiency, which in the case of ISR work, translates to longer missions which could include longer transits to a target area, or more time over the target.

Northrop Grumman has also been publicly flight testing its X-47B unmanned combat aircraft, including take offs and landings from an aircraft carrier. The X-47B is aimed at combat as well as intelligence gathering, and is being developed for the Navy.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

This Drone Can Fly, Swim, Drive, and Hop Its Way Through a Mission



The future of military drones isn’t surveillance and dropping bombs. It’s transformation: a single unmanned vehicle that can fly, swim, drive, and even hop like a frog across a variety of terrains and obstacles.

Conceived by the Intelligent Systems, Robotics and Cybernetics unit at Sandia National Laboratories, the “Multi-Modal Vehicle Concept” would travel land, sea, and air by transforming itself to accommodate different terrains. Its wings become fins as it dives into water, or underwater paddles that shed casings to reveal wheels as it moves toward land — wheels with the ability to jump 30 feet into the air. An entire campaign could be conducted by a remote operator or, more likely, semi-autonomously.

As it stands now, carrying out a similar mission would require coordinating a team of unmanned aerial, undersea, and ground vehicles made by different manufacturers with different communications systems. It would take careful planning to make sure all vehicles are in place at the right time. But Sandia says that because the Multi-Modal Vehicle is designed modularly and works off one interface, it won’t be subject to those same hang-ups, and that it can adapt mid-mission as conditions change.

“The real value added [of the Multi-Modal Vehicle] is that it allows maximum flexibility in highly complex missions without the concern over whether or not all of the vehicles are positioned just right,” said Jon Salton, a Sandia engineer working on the project.

Sandia has such high aspirations for the Multi-Modal Vehicle that they say it might eventually be able to carry out missions usually reserved for Special Operations forces.

“[The Multi-Modal Vehicle] should at least be able to substantially enhance the capabilities of Special Ops,” said Salton.

Thus far, Sandia has built and conducted limited testing on conceptual hardware, designating it a “mature concept.” Next on the list is to secure funding for the prototype and approach industry partners to turn the concept into reality.

Multi-Modal Vehicle does have its limitations. Because it sheds parts and material as it transforms from one mode to another, recovery is almost impossible — making every mission an expensive one-way trip.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Cleanliness analysis



If it makes you think of dust bunnies and balls of fluff, you’re on the wrong track. The particles discovered in technical cleanliness analysis can be as small as 5 µm. In the automobile industry in particular, component cleanliness is a key factor in the manufacturing process: as systems become more and more efficient and therefore more sensitive, even a tiny particle in a brake hose, the gears, or the turbocharger can lead to damage or even to failure of the entire product. In close cooperation with automobile manufacturers, we therefore developed the Leica Cleanliness Expert, a complete system consisting of a microscope, digital camera and software for efficient cleanliness analysis.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Seeing Very Small Signs of Life



NASA has developed a nanoflow liquid chromatography technique designed to analyze minuscule dust particles from meteorites for key components of life. The chromatography instrument separates and sorts the molecules from a dust sample, then uses a spectrometer to analyze them for the presence of amino acids, the building blocks of proteins.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

No Coincidence



Thanks to a group of physicists in Germany, a tabletop version of a new experimental spectrometer may be available to laboratories very soon. Electron coincidence spectroscopy is useful in studying the electronic properties of surfaces, for example, in superconductors. But, the technique required light from a large-scale synchrotron facility — until now. The researchers have been able to develop a much more portable instrument that uses short pulses of photons from an ultraviolet source.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

LADEE's First Moon Images



The Lunar Atmosphere and Dust Environment Explorer (LADEE) is designed primarily to take samples of the moon's very thin atmosphere, but it is also equipped with wide angle camera systems called star trackers. LADEE recently took five pictures with its star trackers, revealing the surface of the moon during lunar night while it was bathed in Earthshine (sunlight reflected from the Earth's surface). While the star trackers aren't designed to take the most detailed pictures, they sometimes can capture the moon's unique features.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

New Innovative VIS-NIR Spectrophotometer



The ArcSpectro VIS-NIR spectrophotometer is ideal for measuring the diffuse reflectance of samples from 360 to 2500 nm. It combines a standard multichannel grating spectrometer with a scanning Fourier transform spectrometer that uses an extended InGaAs detector.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

New System Scans the Surface



A new deposition and analysis cluster from Oxford Instruments has a home at the James Watt Nanofabrication Centre in Glasgow, Scotland. The new system features a variety of tools and instruments designed to improve the energy efficiency of electronic and optoelectronic devices. The cluster includes equipment to conduct scanning Auger microscopy and scanning electron microscopy, providing high resolution images of micro- and nano-sized surface elements.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Spectrometer on a Chip Wins Award



Tornado Spectral Systems (TSS) has won a Prism Award for its spectrometer on a chip. The spectrometer uses the company's proprietary nanophotonics technology, which can reduce bench-top instruments to the size of a microchip. Developers of the award-winning spectrometer envision it will have a future in industrial non-destructive testing and healthcare applications as diverse as dentistry and ophthalmology.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Resolving Complex Surface Structures



A new metrology system features two complementary techniques for non-destructive 3D surface profiling. The system combines confocal and interferometric optical profilers to provide the best of both technologies: high-definition confocal microscopy offers high lateral resolution, while interferometry offers sub-nanometer vertical resolution. The combination provides accurate surface analysis of materials and components.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Taking a Specimen's Temperature



The ASTM E1862 test methods cover procedures for measuring and compensating for reflected temperature when measuring the surface temperature of a specimen with an infrared imaging radiometer. These test methods may involve use of equipment and materials in the presence of heated or electrically energized equipment, or both.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Formula 1 will spearhead greener cars for us all



FORMULA 1 motor racing is hardly a hotbed of green activism. But if anyone knows a thing or two about squeezing the maximum amount of kinetic energy out of a litre of fuel, it is a Formula 1 engineer. Now the sport's governing body has decided to put this expertise to good use. As of this weekend, Formula 1 cars will be limited to Ford Focus-sized engines, concentrating brilliant minds on fuel efficiency issues likely to be relevant to the real world of family cars. The motives are not entirely altruistic: mainstream engine-makers have drifted away from Formula 1, claiming that its challenges are "irrelevant" to their core business. Formula 1 needs them back.

But that is no reason to sneer. To meet the grand challenges of the 21st century, environmental concerns need to break out of their ghetto. If Formula 1 gets petrolheads fired up about fuel efficiency, so much the better.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

How to survive a nuclear bomb: An update on "Duck and Cover"



The best advice for surviving a nuclear bomb is to be somewhere else when it goes off. If that doesn't work out for you, though, a recent study carried out at the US Department of Energy's (DOE's) Lawrence Livermore National Laboratory (LLNL) provides some simple guidance for maximizing your chances of survival.

In a world where terrorists or small states may be able to muster a 1-10 kiloton nuclear attack, a prudent individual might find it worthwhile to decide on a course of action ahead of time. LLNL's Michael Dillon had been studying nuclear shelters for many years when his family asked what they should do if they saw a nuclear mushroom cloud rising over their city. Not having a good answer, he began putting together survival models. It turns out that the decades-old advice to shelter in place is not necessarily the best plan for survival.

The first step in surviving is making it through the initial detonation. A credible threat is a 5 kiloton pure fission explosion that detonates in a building at a height of 60 m (200 ft). The energy of the explosion is distributed between blast (about 50 percent), thermal radiation (about 35 percent) and ionizing radiation (about 5 percent in the initial burst, and about 10 percent in fallout.)

The expected death toll from prompt effects, such as blast, heat, and the initial burst of radiation, depends on the city and location within the city. Prompt effects from such a blast in New York City would kill nearly a quarter million people, while if the same blast were centered on downtown Albuquerque (a spread-out small city with a well-defined downtown), the prompt death toll would be about 15,000.

Assuming one has survived these early effects (although you may not be sure about survival until a couple of weeks have passed), the next objective is to avoid being killed by radiation exposure from the fallout. Exposure to radiation is measured in a variety of units; here we will use rems (Roentgen equivalent man) and rems/hr. The lethal dose is about 500 rems. (Another commonly used unit for radiation exposure is the sievert (Sv), which is equal to 100 rem.)

The fireball is 760 ft (230 m) in diameter, large enough to touch the ground, producing substantial local fallout. The mushroom cloud climbs for a period of about five minutes, reaching an altitude of about 3 miles (4.8 km) and a diameter of about 2 miles (3.2 km). The total amount of radioactive material resulting from a 5 kt explosion is only about 1 lb (0.5 kg), but that material is extremely radioactive.

SEGUE SECONDA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

SECONDA PARTE

Fortunately, the fallout is not evenly spread. The most intense radiation products for this size of blast remain within the prompt kill zone of the bomb. Assuming a wind speed of 10 mph (16 km/h), the fallout is distributed over a narrow downwind plume. The area in which the radiation exposure rate is between 100 and 200 rem/h reaches about 4 mi (5.4 km) downwind of the detonation, but is only 0.33 mi (0.54 km) wide. For a rate of 10-100 rem/h, the region is 15 mi (24 km) long and 1.5 mi (2.4 km) wide, and for a rate of 1-10 rem/h, it is 26 mi (42 km) long and 2.8 mi (4.5 km) wide.

Should I stay or should I go?

Now comes the question: What will you do? The official US government guidance is to shelter in place. You go to the nearest and most protective building and stay there for 24 hours unless told to evacuate sooner.

This isn't bad advice if your immediate shelter is the basement of a more or less intact house, which can reduce radiation levels by a factor of ten or so. However, if the blast occurs in Los Angeles rather than in New York, the lack of a frost line allows most houses to be built without basements. Such houses only block about half the fallout radiation.

Taking into account the decay rate of the fallout, a location with an initial exposure rate of 200 rem/h (about the highest dose rate for fallout from a 5 kt device) will receive a total radiation exposure of over 600 rem in the first 24 hours. If sheltering in a NYC concrete basement, a person's exposure in this period would be about 60 rem, an exposure having little immediate health consequence. However, in an LA ranch house, over 300 rem would be absorbed in that same 24 hour period, which would prove an eventually lethal dose for a substantial number of victims receiving little or no medical care, particularly if combined with flash burns and blast debris injuries. It would appear that sheltering in place is not necessarily the best advice, depending on local circumstances.
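
Those 24-hour figures can be reproduced with the classic Way-Wigner t^-1.2 fallout-decay approximation. The sketch below assumes a 200 rem/h reference rate at one hour and exposure starting 30 minutes after detonation; these are plausible assumptions on my part, not parameters stated in the study.

def cumulative_dose(r1, t_start, t_end, protection_factor=1.0):
    """Integrated dose (rem) under the Way-Wigner approximation
    R(t) = r1 * t**-1.2, with r1 the exposure rate (rem/h) at t = 1 h.
    The antiderivative of t**-1.2 is -5 * t**-0.2."""
    unshielded = r1 * -5.0 * (t_end ** -0.2 - t_start ** -0.2)
    return unshielded / protection_factor

R1 = 200.0  # rem/h at one hour: roughly the worst fallout rate for 5 kt
for label, pf in [("in the open", 1.0),
                  ("LA ranch house (protection factor ~2)", 2.0),
                  ("NYC concrete basement (protection factor ~10)", 10.0)]:
    dose = cumulative_dose(R1, 0.5, 24.5, pf)
    print(f"{label}: ~{dose:.0f} rem in the first 24 hours")
# ~620, ~310, and ~62 rem: consistent with the "over 600 rem",
# "over 300 rem", and "about 60 rem" figures above.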

SEGUE TERZA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

TERZA PARTE

Another factor to consider: if the wind direction is constant, it shouldn't take long to leave the worst of the fallout plume by walking perpendicular to it. It also takes time for radioactive material to fall to the ground, even near the blast site. This can provide a short period during which radiation exposure is not the greatest concern. Instead, dodging falling buildings, debris, and fires is likely the biggest risk in this early post-blast environment. Still, there is time and opportunity to take a different course than simply sheltering in place.

It is this opportunity that Mike Dillon decided to analyze. He developed a complex mathematical model describing the radiation exposures associated with different post-blast behaviors, then simplified the results so they can be used by anyone to choose a survival strategy.

The key factor is how long it would take to get to an adequate shelter. For a 5 kt blast, adequate shelter is essentially either a standing multi-story building (shelter is best in the mid-upper floors), near the center of a large concrete or brick building, or in a structurally sound basement.

If it would take you less than five minutes to reach adequate shelter, go there immediately following the immediate blast effects. In the end, the better shelter will offset your brief exposure to higher levels of radiation.

If adequate shelter is less than 15 minutes away, shelter in place for no more than 30 minutes, then transfer to the better shelter. Again, the combination of a short period of immediate shelter combined with a move to better shelter will offset your exposure during the move (at least statistically).
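
Expressed as code, the rule of thumb reads roughly as follows; note that the final branch (staying put when good shelter is far away) is an inference from the shelter-in-place default, not an explicit rule quoted above.

def shelter_strategy(minutes_to_adequate_shelter):
    """Dillon's simplified guidance, as summarized above."""
    if minutes_to_adequate_shelter <= 5:
        return "Go to the adequate shelter immediately after the blast."
    if minutes_to_adequate_shelter <= 15:
        return ("Shelter in place for no more than 30 minutes, "
                "then move to the adequate shelter.")
    # Not stated explicitly above; inferred from the official default.
    return "Stay in the best shelter you can reach right away."

print(shelter_strategy(3))
print(shelter_strategy(12))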

The choice of a 5 kt nuclear device was intended as something that could be made by terrorists or small states with a minimum of sophisticated design. As a result, a person positioned more than a mile from the detonation has a pretty good chance of surviving the attack. Even the areas over which large amounts of fallout land are quite small, with the 100 rem/h contour taking in about 1.5 sq mi (3.8 sq km).

The story would be quite different if a Minuteman or Trident-class weapon were stolen, as these have yields in the general vicinity of 300 kt. Such a device would have a prompt kill range of about 5 miles (8 km), causing the deaths of about a million people in the Empire State Building scenario. The 500 rem/h fallout contour includes some 50 sq mi (130 sq km), and the 100 rem/h region covers some 400 sq mi (1,036 sq km). Despite the enormous size differences, Dillon's rules still can help. It is just that many more people would die of fallout regardless of their course of action.

Should a nuke ever be detonated in a city anywhere in the world, the results would obviously be enormous, but of similar dimensions to the largest weather-related disasters and earthquakes. We would survive, but it would be a blow never to be forgotten. Let's hope we never go there.



Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Robot elephant trunk learns motor skills like a baby



I AM in Jochen Steil's lab, grasping a segmented, whiplashing tentacle that resists and tries to push me away. It feels strangely alive, as though I am trying to throttle a giant alien maggot. In fact, I am training a bionic elephant's trunk to do real-world jobs like picking apples or replacing light bulbs – something non-experts haven't been able to do until now.

Designed to bring the dexterity of an elephant's trunk to industrial robots, the appendage I am wrestling was launched by German engineering firm Festo as a proof-of-concept in 2010. The design showed that a trunk formed of 3D-printed segments can be controlled by an array of pneumatic artificial muscles.

But beyond a handful of motions, such as shaking hands – including once with German chancellor Angela Merkel – or grasping a bottle, the machine wasn't built with its own precision control software. "They deliver it without much control. You can try, but the arm will be centimetres from where it should be, which is no good," says Steil, an intelligent systems engineer at Bielefeld University, also in Germany.

That means people who aren't robot experts wouldn't be able to train it to carry out simple tasks, limiting its potential usefulness in the real world. But now Steil and his colleague Matthias Rolf have changed all that, as they told a human-robot interaction conference in Bielefeld last week.

They used a process called "goal babbling", thought to mimic the way a baby learns to grab things by continually reaching – a process of trial and error that lets them work out which muscles they need to move. Similarly, the robot remembers what happens to the trunk's position when tiny changes are made to the pressure in the thin pneumatic tubes feeding the artificial muscles. This creates a map that relates the trunk's precise position to the pressures in each tube.
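
The idea fits in a few lines. Below is a toy, one-dimensional sketch of goal babbling; a noisy linear "trunk" stands in for the real pneumatics, and every name and number is invented for illustration, not taken from the Bielefeld implementation:

    import random

    def trunk_position(pressure):            # unknown plant, hidden from learner
        return 0.8 * pressure + 0.05 * random.uniform(-1, 1)

    samples, p = [], 0.0
    for _ in range(200):                     # babble: small random pressure steps
        p = min(max(p + random.uniform(-0.1, 0.1), 0.0), 1.0)
        samples.append((p, trunk_position(p)))

    def pressure_for(target):                # crude inverse map: nearest neighbour
        return min(samples, key=lambda s: abs(s[1] - target))[0]

    print(pressure_for(0.4))                 # pressure that should reach x = 0.4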

The trunk can now be manually forced into a series of positions and learn to adopt them on command – in other words it can now be trained to repeat actions and pluck anything from light bulbs to hazelnuts.

I can vouch for that: as I move the bionic trunk in Steil's lab into different positions it initially resists, but then yields and follows my movement. The next time I try to push it to the same spot, it moves easily, because the behaviour has been learned. The robot now has muscle memory – which makes it seem even more alive.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

The patented procedure is extremely useful in this application.



Clean Parts – More Reliable and Longer Lifetime



In the automotive industry, the technical cleanliness of function-critical individual and system components has become an increasingly critical criterion for reliability and service life. This trend is also reflected in ISO/DIS 16232 (road vehicles – cleanliness of components of fluid circuits). Microscope systems with corresponding analytical software enable efficient and reliable residual dirt analysis of injectors, pumps, control units and other micromechanical components.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Beyond Graphene: More "Miracle Materials" Coming



As important as graphene's discovery has been, it was like the starting gun. Introduction of other miracle materials is accelerating, partly due to processes improved during graphene's development. Just months ago, tin sheets measuring one-atom thick were touted as the next super material. Now, two more potential superstars emerge. Cornell University has announced "few-layer phosphorene," a 2D atomic-sheet structure like graphene, but with conductive and insulative properties graphene doesn't exhibit. Also, scientists have unveiled artificial graphene composed of semiconductor crystals instead of carbon. Researchers believe this could lead to nanomaterials with "tunable properties."

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

World's Biggest Solar Plant Opens



It features almost 180,000 heliostats, and it's big enough to be seen from orbit by satellites. It generates almost 400 MW of power, even while using less water than two holes at a nearby golf course, but it's also killing birds at an alarming rate. The Ivanpah Solar Electric Generating Station is the largest concentrating solar power facility in the world. And after starting construction almost eight years ago, it's finally online. The 3,500 acre California facility is a collaboration between NRG, Google, BrightSource Energy, and the U.S. Department of Energy.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Nanotube-based Cables Four Times Better Than Copper



Could carbon nanotube-based fibers be used to build lighter high-voltage power lines? Tests show these fibers have four times more power-handling capacity than copper cables of the same mass. Science Daily says the findings could be especially helpful in aerospace applications, where lightweight cables are crucial, and that the cables could even be used to power unmanned aircraft from the ground.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

UK Power Goes Under



ABB is diving into the offshore waters of the UK by agreeing to install a submarine AC power cable system for the Dudgeon offshore wind farm near Norfolk. The $58 million project features two 132 kV, three-core cables, each 42 km long, connecting the wind farm's offshore substation to Weybourne Hope.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

More Power from Light Winds

Would a 20% increase in low-wind performance improve your bottom line? Two Rutgers researchers have developed a blade deflector that takes advantage of tangential forces wasted by typical blades. Laboratory-scale models confirm the computer simulations, and the technology was recently licensed to a commercial technology vendor. Even a modest 5% power increase translates to a 20% profit improvement, as the sketch below illustrates.
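
That leverage is simple margin arithmetic when operating costs are fixed. A toy calculation with assumed numbers (ours, not the researchers'):

    # With fixed costs, a small revenue gain falls straight through to profit.
    revenue, costs = 100.0, 75.0             # assumed 25% profit margin
    profit = revenue - costs                 # 25.0
    boosted = revenue * 1.05 - costs         # 5% more power sold
    print((boosted - profit) / profit)       # 0.20 -> a 20% profit gain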



Impact of Wind Turbines on Radar Returns



Wind turbines located near a radar installation can significantly interfere with the radar's ability to operate properly. Remcom has used its tools and expertise in radar scattering to perform a number of research efforts into the impact that wind turbines and wind farms have on radar returns for Air Traffic Control (ATC) radar, early warning radar, weather radar, and instrumentation radar.



Distributed Wind — What's Holding It Back?



Net metering brought distributed solar energy to the grid, but distributed wind generation has been less successful. According to Renewable Energy World, the primary barriers to neighborhood wind are noise, appearance, and durability. Noise control is possible, but requires tradeoffs between efficiency, blade speed, and storm resistance. Appearance may be the real hurdle, since turbine towers rarely blend into the landscape.



Wind Turbine Slip Rings



United Equipment Accessories' wind turbine slip rings are engineered to perform up to 75 million revolutions. UEA slip rings offer design versatility and a wide selection of circuitry. Our engineers will work with you on your specific wind turbine application to provide the highest performing quality slip ring available!



Reliable Fiber Optics Solutions for Wind Turbines

Key applications for industrial fiber optic components in wind turbine systems include: power electronic gate drivers for rectifiers and inverters, control and communication boards, turbine control units, condition monitoring systems, and wind farm networking.



Blade Flaps Yield SMART Results



Fixed airfoil blades may soon be a thing of the past. Tests of the SMART rotor — three 9 m turbine blades with trailing edge flaps along 20% of the blade length — promise improved strain relief and vibration damping capabilities. In one Sandia National Labs test, blade load reductions of 14% were noted, although actuation times were challenged by delays in the actuator mechanisms and electronics.



Cold Climate Considerations



Cold climates, with their high winds and residential energy requirements, can be very attractive to wind farm operators. Unfortunately, icing is a major threat to blades, towers, and even data cabling and transmission lines. With 57% of new generating capacity planned for ice-prone regions, North American Clean Energy expects the International Energy Agency to increase its focus on cold-climate research.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Graphene smart contact lenses could give you thermal infrared and UV vision



A breakthrough in graphene imaging technology means you might soon have a smart contact lens, or other ultra-thin device, with a built-in camera that also gives you infrared "heat vision." By sandwiching two layers of graphene together, engineers at the University of Michigan have created a graphene imaging sensor that is ultra-broadband (it can capture everything from visible light all the way up to mid-infrared) — but more importantly, unlike other devices that can see far into the infrared spectrum, it operates well at room temperature.

As you probably know by now, graphene has some rather miraculous properties — including, as luck would have it, a very strong effect when it’s struck by photons (light energy). Basically, when graphene is struck by a photon, an electron absorbs that energy and becomes a hot carrier – an effect that can be measured, processed, and turned into an image. The problem, however, is that graphene is incredibly thin (just one atom thick) and transparent — and so it only absorbs around 2.3% of the light that hits it. With so little light striking it, there just aren’t enough hot carrier electrons to be reliably detected. (Yes, this is one of those rare cases where being transparent and super-thin is actually a bad thing.)

Zhaohui Zhong and friends at the University of Michigan, however, have devised a solution to this problem. They still use a single layer of graphene as the primary photodetector — but then they put an insulating dielectric beneath it, and then another layer of graphene beneath that. When light strikes the top layer, the hot carrier tunnels through the dielectric to the other side, creating a charge build-up and a strong change in conductance. In effect, they have created a phototransistor that amplifies the small number of photons absorbed by the top layer (gate) into a large change in the bottom layer's (channel's) conductance. In numerical terms, raw graphene generally produces a photoresponse of a few milliamps per watt of light energy (mA/W) — the Michigan phototransistor, however, is around 1 A/W, or around 100 times more sensitive. This is around the same sensitivity as the CMOS silicon imaging sensors in commercial digital cameras.
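
For a sense of scale, the two responsivities can be compared directly; the "few milliamps per watt" baseline below is our assumption:

    raw_graphene_A_per_W = 0.01              # assumed ~10 mA/W for raw graphene
    phototransistor_A_per_W = 1.0            # the Michigan device, as quoted
    print(phototransistor_A_per_W / raw_graphene_A_per_W)   # ~100x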

The prototype device created by Zhong and co. is already "smaller than a pinky nail" and can be easily scaled down. By far the most exciting aspect here is the ultra-broadband sensitivity — while the silicon sensor in your smartphone can only register visible light, graphene is sensitive to a much wider range of wavelengths, from ultraviolet at the bottom all the way to far-infrared at the top. In this case, the Michigan phototransistor is sensitive to visible light and up to mid-infrared — but it's entirely possible that a future device would cover UV and far-IR as well. There are imaging technologies that can see in the UV and IR ranges, but they generally require bulky cryogenic cooling equipment; the graphene phototransistor, on the other hand, is so sensitive that it works at room temperature. Now, I think we can all agree that a smartphone that can capture UV and IR would be pretty damn awesome — but because this is ultra-thin-and-light-and-efficient graphene we're talking about, the potential, futuristic applications are far more exciting. For me, the most exciting possibility is building graphene imaging technology into smart contact lenses. At first, you might just use this data to take awesome photos of the environment, or to give you night/thermal vision through a display built into the contact lens. In the future, though, as bionic eyes and retinal implants improve, we might use this graphene imaging tech to wire UV and IR vision directly into our brains.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Smarter HyQ robot squat-jumps and does flying trots



HyQ is the latest attention-getter in robotics, a torque-controlled robot that takes its name from its being a "Hydraulically actuated Quadruped." The robot is from the Department of Advanced Robotics at the Istituto Italiano di Tecnologia (Italian Institute of Technology or IIT). A video that demonstrates their progress includes notes that "there are no physical springs in the legs or body of HyQ; all compliance results from active adjustment of stiffness and damping by software."

A detailed look at HyQ in Friday's IEEE Spectrum says that though HyQ has been introduced before, when its engineers taught the system some moves, a smarter HyQ is back "with even more tricks." Claudio Semini, the team leader, provided a history of their work on his web page, saying, "After extensive testing of a first leg prototype in 2008 and 2009, the first prototype of HyQ was operational in 2010. Since then, we have been gradually improving the robot hardware and software and we implemented several locomotion modes." Namely, the team has given HyQ the ability to move with more versatility and maintain greater autonomy. The quadruped can walk, trot, and run, and is showing a range of motion skills that make it possible to negotiate difficult conditions.

HyQ's motion skills range from planned motion over uneven terrain to highly dynamic motions. Highlights include walking over uneven terrain, balancing under disturbances, surviving being slammed into by a boxing bag, performing a flying trot and squat jumps, and demonstrating adjustable stiffness and damping.

IIT's project notes on the HyQ said that each leg features three degrees of freedom, two in the hip (abduction/adduction and flexion/extension) and one in the knee (flexion/extension). The leg is built of a light-weight aerospace-grade aluminium alloy and stainless steel. High resolution encoders and load cells in each joint allow a smooth control of position and torque.

The researchers also presented a to-do list that will translate the robot's motion capabilities into practical applications. The HyQ team wants the robot to accomplish activities such as search and rescue, operating in contaminated areas, firefighting, and, for scientists, serving as an experimental platform for researchers exploring legged locomotion, biomechanics, force control and autonomous navigation.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Cheaper by the Dozens



Thanks to a new microscopy technique, scientists could soon generate snapshots of dozens of different biomolecules at once in a single human cell. Using super-resolution microscopy, combined with special fluorescent proteins that tag cellular components, Harvard University researchers captured clear images of 10 different types of minuscule DNA origami structures in one image. This technique, called Exchange-PAINT, gives biologists an important new tool to understand how multiple cellular components work together in complex pathways.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Pushing the Limits of Spectroscopy



Scientists at major labs are praising the capabilities of the powerful new mass spectrometers recently introduced by Agilent. A development analyst at Netherlands-based Eurofins Analytico says the new 7900 ICP-MS "allows us to easily switch between trace element analysis of demineralized water to high total dissolved solid digests all in the same sequence".


Marco La Rosa ha detto...

DA DOTT. COTELLESSA

How to Chart DNA Repair



Left unrepaired, DNA damage can lead to cell death, so accurate measurement of DNA repair enzymes is essential to evaluating and predicting treatment for cancer and other diseases. NIST researchers have developed a novel approach for identifying and quantifying the APE1 enzyme in human cells. Their tool: liquid chromatography mass spectrometry with isotope dilution. This journal article provides the details on this methodology, which will be tested in future studies of patient samples.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Metabolomics is increasingly employed in oral cancer research. To study metabolic biomarkers of oral squamous cell carcinoma (OSCC) metastasis in cell lysates, a highly sensitive analytical platform coupling capillary ion chromatography (Cap IC) with the Q Exactive mass spectrometer has been developed.

Cap IC is a complementary separation technique that provides superior resolution and analytical sensitivity of polar metabolites, combined with high resolution and accurate mass measurement (HR/AM) capabilities to differentiate isobaric metabolites.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

GaN Replaces Silicon One Application at a Time

In this PowerPulse.Net article titled "GaN — Still Crushing Silicon One Application at a Time", Efficient Power Conversion Corp. CEO Alex Lidow examines the rapidly evolving trend of conversion from silicon based power MOSFETs to gallium nitride (GaN) transistors in emerging applications. These include highly resonant wireless power transfer, RF envelope tracking, and class-D audio amplifiers. This is just the start.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

SiGe Transistor Sets New Switching Speed Record

EDN reports that researchers from the IHP institute (Frankfurt an der Oder, Germany) and the Georgia Institute of Technology (Atlanta, Georgia) have jointly developed the world's fastest silicon germanium (SiGe) transistor. Made in Germany using a 130 nm BiCMOS process, it demonstrated an fmax of about 800 GHz at a temperature of 4.3 K and with a nominal breakdown voltage. At room temperature, the same transistor reached about half that maximum switching frequency.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Opening a new window into quantum physics


A team of University of Toronto physicists led by Alex Hayat has proposed a novel and efficient way to leverage the strange quantum physics phenomenon known as entanglement.

The approach would involve combining light-emitting diodes (LEDs) with a superconductor to generate entangled photons and could open up a rich spectrum of new physics as well as devices for quantum technologies, including quantum computers and quantum communication.

Entanglement occurs when particles become correlated in pairs to predictably interact with each other regardless of how far apart they are. Measure the properties of one member of the entangled pair and you instantly know the properties of the other. It is one of the most perplexing aspects of quantum mechanics, leading Einstein to call it "spooky action at a distance."

"A usual light source such as an LED emits photons randomly without any correlations," explains Hayat, who is also a Global Scholar at the Canadian Institute for Advanced Research. "We've proved that generating entanglement between photons emitted from an LED can be achieved by adding another peculiar physical effect of superconductivity - a resistance-free electrical current in certain materials at low temperatures."

This effect occurs when electrons are entangled in Cooper pairs – a phenomenon in which, when one electron spins one way, the other spins in the opposite direction. When a layer of such superconducting material is placed in close contact with a semiconductor LED structure, Cooper pairs are injected into the LED, so that pairs of entangled electrons create entangled pairs of photons. The effect, however, turns out to work only in LEDs which use nanometre-thick active regions – quantum wells.

"Typically quantum properties show up on very small scales – an electron or an atom. Superconductivity allows quantum effects to show up on large scales – an electrical component or a whole circuit. This quantum behaviour can significantly enhance light emission in general, and entangled photon emission in particular," Hayat said.



Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Concentrated solar power: new technological developments thanks to a new component patented by ENEA



In the field of high-temperature solar energy exploitation, known as concentrated solar power, ENEA is testing a new technological solution that allows both the storage and the transport of heat. It is a newly patented component that integrates a steam generator directly into the heat storage tank.

The thermal energy absorbed from the solar source is stored and kept for many hours at a temperature of 550°C thanks to the molten salt mixture contained in the tank. The steam generator, which is inserted in the tank and immersed in the molten salt mixture, absorbs heat from the salts themselves and uses it to produce steam. The steam is then sent to a turbine to generate electricity.
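
For a feel for the numbers, a back-of-envelope sensible-heat estimate; the salt mass, specific heat and cold-side temperature are illustrative assumptions of ours, not ENEA's specifications:

    # Stored heat = m * cp * dT for a molten "solar salt" tank.
    cp = 1.5e3          # J/(kg*K), approximate for NaNO3/KNO3 solar salt
    mass = 1.0e5        # kg of salt, assumed
    dT = 550 - 290      # K: hot salt at 550 C, assumed ~290 C cold side
    energy_J = mass * cp * dT
    print(energy_J / 3.6e9)   # ~10.8 MWh of stored heat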

The tests are carried out on a small-scale prototype at the Prova Collettori Solari (PCS) experimental facility, where components for solar power plants are tested under real operating conditions.

The experimental tests, which took place at the ENEA Casaccia Research Centre, were also attended by a delegation of researchers from the Fraunhofer Institute (Germany), an ENEA partner in several European research projects, who are interested in the development of the technology.

This is a further advance of the technology that allowed ENEA to develop the "Archimede Project", initially directed by Nobel laureate Carlo Rubbia: a system that concentrates direct sunlight onto a receiver tube by means of parabolic mirrors. A molten salt mixture is circulated inside this tube to transport the solar energy which, once stored, is used to produce steam and thus electricity.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Metal Layered Shrink Wrap Makes Fluorescent Markers Glow Brighter



Many current methods of detecting pathogens and biomarkers involve fluorescent particles that bind to their targets and can then be spotted with photodetectors while in an excited state. While this technique is continuing to revolutionize diagnostics and laboratory work, it is often limited by the faintness of the fluorescence when small numbers of particles are being detected. One way to improve the sensitivity of fluorescent markers is to make them produce a stronger signal. Researchers at the University of California, Irvine are reporting in the journal Optical Materials Express the development of a new technique that significantly improves the brightness of fluorescing nanoparticles.

The technique involves applying layers of gold and nickel onto shrink wrap, the kind you have in your kitchen. When heated, the material wrinkles and compresses, creating a flower-like structure. Samples of goat anti-mouse immunoglobulin antibodies were tagged with fluorescent markers and placed on top of the new material. When tested in a laboratory setting, the resulting fluorescence in the near-infrared range was three orders of magnitude greater than without the metal enhanced fluorescence.



This is the first demonstration of leveraging the plasmons in such hybrid nanostructures by metal enhanced fluorescence (MEF) in the near-infrared wavelengths. We observed more than three orders of magnitude enhancement in the fluorescence signal of a single molecule of goat anti-mouse immunoglobulin G (IgG) antibody conjugated to fluorescein isothiocyanate, FITC, (FITC-IgG) by two-photon excitation with these structures. These large enhancements in the fluorescence signal at the nanoscale gaps between the composite wrinkles corresponded to shortened lifetimes due to localized surface plasmons. To characterize these structures, we combined fluctuation correlation spectroscopy (FCS), fluorescence lifetime imaging microscopy (FLIM), and two-photon microscopy to spatially and temporally map the hot spots with high resolution.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

DARPA shows off three new concepts for its VTOL X-Plane



As far as DARPA is concerned, aircraft with vertical takeoff and landing (VTOL) capabilities are the future of versatile, efficient flight. The V-22 Osprey took a crack at it, but over the next decade, we'll need something better. Concept after concept has come across DARPA's desks, but none have measured up. The problem, as DARPA sees it, is that nobody has yet been able to create a faster-than-ever VTOL aircraft without significantly sacrificing the aircraft's usefulness and range.

To find a solution, DARPA put out an open call to designers, birthing the VTOL X-Plane project. Out of the pool of concepts that came their way, DARPA has rounded up a veritable A-team of aviation companies: Boeing, Sikorsky Aircraft, Aurora Flight Sciences, and Karem Aircraft. These four, awarded with prime contracts for Phase 1 of the VTOL X-Plane's development, still have a huge amount of work to do.

The VTOL X-Plane's final design will need to meet some impressive criteria. The plane will have to sustain speeds of 345-460 MPH, have a carrying capacity of at least 40 percent of its own weight, and increase hover efficiency while also reducing drag. Basically, in every way possible, the VTOL X-Plane needs to blow current tech out of the water.

As for the designs, all four companies chose to draw up plans for unmanned aircraft, but manned vehicles could be in the works as well. Since this is still the first phase of the aircraft's development, all DARPA has to show off are artists' concepts, and Aurora Flight Sciences hasn't even provided that much. Whatever the VTOL X-Plane eventually looks like, you can rest assured it's still got a long way to go (several years, at least) before we see it take flight.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Super-Cheap Paper Microscope Could Save Millions of Lives



Imagine if clinics in developing countries were equipped with an inexpensive yet durable tool that could help medical personnel identify and diagnose a variety of deadly diseases like malaria, Chagas disease, or leishmaniasis. For millions of people around the world waiting to be diagnosed and treated, such a tool could be a life-saver.

Manu Prakash, a professor at Stanford University, and his students have developed a microscope made from a flat sheet of paper, a watch battery, an LED, and optical units that, when folded together much like origami, create a functional instrument with a resolution of 800 nanometers – basically magnifying an object up to 2,000 times.

Called Foldscope, the microscope is extremely inexpensive to manufacture, costing between fifty cents and a dollar per instrument. And because the microscope is assembled primarily from paper and optical components the size of a grain of sand, it is virtually indestructible.

Foldscope also differs from the microscopes typically found in science labs because it's not only portable, but also able to project an image onto any surface, allowing a larger group of people to look at an image simultaneously.

Prakash hopes that because the Foldscope is so cheap to manufacture and easy to assemble, everyone will have access to the world of microscopy, and that one day every kid will have a Foldscope in their backpack or tucked away in their pocket.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

A Wind Turbine of Great Stature



Vestas' prototype V164-8.0 MW wind turbine recently generated its first power at the aptly named Danish National Test Centre for Large Wind Turbines. With a tip height of 220 m (721 ft), what's considered the world's most powerful wind turbine can provide electricity for 7,500 European households. The units are slated for production in 2015, primarily for offshore wind farms.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

"Bionic Hand" Incorporates Sensory Feedback



In what could fulfill a dream in the field of prosthetics, scientists have for the first time restored an amputee's sense of touch. Sensors in a prototype advanced bionic hand measure the tension in the artificial tendons that operate the fingers and other parts of the prosthesis. The sensors convert the measured forces into electrical currents, which computer algorithms then process into neural impulses fed to transneural electrodes surgically implanted in the patient's arm. This biofeedback lets the patient feel an object's shape as well as its hardness or softness.
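
The described chain, from tendon tension to nerve stimulation, can be caricatured in a few lines; every mapping and number below is an invented placeholder, not the actual clinical algorithm:

    # Toy feedback chain: tendon tension -> sensor current -> pulse rate.
    def tension_to_current_mA(tension_N, gain=0.5):
        return gain * tension_N              # idealized linear sensor

    def current_to_pulse_rate_Hz(current_mA, base=10.0, scale=8.0):
        return base + scale * current_mA     # simple linear encoding

    grip_N = 4.0                             # assumed tendon tension
    print(current_to_pulse_rate_Hz(tension_to_current_mA(grip_N)))  # 26.0 Hz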

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

About FlowCath®



FlowCath® is a technology that promises to make proton exchange membrane (PEM) fuel cells competitive with conventional electricity generators, such as diesel generators, by replacing the fixed platinum catalysts on the cathode with a liquid regenerating catalyst system. The liquid is continuously pumped through the fuel cell stack into an external regenerator and then back to the stack.

The technology reduces platinum content by up to 80% and simplifies the overall fuel cell system. As a consequence the technology not only radically reduces cost, it also improves durability and robustness of the system.

ACAL Energy's platinum-free FlowCath® technology addresses the inherent limitations of conventional PEM fuel cells by applying a completely novel, chemistry-based innovation.

Hydrogen is catalysed on the anode in the conventional fashion. However, unlike conventional technology, the electron and proton are absorbed into a solution containing redox catalyst systems, which flows continuously from the stack to an external regeneration vessel.

In the regenerator, the catholyte comes into contact with air, and the electron, proton and oxygen from the air react to form water, which exits the regenerator as vapor. The catholyte then flows back to the cell.

SEGUE SECONDA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

ACAL Energy's FlowCath® technology reduces the cost of the standard PEM-based fuel cell stack and system components by:

• Eliminating all platinum on the cathode
• Simplifying the balance of plant
• Removing the membrane and catalyst degradation mechanisms

Costs Below $40/kW

It is a well-known industry fact that the oxygen side of a PEM system causes no end of technical challenge; by using a redox liquid-based catalyst system to replace the platinum, FlowCath® reacts with oxygen outside the stack, so that it never comes in contact with the MEA, thereby simplifying the technical challenge of durability and reducing costs.

Although platinum is still required on the anode, the cathode represents up to 80% of the platinum used in a conventional stack, and at target costs/kW, Pt represents 20-25% of the total cost of a system; FlowCath® replaces this with a liquid solution that is orders of magnitude cheaper. The associated cost saving varies as Pt prices escalate, but is worth $8/kW at $1600/troy oz.

In-built durability

In practical use, the flowing redox liquid catalyst has proven exceptionally robust, possibly due both to the nature of the catalyst itself, which is in its homogeneous state when used in the system, and to the stack design. Unlike conventional stacks, the overall power performance is not dependent on the "weakest cell in the stack" and has been shown to be far more a matter of engineering and managing flow rates. The estimated saving from the inherent durability of the FlowCath® technology is worth $5/kW.
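
A quick sanity check on the quoted $8/kW figure; the cathode platinum loading is our assumption:

    pt_price_per_g = 1600 / 31.103           # $/g at $1600 per troy ounce
    cathode_loading_g_per_kw = 0.155         # g Pt/kW, assumed typical loading
    print(pt_price_per_g * cathode_loading_g_per_kw)   # ~ $8/kW saved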

SEGUE TERZA PARTE


Marco La Rosa ha detto...

DA DOTT. COTELLESSA

ACAL Energy's FlowCath® technology has been shown to enable costs below $40/kW for mass-market automotive applications; however, due to the technology's inherent ability to address the durability issues found in conventional PEM systems, the same system design can be applied to stationary power applications – presenting a ground-breaking opportunity to meet cost-down targets in applications where 40,000 hrs of operation is required.



Inherent durability



The oxygen (cathode) side of a PEM system is the root cause of most of the MEA degradation issues in platinum based stacks. By using a redox liquid based catalyst system, FlowCath® technology reacts with all available oxygen in a separate “regenerator” outside of the stack, with no peroxide formation, no agglomeration, no hot spots and the MEA constantly humidified.

Over 5000 hrs of a rigorous automotive test cycle, this phenomenal durability was borne out, with absolutely zero deterioration of cell performance observed. Under similar conditions a standard automotive MEA would see 28 µV/h of losses [1].



For stationary power applications, where the application demands constant full-power loads over 40,000 hrs (10 years of product lifetime), different MEA configurations are used, with thicker (more resistive) membranes and higher catalyst loadings, so that the equivalent decline is 7 µV/h and the stack is designed for a 12% loss of performance by end of life.

The compromise for this durability with conventional PEM technology is increased capital cost (higher upfront cost of system), increased operational costs (from reduced efficiency) and increased maintenance cost (to replace stacks during the system lifetime). None of these compromises are necessary with FlowCath®, saving over $500 in the upfront cost of a fully commercial 100kW system.

One power module, both applications

With inherent durability, FlowCath® technology for the automotive application, scaled at 20-100 kW, can be applied to stationary power with the same membrane thickness and the same efficiency. The catalyst system will remain durable for the life of the application, reducing maintenance time and loss of system availability. Not even hydrocarbon combustion technology can do this!

SEGUE QUARTA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Improved heat recovery

The low operating temperature of conventional PEM fuel cells limits the amount of heat that can be effectively utilized in combined heat and power (CHP) applications.



Instant response for demand management

A CHP or stationary generation system designed with FlowCath® technology will have a "reserve of oxidant", similar to a redox battery, so that response to load is not restricted by air supply from a compressor or blower, and it can provide power instantly. At times of high production and low demand, this immediacy is worth hundreds of dollars per kW per second for commercial-scale systems, particularly in conditions of tight electricity supply; used within smart grids and joined to renewable energy technologies, it will decrease peak prices and provide instant demand response. Since electrical generation and transmission systems are generally sized to correspond to peak demand (plus a margin for forecasting error and unforeseen events), lowering peak demand reduces overall plant and capital cost requirements.



Field Application & Demonstration systems

FlowCath® technology has been installed and has demonstrated 100s of hours of beneficial usage at the Solvay Interox Chemical Plant, UK, and will soon be installed at the UK's first open-access H2 refuelling station, located at the Honda of the UK Manufacturing site in Swindon.



Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Guardian 2000: the police drone



Goodbye, privacy. Here comes the "Guardian 2000", the drone that spies on you from the sky. The drone actually already flies in Italian skies, but it will be officially presented during the "Roma Drone Expo&Show" on the weekend of 24-25 May 2014 at the Stadio Alfredo Berra in Rome. The event includes workshops on "Drones and security", to which experts from the Armed Forces, the police, the State's armed corps and the Civil Protection have been invited. Guardian 2000 was built by AD Precision Mechanics (ADPM), a Rome-based startup; the drone is a remote-controlled mini-airplane on which a video camera or still camera can be mounted. The device can capture high-resolution footage from above, even at night. Built from lightweight materials, with a wingspan of almost 2 metres, it can reach a speed of 45 km/h and its battery lasts about 45 minutes.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Analyses of anticancer drugs by capillary electrophoresis: a review

Capillary electrophoresis is a fast, inexpensive technique with low detection limits for the analysis of anticancer drugs. It has been used to analyze various anticancer drugs in biological samples, pharmaceutical preparations and environmental matrices. It has also been used to detect various cancer biomarkers in cancer patients. The present article describes the state of the art of capillary electrophoresis for the analysis of anticancer drugs. The various drugs discussed belong to several groups, such as antimitotic agents, nucleoside analogs, antibiotics, topoisomerase inhibitors and DNA intercalating agents. In addition, efforts have also been made to discuss sample preparation, applications of capillary electrophoresis in genomic research, optimization and future perspectives.



Cancer is the second most lethal disease after cardiovascular diseases and is responsible for millions of deaths worldwide. About 178 drugs are used for treating different types of cancers in human beings. These drugs are classified as alkylating agents (nitrogen mustards, nitrosoureas), antimetabolites (5-fluorouracil, fludarabine), antibiotics (bleomycin, mitomycin), alkaloids (vinca alkaloids), anthracyclines (daunorubicin, doxorubicin) and taxanes (paclitaxel, docetaxel). In addition, about 10% of metal-based drugs are also either in use or in clinical trials. The majority of the drugs sold on the market are a combination of drug salts and excipients. The counter-ion may be organic (citrate, maleate, mesylate, succinate, salicylate, tartrate, gluconate, fumarate) or inorganic (hydrochloride, sulfate, phosphate, nitrate, bromide, carbonate) in nature, giving polarity to the drug molecules. The ionic nature of these drugs makes capillary electrophoresis (CE) the method of choice for quality control in the pharmaceutical industry. In addition, metal-based anticancer drugs can be separated easily owing to the positive charge on the metal ions. Generally, the concentration of anticancer drugs in human plasma is low owing to fast metabolism in cancerous cells. Therefore, CE may be a good choice for the determination of anticancer drugs, thanks to its low detection limits. In addition, CE is a very fast, economical and efficient technique. A thorough search of the literature was carried out, and it was observed that CE is becoming more popular among oncologists because of the above advantages. Thousands of impurities are present in biological samples along with the anticancer drugs of interest, which has compelled scientists to develop suitable sample preparation methods. Consequently, solid-phase extraction and semi-solid-phase extraction methods have been developed and used. During the literature search, it was observed that much work has been carried out on CE analyses of anticancer drugs; our observations and experience inspired this review article on the analyses of anticancer drugs by CE. The analysis of anticancer drugs in pharmaceutical formulations and biological samples is an integral part of developing safe and economical anticancer drugs. In addition, analyses of these drugs in biological samples by CE provide clues about drug dosing in human beings and the rate of metabolism. In view of these facts, it was considered worthwhile to describe the state of the art of analyses of anticancer drugs by CE. The present article describes various aspects, such as sample preparation, analyses, optimization, applications and future perspectives.

SEGUE SECONDA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Sample preparations

Sample preparation is an important aspect of the analysis of drugs in biological samples. Basically, samples collected from different sources cannot be loaded directly onto the system. A biological sample may contain thousands of amino acids and proteins along with other entities such as ions and plasma. Thus, it is essential to remove (through pre-concentration) these unwanted molecules from samples. Some methods are available to tackle this issue, such as liquid–liquid extraction (LLE), liquid–solid extraction (LSE) and solid-phase extraction (SPE). Pérez-Ruiz mixed acetonitrile (50 μL) with blood serum (50 μL) and vortexed; the mixture was centrifuged and the supernatant collected, diluted with deionized water and used for CE analyses. Jiang studied the concentration of carcinoembryonic antigen (CEA) in serum samples, which is characteristic of cancer patients. Normal human and cancer patients' plasmas were diluted 1 and 150 times with phosphate-buffered saline (PBS) before being injected into capillary electrophoresis hyphenated with chemiluminescence (CE-CL). Elhamili and Bergquist described an SPE method for sample preparation of imatinib in human plasma prior to loading onto capillary electrophoresis hyphenated with electrospray ionization time-of-flight mass spectrometry (CE-ESI-TOF-MS). The authors used a combination of LLE and strong cation exchange solid-phase extraction (SCX-SPE) for the extraction of the drug from human plasma. Briefly, plasma samples (both from patients and spiked with drug) containing d8 as internal standard were extracted using a mixture of 0.2 M NaOH and hexane–ethyl acetate. The organic phase was collected, dried and reduced to 250 μL. The samples were vortexed to precipitate out proteins and release the drug (imatinib), then diluted with ammonium acetate (NH4Ac) buffer (pH 4.5). C18 cartridges were first conditioned with 1.0 mL methanol followed by NH4Ac buffer to remove polar and nonpolar interferences, respectively. Finally, the analytes were eluted with 1.0 mL methanol–ammonia. Soliman analyzed prostate cancer biomarkers in human urine samples. The optimized SPE procedure included urine sample treatment with 0.1 M acetic acid and spiking with sarcosine standards. A 30 mg Strata-X strong cation mixed-mode cartridge was equilibrated and conditioned with methanol and acetic acid, respectively. The treated urine sample was then loaded onto the cartridge and drained under vacuum. Cartridges were washed with acetic acid and methanol and dried. The analytes were eluted with a methanol–ammonium hydroxide solution, and the eluted samples were concentrated to 20 μL. At this stage, the samples were ready to load onto the CE machine. Cheng reported SPE and triple-stacking CE for the analysis of methotrexate and its eight different metabolites in cerebrospinal fluid. The samples were denatured by adding perchloric acid, followed by vortexing and centrifugation. The extraction was carried out using an SPE cartridge conditioned with MeOH and water; the same mixture of solvents (water and methanol) was used to elute the analytes.

SEGUE TERZA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Analyses of anticancer drugs

After the administration of drugs into the human body, several transformations (oxidation, hydrolysis, etc.) take place. Some drugs are metabolized, leading to by-products; others are converted into active metabolites responsible for their pharmaceutical actions. Therefore, it is important to study the step-by-step transformation of these drugs in biological samples. Normally, drugs exert their actions via interactions with proteins. CE hyphenated with various detectors possesses unambiguous advantages over spectrophotometry in studying protein–drug interactions, owing to its ability to individually monitor the free and protein-bound drug species with relatively low limits of detection. Thanks to this benefit, hyphenated CE has made remarkable advances in the area of cancer research over the past few years. Analyses of some anticancer drugs in different biological samples are discussed below.

Biological samples

Normally, blood, urine, tissues, saliva and cerebrospinal fluid (CSF) are the most important matrices for determining the presence of any medicine. The same is true with anticancer drugs. CE analyses of some anticancer drugs in biological samples are discussed below.

SEGUE QUARTA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Blood

Blood is a carrier of all drugs, and provides the metabolic fate of a drug. Generally, the serum proteins are the first possible drug binders. Like sarcosine, CEA is a biomarker present in human serum. CEA is a glycoprotein associated with certain cancers, and the presence of this protein is an indication of a tumor in the body. As a specific tumor marker of colorectal cancer, the positive rate of CEA usually reaches as high as 70–90%. The CEA level in human serum is also related to lung and breast cancers. Therefore, the determination of the CEA level in human serum is important, providing an early diagnosis of the disease; in addition, it gives information on progression and on monitoring the patient's condition after therapy. Recently, Jiang reported a highly sensitive noncompetitive immunoassay based on gold nanoparticle (AuNP)-amplified CE-CL detection for the determination of CEA. The authors used AuNPs as an amplification platform for the immuno-reaction – a highly sensitive CL detection. The CE method comprised a 5 min running time with a detection limit of 0.034 ng/mL, and was used for the analysis of CEA in human serum. Yan-Ming developed a capillary electrophoresis hyphenated with chemiluminescence immunoassay (CE-CL-IA) method using AuNPs conjugated with antibody (Ab) to form a tagged antibody (Ab*). AuNPs have been used as protein labeling reagents owing to their excellent catalytic effect on the CL reaction of luminol and hydrogen peroxide (H2O2). Elhamili and Bergquist reported quantitative analysis of imatinib in human plasma by CE-ESI-TOF-MS. Nucleoside analogs form a class of drugs widely used in chemotherapy. Structurally, nucleosides consist of sugar and base moieties; they are the natural building blocks of DNA and RNA. Scientists have sometimes modified nucleosides (nucleoside analogs) to increase their potencies. Nucleoside analogs are metabolized through the same pathways as their natural counterparts, but block these pathways at advanced stages and hence disrupt the growth of rapidly growing cancerous cells; therefore, these analogs are also called antimetabolites. 6-Thioguanine (6-TG) and 6-mercaptopurine (6-MP) are well-known anticancer drugs. They serve as prodrugs and require intracellular activation to thiopurine nucleotides to exert cytotoxicity. Tomkova reported a fast and selective capillary electrophoresis hyphenated with ultraviolet detection (CE-UV) method for the determination of TPMT enzyme activity in erythrocytes. The authors employed a range of buffers, e.g. an alkaline buffer (pH 8.7–9.7) consisting of 80 mmol/L borate (pK 9.2) titrated with 2-amino-2-methyl-1-propanol (pK 9.7). Under these conditions, the standard samples were easily separated.

SEGUE QUINTA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

As per the authors, the best separation was achieved at high pH. The limit of quantitation (LOQ) of the method was 1.5 µmol/L, and its advantages include freedom from interference by other metabolites. Połeć-Pawlak studied the role of the counter-ion (indazolium and sodium ions) of ruthenium-based anticancer drugs (KP-1019 and KP-1039) in binding with human serum proteins (albumin and transferrin) using capillary electrophoresis hyphenated with inductively coupled plasma mass spectrometry (CE-ICP-MS). Further, this was used to specifically monitor changes in metal speciation following the formation of ruthenium–protein adducts. The authors observed that both drugs showed similar activities, with statistically comparable apparent rate constants regardless of the nature of the protein, indicating that changing the counter-ion did not affect the binding ability of the given ruthenium-based drugs. Dömötör monitored the human serum albumin (HSA) displacement reaction of KP-1019 (competitive binding of warfarin and KP-1019 with HSA) by CE (20 mM phosphate buffer, 150 mM NaCl, detection at 310 nm). The authors studied displacement reactions with the established site markers warfarin, dansylglycine and bilirubin. Gallium maltolate (GaM), tris-(8-quinolinolato)gallium(III) (KP46) and some gallium(III) complexes with hydrazones and thiosemicarbazones are used as anticancer agents. Rudnev studied drug parameters such as solubility (in water, saline and 0.5% dimethyl sulfoxide) and stability (against hydrolysis) using CE. In addition, the reactivities towards the major serum transport proteins (albumin and transferrin) were evaluated to elucidate the drug distribution pathways. It was observed that tris-(8-quinolinolato)gallium(III) (KP46) and bis-(2-acetylpyridine-4,4-dimethyl-3-thiosemicarbazonato-N,N,S)gallium(III) tetrachlorogallate(III) (KP1089) bound to transferrin faster than to albumin, implying that transferrin mediates the accumulation of gallium antineoplastic agents in solid tumors. The hydrolytic stability was also good, with the complexes retaining integrity at a level of 50% for several hours. The gallium-based drug KP46 is known to bind to transferrin and HSA. Groessle studied the binding of a gallium nitrate formulation and KP46 to transferrin and HSA by CE-MS. The authors concluded that both Ga(III) and Fe(III) bind to transferrin and HSA, but Fe(III) binds more efficiently, so Ga(III) is displaced by Fe(III) from the binding pocket. Alkylating agents cause DNA damage-induced apoptosis and are widely used in cancer chemotherapy. However, the efficiency of chemotherapeutic agents that work by damaging DNA is strongly reduced by DNA repair systems. Inhibition of nucleotide repair enzymes can be used in combination with anticancer alkylating agents to allow dose reduction and improvement of therapeutic efficiency. Temozolomide (TMZ) induces aberrant alkylation of nucleotides and triggers rounds of mismatch repair, leading to cell death. Alkylation at the O6 position of guanine is particularly cytotoxic and mutagenic for cells, despite the fact that O6-methylguanine accounts for only a small fraction of all alkylation events. TMZ is spontaneously hydrolyzed at physiological

SEGUE SESTA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

pH to the active component 3-methyl-(triazen-1-yl)imidazole-4-carboxamide (MTIC). Kishida studied O6-methylguanine–DNA methyltransferase (MGMT) promoter methylation in glioblastomas and its correlation with enzymatic activity using capillary electrophoresis (an 8-capillary CEQ8000 system). The CE approach used in this study provided higher resolution of the separated fragments, thereby yielding more objective and quantitative values; quantitative methods provide better discrimination than classical gel-based MSP. Andrasi described the analysis of TMZ and its degradants, MTIC and 5-amino-imidazole-4-carboxamide (AIC). Using short-end injection, the analysis of TMZ and its degradants could be performed within 1.2 min. The obtained precision (RSD) of migration times was better than 1.6%, and the LOQ was 0.31–0.93 µg/mL. The study established that the half-life of TMZ in serum in vitro at room temperature was 33 min, close to the half-life (28 min) obtained in water at pH 7.9. As discussed above, Sha used a fluorescence resonance energy transfer–capillary electrophoresis (FRET-CE) system to quantify the effects of camptothecin, etoposide (cell-cycle specific) and cisplatin (cell-cycle non-specific) on caspase-3 activation in HeLa cells. A decade earlier, Soetebeer had reported a sensitive and selective assay for the simultaneous determination of etoposide and etoposide phosphate in human plasma by capillary electrophoresis with UV laser-induced native fluorescence detection. It was the first method for the simultaneous quantification of etoposide and etoposide phosphate in the plasma of a 12-year-old patient. For the reported method, coefficients of correlation were better than 0.998 (200 µg/L to 50 mg/L in plasma for etoposide and 100 µg/L to 20 mg/L for etoposide phosphate). The experiments were carried out using 150 mM tetraborate buffer adjusted to pH 10.6 with 150 mM NaOH. The limits of detection (LOD) were 100 and 30 µg/L for etoposide and etoposide phosphate, respectively; thus, the sensitivity was improved about 30-fold compared with CE-UV. The limits of quantification were 0.2 and 0.1 mg/L for etoposide and etoposide phosphate, respectively.

SEGUE SETTIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Urine

Urine is also an important matrix in which to identify anticancer drugs. Carter extracted tamoxifen and its metabolites from the urine of patients suffering from both metastatic (stage IV) and locally confined (stages I–III) breast cancers. Analysis of these metabolites was performed by non-aqueous CE-ESI-MS. Peak heights from extracted ion current electropherograms of the metabolites were used to establish a metabolic profile for each patient. The authors demonstrated substantial variation among patient profiles, with statistically significant differences in the amount of urinary tamoxifen N-oxide found in stages I–III compared with stage IV breast cancer patients. Similarly, a statistically significant difference in the amount of 3,4-dihydroxytamoxifen was found in progressors as compared with non-progressors with metastatic (stage IV) cancer. Theodorescu used a CE-MS method to obtain polypeptide patterns from urine samples of 46 patients with urothelial carcinoma and 33 healthy volunteers. Fibrinopeptide A is an indirect measure of thrombin, which is associated with resistance to chemotherapy in leukemia patients; thus, the concentration of this factor might be an indicator of chemotherapeutic resistance in urothelial and gastric cancers, and several authors (Bergen; Ebert) have reported an increase in its level. Chen investigated the metabolic profile of urine metabolites and elucidated their clinical significance in patients with colorectal cancer using CE-LIF. Sarcosine (an N-methyl derivative of glycine) is a potential marker of prostate cancer, used to detect the disease in human urine samples; it has been observed that the concentration of this biomarker increases significantly as the cancer progresses to metastasis. Soliman developed a simple, fast and effective CE-ESI-MS/MS method for the determination of sarcosine and other potential biomarkers in pooled urine. CE separation was performed on a positively charged, polyethyleneimine-coated capillary using 0.4–2% formic acid in 50% methanol. The method had the advantage of not requiring an extraction procedure for analyzing high-concentration urinary metabolites. Until 2012, only one work was available on the simultaneous determination of methotrexate, 6-TG and 5-fluorouracil (5-FU) by CE; the authors used 45 mM borate buffer as background electrolyte (BGE; pH 9.0) and, as per the authors, the method was best suited for the analysis of methotrexate (MTX) in human urine. Ifosfamide is a nitrogen mustard alkylating agent used in the treatment of cancer. Ifosfamide is metabolized in the human body to chloroacetaldehyde (CAA), a toxic compound causing neurotoxicity, nephrotoxicity, urotoxicity and cardiotoxicity. CAA interacts with cellular thiol groups, leading to glutathione (GSH) depletion, cell death and generation of thiodiglycolic acid (TDGA), which can be detected in the urine of patients. Samcová reported a sensitive CE method for the determination of TDGA in urine; as per the authors, it could be used for the determination of 0.5–5 mg/L TDGA in 10-fold diluted urine samples without pre-treatment.

SEGUE OTTAVA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Tissues

Tissues are the most common source for cancer detection, and the metabolic fate of anticancer drugs is best analyzed in cancer tissues owing to their fast biological activity. 5-FU is a pyrimidine-analog chemotherapeutic drug used for the treatment of a wide variety of solid tumors in various organs, including the breast, pancreas, stomach, colon and rectum. Capillary electrophoresis is a sensitive technique that has been shown to be capable of separating 5-FU and its metabolites and of determining 5-FU in microdialysates obtained from breast cancer tissue. Procházková used CE to analyze 5-FU and its active metabolite (5-dUMP) in a pancreatic cancer cell line (PANC-1) as well as in plasma, culture medium and pancreatic tissue. Sun studied the interaction between 5-FU and HSA by affinity capillary electrophoresis (ACE) at physiological pH. For the best separation, 67 mM phosphate buffer (pH 7.4), 293 K and a 15 kV voltage were used. It was found that the migration time of 5-FU increased with increasing concentration of HSA. Azacytidine is used for the treatment of myelodysplastic syndrome and the clinical management of acute myeloid leukemia. Azacytidine (5-azacytidine) and decitabine (2′-deoxy-5-azacytidine) are two closely related cytidine analogs with increasing clinical use in cancer therapy. Both compounds are prodrugs that are metabolized to the corresponding nucleoside triphosphates before their incorporation into DNA or RNA. Brueckner reported the increased therapeutic potential of azacytidine after esterification with a fatty acid (elaidic acid). The authors studied the demethylating activity of the esterified drug on different human cancer cell lines [human colon carcinoma cells (HCT116), the breast cancer cell line MCF-7 and human promyelocytic leukemia cells (HL-60)] using the same method. Caspases play important roles in cell apoptosis. Measurement of the dynamics of caspase activation in tumor cells not only facilitates understanding of the molecular mechanisms of apoptosis but also contributes to the development, screening and evaluation of anticancer drugs that target apoptotic pathways. Sha used FRET-CE to study the effects of camptothecin, cisplatin and etoposide on the activation of caspase-3 in a human cancer cell line (HeLa-CD3). The authors used phosphate buffer (KH2PO4, Na2HPO4 · 12H2O; pH 8.0) as BGE. Haselberg used CE-TOF-MS for the analysis of erlotinib–lysozyme conjugates in cellular materials. The BGE used for the study was 100 mM acetic acid (pH 3.1) containing 5% (v/v) isopropanol, with a separation voltage of 30 kV. The CE-TOF-MS of each preparation showed narrow, symmetrical peaks for the reaction products. Intercalating agents such as the anthracyclines were isolated from a pigment-producing Streptomyces and have been used for more than 30 years. To date, more than 200 naturally occurring anthracyclines have been identified; in addition, hundreds of derivatives have been synthesized in order to overcome the high toxicity and the multi-drug resistance of anthracyclines. The most severe side effect of anthracyclines is a cumulative, dose-related cardiotoxicity, commonly attributed to radical damage to the cardiac tissue. Doxorubicin (DOX) is a widely used anthracycline proven to be effective against a variety of human malignancies, such as leukemia and breast cancer.
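
In ACE studies such as Sun's, the increase in the analyte's migration time (i.e. the shift in its effective mobility) with ligand concentration can be turned into a binding constant via the standard 1:1 double-reciprocal treatment. A minimal sketch, with entirely hypothetical mobility shifts and HSA concentrations:

```python
import numpy as np

# Hypothetical ACE data: mobility shift of 5-FU at increasing HSA
# concentrations in the BGE (values invented for illustration).
L   = np.array([5, 10, 20, 40, 80]) * 1e-6        # HSA, mol/L
dmu = np.array([0.9, 1.6, 2.6, 3.6, 4.4]) * 1e-9  # shift, m^2/V/s

# 1:1 isotherm: dmu = dmu_max*K*L / (1 + K*L); in double-reciprocal form
# 1/dmu = 1/dmu_max + (1/(K*dmu_max)) * (1/L), a straight line in 1/L.
slope, intercept = np.polyfit(1 / L, 1 / dmu, 1)
dmu_max = 1 / intercept
K = intercept / slope                              # binding constant, L/mol

print(f"K ~ {K:.2e} L/mol, dmu_max ~ {dmu_max:.2e} m^2/V/s")
```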

SEGUE NONA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Anderson used capillary electrophoresis with laser-induced fluorescence detection (CE-LIF) to separate and detect 12 metabolites of DOX (in NS-1 cells treated with 25 μM DOX for 8 h), using 10 mM borate and 10 mM sodium dodecyl sulfate (SDS; pH 9.4) as BGE. It was found that DOX accumulates in different components of the cell, such as the nucleus, Golgi apparatus and lysosomes. It was also observed that all metabolites accumulated more in the nuclei than in the other organelles or the cytosol. Paclitaxel (a diterpene amide) was initially isolated from the bark of the Pacific yew (Taxus brevifolia) and shows unique antitumor and antileukemic activities. It has been shown to produce responses in patients with different types of cancer, such as cancers of the ovary, breast, lung, and head and neck, as well as malignant melanoma. Zhang developed a capillary electrophoresis method hyphenated with a diode array detector (CE-DAD) to evaluate the cytotoxicity of Na2SO3, MeHg, paclitaxel and CdCl2 on Caco-2 cells. The cytotoxicity evaluation and comparison were carried out using CE. The authors observed that paclitaxel was toxic at 20 μM on Caco-2 cells, while on SH-SY5Y cells its effective concentration was 10 μM. The CE method was fast, convenient and inexpensive, with a wide range of applications. In addition, the method has the potential to identify the primary toxic substances for comparison and drug screening.
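
Cytotoxicity comparisons of the kind Zhang reports (20 μM on Caco-2 versus 10 μM on SH-SY5Y) are usually read off a fitted dose-response curve. A minimal sketch of such a fit on synthetic viability data; the Hill model and all numbers are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical viability data (fraction of untreated control) vs dose in uM.
conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100])
viab = np.array([0.98, 0.95, 0.85, 0.62, 0.35, 0.15, 0.06])

def hill(c, ec50, n):
    """Two-parameter logistic: viability falls from 1 to 0 around ec50."""
    return 1.0 / (1.0 + (c / ec50) ** n)

(ec50, n), _ = curve_fit(hill, conc, viab, p0=[5.0, 1.0])
print(f"EC50 ~ {ec50:.1f} uM, Hill slope ~ {n:.2f}")
```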

SEGUE DECIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Saliva

Saliva is an important biological fluid carrying out various functions, including lubrication for speech, digestion of food and protection from microorganisms. It is produced by multiple salivary glands, in particular three major glands (parotid, submandibular and sublingual) and several minor ones. Saliva comprises 99% water, with minerals, mucus, electrolytes, nucleic acids and proteins such as enzymes, enzyme inhibitors, growth factors, cytokines, immunoglobulins and other glycoproteins. Saliva, which originates from blood, reflects the physiological condition of the body; thus, it can be used to monitor clinical status and predict systemic diseases. Compared with blood, saliva offers distinct advantages for diagnostic or research purposes, such as cost-effective, safe, easy and noninvasive collection. Indeed, many characteristics of body fluids such as blood and urine also apply to saliva, including diurnal variation and the presence of diverse diagnostic analytes. Sugimoto studied saliva samples by CE-TOF-MS obtained from 215 subjects (69 oral, 18 pancreatic and 30 breast cancer patients, 11 periodontal disease patients and 87 healthy controls). This was the first study to comprehensively analyze salivary metabolites and to identify metabolic profiles specific to oral, breast and pancreatic cancers. Of the metabolite profiles obtained, the annotated metabolites included carnitines (betaine, choline, carnitine, glycerophosphocholine), polyamines (cadaverine and putrescine), a purine (hypoxanthine), amino alcohols (ethanolamine), an aliphatic and aromatic amine (trimethylamine) and other amino acids, in accordance with the chemical class categories defined in the Human Metabolome Database (HMDB). This suggests that cancer-specific signatures are embedded in saliva metabolites.
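
Studies like Sugimoto's reduce a large sample-by-metabolite intensity matrix to a few components before looking for cancer-specific patterns. A minimal, self-contained PCA sketch on synthetic data (not the study's data) to illustrate the idea:

```python
import numpy as np

# Toy intensity matrix: rows = saliva samples, columns = metabolites.
rng = np.random.default_rng(0)
controls = rng.normal(1.0, 0.1, size=(20, 15))
cancer   = rng.normal(1.0, 0.1, size=(20, 15))
cancer[:, :4] += 0.5                 # pretend four markers are elevated
X = np.vstack([controls, cancer])

# PCA via SVD of the autoscaled matrix.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = U * S                       # sample coordinates on the PCs

print("explained variance (PC1, PC2):",
      np.round((S**2 / (S**2).sum())[:2], 2))
# Plotting scores[:, 0] vs scores[:, 1] would show the two groups
# separating along PC1 in this toy example.
```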

SEGUE UNDICESIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Miscellaneous

In addition to the biological samples discussed above, anticancer drugs have been analyzed in other biological sources, such as human CSF and different parts of plants. Cheng reported triple-stacking capillary electrophoresis for the analysis of methotrexate and eight of its metabolites in cerebrospinal fluid. The buffer system chosen was 60 mM phosphate, pH 3.0, containing 15% tetrahydrofuran (THF) and 100 mM SDS. The authors observed an RSD of <11% with linearity in the range 0.5–7 μM, indicating good precision of the method. Truus analyzed the composition of varieties of Taxus plants for taxol by CE-DAD. Shakalisava and Regan reported the separation and identification of a mixture of a taxane and an anthracycline; in particular, a mixture of Doxil and doxorubicin was studied, with migration times of just over 6 min. The authors used 20 mM borate buffer (pH 9.0) as BGE with 70% acetonitrile (ACN). Camptothecin and its derivatives, such as the indolocarbazoles and indenoisoquinolines, as well as etoposide and adriamycin, specifically target topoisomerases. These anticancer agents bind to a transient topoisomerase-I–DNA covalent complex and inhibit the resealing of a single-strand nick. Camptothecin (CPT) is a cytotoxic quinoline alkaloid that inhibits the DNA enzyme topoisomerase I. The identification and quantitation of CPT in plant extracts and in in-vitro cell culture extracts are fundamental to assessing CPT contents and its biosynthetic potential in plants. Huang determined camptothecin from Fructus Camptothecae Acuminatae by CE-DAD. The determination was carried out with a running buffer composed of 30 mmol/L Na2B4O7 (pH 9.0) and a separation voltage of 15 kV, with a detection limit of 0.5 mg/L. Huang also developed a rapid, sensitive and selective microchip electrophoresis system coupled with chemiluminescence (MCE-CL) for the simultaneous quantification of 6-MP and 6-TG in human plasma. The electrophoretic electrolyte was 20 mM borate buffer (pH 9.6) containing 18 mM SDS, with a mixture of 40 mM sodium carbonate buffer (pH 12.0) and 1 mM NaBrO as the oxidizer solution. The linearity ranges were from 2.0 × 10−8 to 4.0 × 10−6 M for all thiol drugs, and the detection limits were good for both analytes (8.9 nM for 6-MP and 10 nM for 6-TG).
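
The detection and quantification limits quoted throughout this section are commonly estimated from a low-level calibration line as LOD = 3.3σ/slope and LOQ = 10σ/slope, with σ the standard deviation of the regression residuals. A minimal sketch with invented calibration data:

```python
import numpy as np

# Hypothetical low-level calibration of an analyte near its detection limit.
conc   = np.array([0.5, 1.0, 2.0, 4.0, 8.0])    # mg/L
signal = np.array([2.2, 4.1, 8.3, 16.0, 32.5])  # peak area, arbitrary units

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
sigma = residuals.std(ddof=2)   # sd of residuals (2 fitted parameters)

print(f"LOD ~ {3.3 * sigma / slope:.2f} mg/L, "
      f"LOQ ~ {10 * sigma / slope:.2f} mg/L")
```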

SEGUE DODICESIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Nonbiological samples

The role of CE is also very important in the quality control of anticancer drugs. CE has been used to quantify some anticancer drugs in pharmaceutical preparations. In addition, this technique has also been exploited to determine a few anticancer drugs in environmental matrices.

Pharmaceutical preparations

Thousands of samples are analyzed in the pharmaceutical industry during the development of anticancer drugs. The analysis of these drugs along with their impurities is an important aspect of quality control and drug efficacy. Nguyen and Murray developed a CE-ICP-MS method for the separation of free oxaliplatin from liposome-entrapped oxaliplatin. The simultaneous determination of both phosphorus and platinum was useful for monitoring liposomes (phospholipids) and platinum-based drugs in biological samples. CE with 10 mM HEPES [4-(2-hydroxyethyl)-1-piperazineethanesulfonic acid] buffer containing 5 mM NaCl (pH 7.5) provided a better liposomal peak shape. To improve the peak shape further, a small amount of SDS (1–5 mM) was added to the buffer solution; the addition of 1 mM SDS resulted in sharper liposome peaks. A detection limit of 29 ng/mL Pt was obtained with 2.9% precision, and the total concentrations of free and encapsulated oxaliplatin were determined. A PEGylated liposomal formulation of oxaliplatin was used as a model drug formulation, and CE-ICP-MS proved an efficient characterization method for development and quality control. Kim and Wainer reported a CE-LIF method for the detection of DOX and liposome-encapsulated DOX. The separation was carried out using potassium phosphate buffer (12.5 mM, pH 7.4) as the running buffer, with a limit of detection for DOX of 0.1 µg/mL. The validated method was successfully used to quantify DOX in human plasma by direct injection of a 4-fold dilution of spiked liposomal DOX. The authors carried out an accuracy check at four concentrations (0.1, 0.5, 5.0 and 100 µg/mL) and found a maximum relative error of <4%. These results are impressive, as CE-LIF can be used to separate and measure free and liposome-encapsulated DOX in both buffer and plasma. Chen synthesized ligands based on the alkaloid oxoglaucine (OG) and their transition metal complexes of Au(III), Zn(II), Ni(II) and Mn(II). The metal complexes were characterized using CE with a PBS running buffer (10 mM, pH 7.0, containing 30% methanol); the ligand (OG) had a migration time different from that of the Zn complex. Catharanthine and vindoline are indole alkaloids used as precursors of anticancer drugs: alkaloids such as vinorelbine, vincristine and vinblastine are produced by the coupling reaction of catharanthine and vindoline, and are recommended in the treatment of breast and advanced non-small-cell lung cancers. There are many CE methods for the analysis of alkaloids in plants and laboratory samples; these are beyond the scope of this article, which concerns anticancer drugs in biological samples and pharmaceutical preparations. Some reviews are available on CE analyses of alkaloids.

SEGUE TREDICESIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Environmental samples

The presence of anticancer drugs in the environment is an indication of cancer disease in the area. CE is also a useful technique for monitoring anticancer drugs in environmental matrices. Mahnik determined 5-FU, considered a potential cytotoxic environmental pollutant, in the effluents of the Vienna University Hospital. The authors detected 5-FU using 80% (160 mM) sodium borate buffer (pH 9.5) and 20% ACN (v/v) as BGE; the method was applicable within the range 5–500 µg/L. The same group used SPE-CE for the analysis of 5-FU in the water effluents of the same hospital, monitoring the drug over 98 days across 2 years. During this period, the concentration of 5-FU ranged from <8.6 to 124 µg/L.

Analyses in genomic samples

Several modifications at the genetic level lead to cancer, of which DNA modification is the most common. Hence, diagnosis at the genomic level will provide more clues about a cancer and help in the development of more effective anticancer drugs, and CE may be one of the most important and helpful methods here. For example, gemcitabine is one of the most widely used anticancer drugs, especially against solid tumors. Gemcitabine is metabolized to gemcitabine diphosphate and triphosphate, which are incorporated into DNA, resulting in inhibition of DNA polymerase activity. Schäfer analyzed the effect of gemcitabine on DNA demethylation. The authors used CE to determine the level of genomic DNA methylation, following the protocol reported by Stach, which can be used to quantify 5-methylcytosine in genomic DNA. Stach reported an accurate method to quantify DNA methylation levels by CE-LIF. The reported method was sensitive, requiring a small amount (<1 µg, reproducible down to 100 ng) of DNA. The working BGE was 75 mM SDS in 17 mM sodium phosphate buffer (pH 9.0), containing 15% (v/v) methanol. The authors used this method on samples from 81 chronic lymphocytic leukemia patients to determine cytosine methylation levels. Nguyen used an automated capillary DNA sequencer with laser-induced fluorescence detection; more specifically, the authors studied the electrophoretic mobility of 3′-termini (3′-hydrogen, 3′-hydroxyl, 3′-phosphate and 3′-phosphoglycolate).
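
Global methylation levels of the kind quantified by the Stach protocol are usually expressed as the fraction of cytosines present as 5-methylcytosine. A minimal sketch of that arithmetic, with hypothetical response-corrected peak areas rather than values from the cited work:

```python
# Corrected CE-LIF peak areas after DNA hydrolysis (hypothetical values).
area_5mC = 1.8    # 5-methylcytosine
area_C   = 42.0   # cytosine

methylation = area_5mC / (area_5mC + area_C) * 100
print(f"global methylation ~ {methylation:.1f}% of cytosines")
```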

SEGUE QUATTORDICESIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

The order of mobility was found to be 3′-hydroxyl < 3′-hydrogen < 3′-phosphate < 3′-phosphoglycolate. The reported method had several advantages, such as the analysis of long DNA sequences, high reproducibility, precision and sensitivity. Krylova studied DNA aptamers as potential inhibitors of AlkB-catalyzed DNA demethylation; the mechanism of inhibition was studied by CE-LIF. Chen studied the degree of DNA methylation by combined bisulfite restriction analysis and CE-LIF. It was the first CE-LIF study to detect differences in DNA methylation in different cancerous cell lines, and the authors reported a DNA detection limit of 30 pg. Nucleosides in urine are an important class of metabolites with the potential to serve as tumor markers. Modified nucleosides are considered indicators of whole-body RNA turnover and are excreted in abnormal amounts in the urine of cancer patients. Several studies have shown a positive relationship between nucleoside levels and cancer stage. Methylated purine, pyrimidine and other modified nucleosides have been shown to be excreted in abnormal amounts in the urine of cancer patients, and elevated concentrations have been suggested as possible markers of different forms and stages of cancer. It was concluded that the levels of nucleosides in the urine of cancer patients were generally elevated, and that the increase in modified nucleosides was more pronounced than that of normal nucleosides. In addition, the concentrations of pseudouridine (pseu), 1-methylinosine (m1I), N4-acetylcytidine (ac4C), 1-methylguanosine (m1G) and 2-methylguanosine (m2G) in the urine of cancer patients were significantly elevated; thus, all these nucleosides can be chosen as biomarkers. Around 15 years ago, Liebich reported a CE method for the separation of 15 normal and modified urinary nucleosides from cancer patients; the authors optimized the separation by controlling the voltage and the concentration of SDS in the BGE. In 1965, Barnett Rosenberg discovered the biological activity of platinum complexes while investigating the effect of electric fields on the growth of bacteria. Several platinum-based drugs have since been reported and successfully launched onto the market. The study of active metabolites is also an important issue for understanding the mechanism of action of a drug. It is well known that platinum complexes crosslink DNA strands; hence, an ideal analytical method is required. Warnke reported a method for the analysis of adduct formation between platinum complexes and DNA using CE-MS. The authors reacted a mixture of dAMP and dGMP with cisplatin and studied the reaction mixture after 7 h. Several modified complexes (mono- and bi-functional adducts) were observed by CE-MS, and all modified as well as unmodified nucleotides were separated within 11 min. The separation buffer was 32 mM ammonium acetate (pH 9.6), and the migration order of the unmodified nucleotides was dAMP > dCMP > dTMP > dGMP. Bleomycin is a glycopeptide antibiotic widely used as an antitumor drug. Its antitumor mechanism is thought to involve DNA-strand breaks, preferentially cleaving DNA at 5′-GT and 5′-GC dinucleotides. Bleomycin cleavage generates a DNA strand break with 3′-phosphoglycolate and 5′-phosphate ends. In order to analyze such fragments before the advent of automated DNA sequencers, manual detection methods (radiolabeling, slab gel electrophoresis) were used.
These traditional methods were difficult and tedious in terms of both labor and safety. Kao developed a CE-LIF method for the detection of amino acids and biogenic amines. The method was validated by determining the amino acids in lysates of a cancerous cell line (MCF-7 breast cancer cells) and of normal human epithelial cells (H184B5F5/M10). The authors reported the separation of 11 amino acids and three biogenic amines within 16 min, with LODs ranging from 2.06 to 19.17 nM and a 0.52% RSD of mobilities.


SEGUE QUINDICESIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Optimization of CE analyses

Separation by CE is very sensitive and is controlled by a number of parameters. The optimization factors fall into two categories: independent and dependent. The independent parameters are under the direct control of the operator and include the choice of buffer, buffer pH, buffer ionic strength, type of chiral selector, applied voltage, capillary temperature, capillary dimensions, BGE additives and some other parameters. The dependent parameters, on the other hand, are directly affected by the independent parameters and are not under the direct control of the operator. These include field strength (V/m), electroosmotic flow (EOF), Joule heating, BGE viscosity, sample diffusion, sample mobility, sample charge, sample size and shape, sample interaction with the capillary, and BGE molar absorptivity. CE optimization can therefore be controlled by varying all these parameters. The optimization of CE conditions in the analysis of anticancer drugs is summarized in the following sub-sections.
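
Several of the dependent parameters above follow from one textbook relation: the apparent mobility of a peak is mu_app = (Ld * Lt) / (t * V), and subtracting the EOF mobility (measured from a neutral marker) gives the effective electrophoretic mobility. A minimal sketch, with assumed capillary dimensions, voltage and migration times:

```python
# Standard CE mobility calculation: mu = (Ld * Lt) / (t * V).
Ld, Lt = 0.50, 0.60    # m: length to detector and total length (assumed)
V      = 25e3          # V: applied voltage (assumed)
t_eof  = 180.0         # s: migration time of a neutral EOF marker (assumed)
t_pk   = 240.0         # s: migration time of the analyte (assumed)

mu_app = (Ld * Lt) / (t_pk * V)    # apparent mobility, m^2/V/s
mu_eof = (Ld * Lt) / (t_eof * V)   # EOF mobility
mu_eff = mu_app - mu_eof           # effective (electrophoretic) mobility

print(f"mu_app = {mu_app:.2e}, mu_eof = {mu_eof:.2e}, "
      f"mu_eff = {mu_eff:.2e} m^2/V/s")
```

A negative effective mobility here simply means the analyte migrates against the EOF (e.g. an anion under normal polarity), arriving after the neutral marker.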

Effect of buffer composition

Normally, the migration time increases at high buffer concentrations. Flores reported the optimization of the separation of MTX and 6-TG using different concentrations of phosphate buffer. At low buffer concentrations (10–30 mM), the migration time was shorter but the resolution between 6-TG and MTX was poor; hence, a 45 mM buffer concentration was taken as the optimum. Sometimes the solubility of the analytes becomes a problem during analysis; in such cases, an organic solvent may be used to dissolve analytes that are insoluble in aqueous buffer. For example, Shakalisava and Regan used 70% ACN with buffer to dissolve docetaxel. Similarly, Whitaker observed that the addition of ACN to the BGE improved the stability of the CE separation. The incorporation of ACN into the BGE had been reported to reduce the interactions of anthracyclines with the capillary wall; the organic modifier also served to increase both the solubility and the stability of the anthracyclines. Jiang used a tris running buffer (20–50 mM) with surfactant (Tween 20, 0.01–0.13% v/v) for the analysis of Ab*–AuNPs and the CEA–Ab*–AuNPs complex. The authors observed that the separation first improved and then deteriorated at high tris concentrations; the resolution, on the other hand, reached its maximum with increasing surfactant concentration (at 0.067% v/v), with no further change on increasing the concentration beyond that point. Thus, 0.067% surfactant and 35 mM tris were used for further experiments. Sohajda studied the CE analysis of three isomeric vinca alkaloids (vincamine, vinpocetine and vincadifformine) using an acidic buffer system (15 mM NaOH, pH 2.5 with H3PO4). The authors optimized the separation of vincadifformine by testing different chiral selectors (α-, β- and γ-CDs) in varying amounts (0–75 mM). Only one peak, with a short migration time, appeared in the absence of a chiral selector in the BGE; with increasing selector concentration, better resolution was observed. It is important to mention here that the migration order of enantiomers is highly dependent on the type of chiral selector used.
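
The resolution figures these optimization studies trade off against migration time are conventionally computed as Rs = 2(t2 - t1)/(w1 + w2) from migration times and baseline peak widths. A minimal sketch with invented values for a 6-TG/MTX pair:

```python
# Resolution between two adjacent CE peaks (illustrative values only).
t1, w1 = 410.0, 9.0    # s: migration time and baseline width, peak 1
t2, w2 = 425.0, 10.0   # s: migration time and baseline width, peak 2

Rs = 2 * (t2 - t1) / (w1 + w2)
print(f"Rs = {Rs:.2f}")  # Rs >= 1.5 is usually taken as baseline resolution
```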

SEGUE SEDICESIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

For example, with HP-β-CDs and ME-β-CDs [(2-hydroxy)propyl-β-cyclodextrins and methylated β-cyclodextrins] the (+)-enantiomer of vincadifformine migrated first, followed by the (−)-enantiomer. The migration order reversed with HP-γ-CDs and ME-γ-CDs [(2-hydroxy)propyl-γ-cyclodextrins and methylated γ-cyclodextrins], owing to the stronger complexation of the former than the latter. Chen reported the simultaneous determination of vinblastine and its monomeric precursors (vindoline and catharanthine) in Catharanthus roseus by CE-MS. The authors studied various BGEs to achieve the best separation, choosing an acidic buffer (ammonium acetate and acetic acid) owing to the basic nature of the alkaloids. They then studied the effects of the ammonium acetate and acetic acid concentrations on separation, resolution and migration time. As the concentration of ammonium acetate increased, the migration times of all three components also increased; however, at higher concentrations, peak broadening was also observed. At the same time, migration time increased up to an AcOH concentration of 2.5% and decreased thereafter; 20 mM NH4OAc with 1.5% AcOH gave the best resolution and was selected as the optimum. Barthe reported a rapid method for the determination of vinca alkaloids by NACE-DAD, studying the effect of different MeOH–ACN ratios (ACN in MeOH ranging from 0 to 90%) on the electrophoretic mobility of the alkaloids. The variation of the methanol–acetonitrile ratio affected the electrophoretic mobilities, with the difference in mobilities most marked between 10 and 40% ACN in MeOH.

SEGUE DICIASSETTESIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Effect of pH

The pH of the running buffer plays an important role in the characteristics of the capillary wall surface and in the effective electric charge of the analytes. Flores studied the effect of pH on the separation of 5-FU, MTX and 6-TG; phosphate buffer (20 mM, pH 2.5–12) and borate buffer (20 mM, pH 8–10) were tested for this purpose. In acidic medium (pH 3.0), the migration velocities of 6-TG and MTX were lower than the EOF, while 5-FU co-migrated with it; in basic medium (pH 8.0), all three drugs had migration velocities lower than the EOF. The best separation was achieved with borate buffer at pH 9.2, with a migration order of MTX > 6-TG > 5-FU. It was observed that 5-FU (which exists in anionic form at higher pH) had a longer migration time. Whitaker studied the separation of anthracyclines (doxorubicin, daunorubicin and epirubicin) in the range pH 8–10. The best separation of DOX and epirubicin (EPI) was observed at pH 9.0; complexation with the borate buffer was stronger at pH 10.0 than at 8.0, and the greater complexation at pH 10.0 led to longer migration times. Considering this, Jiang carried out CE of an antigen and its complex with gold nanoparticles in alkaline media. The authors observed that the separation resolution increased as the running buffer pH was raised from 8.5 to 9.5 and decreased on further increasing the pH from 9.5 to 10.5. Barthe studied the electrophoretic mobility of vinca alkaloids by varying the pH of the BGE, using different amounts of AcOH (0.2–1.8 M) in 50 mM ammonium acetate and ACN–MeOH (25:75, v/v) mixtures. At comparatively high pH (a low amount of AcOH; 0.2 M), the electrophoretic mobility of vinflunine was low, while the remaining four alkaloids (vindoline, anhydrovinblastine, catharanthine and vinorelbine) had high mobility. The authors observed better mobility and resolution at 0.6 M AcOH; since mobility increased on lowering the pH, 0.6 M AcOH was selected as the optimum.
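
The pH behavior described for 5-FU follows directly from its acid-base equilibrium: the anionic fraction of a weak acid at a given pH is given by the Henderson-Hasselbalch relation. A minimal sketch, assuming a pKa of roughly 8.0 for 5-FU (an approximate literature value, not taken from the cited study):

```python
# Fraction of a weak acid present as the anion at a given pH.
def anionic_fraction(pH, pKa):
    return 1.0 / (1.0 + 10 ** (pKa - pH))

for pH in (3.0, 8.0, 9.2):
    print(f"pH {pH}: {anionic_fraction(pH, 8.0):.1%} anionic")
# At pH 3 the drug is essentially neutral and co-migrates with the EOF;
# at pH 9.2 it is mostly anionic and migrates after the EOF marker.
```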

SEGUE DICIOTTESIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Effect of temperature

The viscosity of the BGE decreases as the working temperature increases, which shortens the migration time of the analytes; the reproducibility of CE is also temperature-dependent. Temperature is therefore a crucial parameter in controlling a CE analysis. Flores studied the effect of temperature between 25 and 40°C on the analysis of 5-FU, MTX and 6-TG. At low temperature (25°C) the migration times were 12 min for 5-FU and ~10 min for MTX and 6-TG; the migration time decreased almost linearly with temperature and, hence, 35°C was selected as the optimal temperature. In some cases where a coupled CE-MS system was used, the temperature could not be optimized owing to the exposure of the capillary to air in the CE-MS instrument. Barthe studied the effect of temperature (10–30°C) on the resolution of vinca alkaloids, reporting that migration time decreased with increasing temperature but that no major change in separation was observed; 20°C was chosen as the experimental temperature, giving good resolution of anhydrovinblastine and vinflunine. Wang developed a CE genotyping method for the epidermal growth factor receptor (EGFR), a target of lung cancer therapy. The authors analyzed tumor tissues from 50 cancer patients and observed the two frequent mutations associated with therapeutic efficacy in lung cancer: a deletion in exon 19 (8/50) and the L858R point mutation in exon 21 (12/50). The separation was tested at 25, 30 and 35°C to evaluate the resolution; both mutations were successfully resolved at all investigated temperatures. However, lower temperatures resulted in longer separation times, so 35°C was selected as the optimum.
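
Since electrophoretic mobility scales roughly with the inverse of the BGE viscosity, the speed-up from 25 to 35°C can be estimated from handbook water viscosities. A crude sketch, using the 12 min figure for 5-FU quoted above:

```python
# Migration time scales roughly with viscosity (mu ~ 1/eta).
eta = {25: 0.890e-3, 35: 0.719e-3}   # Pa*s, water (handbook values)

t25 = 12.0                           # min, 5-FU migration time at 25 C (text)
t35 = t25 * eta[35] / eta[25]
print(f"expected migration time at 35 C ~ {t35:.1f} min")
# ~9.7 min, consistent with the near-linear decrease reported above.
```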

SEGUE DICIANNOVESIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Effect of voltage

Generally, high voltage reduces the analysis time in CE. Barthe studied the effect of voltage (10–30 kV) on the separation of vinca alkaloids. The authors observed that an increase in the applied voltage increased the separation efficiency up to a certain limit; voltages higher than 20 kV degraded the resolution, particularly for the anhydrovinblastine/vinflunine pair, with peak broadening occurring as a consequence of diffusion driven by Joule heating. A constant voltage of 25 kV was nevertheless selected for a complete separation of all compounds in a short time with an acceptable electric current (~70 μA). Wang studied mutated DNA fragments in samples obtained from 20 gastric cancer patients, examining the effect of different separation voltages (10, 13, 16, 20 and 25 kV) on separation efficiency. Different separation resolutions were obtained for different pairs of closely sized DNA fragments: the separation efficiency of small DNA molecules increased with increasing voltage, while for larger DNA molecules high voltage played an adverse role. Baseline separation of the 51 + 57, 57 + 64, 458 + 502 and 502 + 540 bp mixtures was achieved at both 10 and 13 kV; since DNA molecules migrate faster at higher voltage, 13 kV was selected as the optimal separation voltage. The two-point mutation in the genome studied by Wang was also affected by the separation voltage: the authors studied the effect of voltage at 6, 8 and 10 kV, and since higher voltage provided better results, 10 kV was chosen as the optimum. Chen varied the voltage from 10 to 25 kV for the optimized analysis of vinblastine and its precursors (vindoline, catharanthine). The analysis time was short with acceptable resolution over this range; the best resolution was achieved at 10 kV, but the migration time was very long (>25 min), while the shortest analysis time was obtained at 25 kV at the cost of poor resolution.
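
In the diffusion-limited regime, the plate number grows linearly with voltage, N = mu_app * V / (2 * D), which is why higher voltages improve efficiency until Joule heating takes over. A minimal sketch with assumed mobility and diffusion coefficient:

```python
# Diffusion-limited plate number: N = mu_app * V / (2 * D).
mu_app = 2.0e-8   # m^2/V/s, apparent mobility (assumed)
D      = 5.0e-10  # m^2/s, diffusion coefficient of a small molecule (assumed)

for V in (10e3, 20e3, 25e3):
    N = mu_app * V / (2 * D)
    print(f"V = {V/1e3:.0f} kV -> N ~ {N:.1e} plates")
# The formula ignores Joule heating; in practice broadening above ~20 kV,
# as seen for the anhydrovinblastine/vinflunine pair, caps the gain.
```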

SEGUE VENTESIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Effect of amount loaded

The amount of sample injected is also an important factor in achieving optimum separation in CE. Little work has exploited this parameter, but the attempts of some workers are discussed here. Kim and Wainer studied the effect of sample dilution on the separation of DOX and liposomal DOX. The authors reported poor resolution at high concentration, while good resolution was achieved by diluting the samples four-fold; the efficiency and reproducibility of the separation were good for the diluted samples. Flores assessed the linearity of the CE method by injecting diluted urine solutions spiked with the three drugs (6-TG, MTX and 5-FU) at concentrations ranging from 3 to 25 mg/L. The results were acceptable for biological applications for all three drugs, with good reproducibility. Yang studied the effect of injection time on peak height, observing that the peak heights (of 5-FU and TF) depended on the injection time; satisfactory separation was achieved at short injection times. Huang used CE-CL for the analysis of captopril in spiked human plasma and observed that the intensity of the captopril peak was high in the spiked sample owing to the high loading amount. Polyamines are important biomarkers for cancer detection in patients. Liu reported a CE-ECL method for the detection of polyamines in human urine samples; the ECL peak intensities increased linearly as the injection time was increased from 2 to 10 s, while improved resolution was obtained at lower injection volumes. Cathepsin D is a breast cancer biomarker used to detect cancer stages. Shihabi and Kute analyzed cathepsin D from different tissues and reported that, on increasing the sample loading from 2.5 to 12.5%, the detection sensitivity increased six-fold.
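
For hydrodynamic (pressure) injection, the loaded volume follows the Hagen-Poiseuille relation, V = dP * pi * r^4 * t / (8 * eta * L), which is how injection time translates into the amount loaded. A minimal sketch with assumed, typical instrument values:

```python
import math

# Hydrodynamically injected volume: V = dP * pi * r^4 * t / (8 * eta * L).
dP  = 3.45e3   # Pa (~0.5 psi injection pressure; assumed)
r   = 25e-6    # m, capillary inner radius (50 um i.d.; assumed)
t   = 5.0      # s, injection time
eta = 1.0e-3   # Pa*s, buffer viscosity (~water)
L   = 0.60     # m, total capillary length (assumed)

vol = dP * math.pi * r**4 * t / (8 * eta * L)    # m^3
print(f"injected volume ~ {vol * 1e12:.1f} nL")  # 1 nL = 1e-12 m^3
```

Doubling the injection time doubles the injected plug, which raises peak height at the cost of resolution, in line with the observations above.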

Future perspectives

From the above discussion, it is clear that CE methods are important techniques for anticancer drug development. The applications of CE in drug development include quality control in industry, the analysis of drugs in biological samples and the understanding of mechanisms of action. However, CE is not yet a fully mature technique: reproducibility and sensitivity are its major drawbacks. Several modifications have been made to improve them, and microchip-based nano-CE methods have been developed to overcome these limitations. There is still a great need to incorporate thermostatted chambers for the silica capillary into CE instruments. In addition, advances in CE include the understanding of drug interactions at the genomic level, the degradation products of genomic materials, the study of drug interactions with serum proteins, the separation of highly complex natural products and the fate of anticancer drugs in the environment. These developments will increase the reproducibility and sensitivity of the instruments. Additionally, the high cost of CE instruments is one of the major obstacles to their routine laboratory use, which manufacturers should take into account.

SEGUE VENTUNESIMA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Conclusion

Cancer is becoming one of the deadliest diseases worldwide, which is why scientists are working on the development of novel anticancer drugs. During drug development, time and cost are the most important factors to keep in consideration, and the application of CE can reduce both considerably. This article highlights the role of CE in anticancer drug development. Among the several analytical procedures employed, hyphenated CE is the method of choice owing to its speed and low cost, with low consumption of costly solvents, energy and labor. A range of detectors (UV, LIF, ESI-MS, capacitively coupled contactless conductivity detection, DAD) can be hyphenated with CE, making drug development easier. Moreover, CE can readily be used to understand the pharmacodynamics and pharmacokinetics of a drug. Limits of detection and quantification can be achieved down to the picogram level, and drugs from both synthetic and natural sources can be studied with ease. The method has been used in academia, industry and hospitals. In brief, CE is becoming popular for anticancer drug development, and we hope that in the future it will be the method of choice for researchers, academicians and oncologists analyzing anticancer drugs.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Bionic Olympics to be hosted in 2016



The first Cybathlon, an Olympics for bionic athletes, will take place in Switzerland in October 2016.

The event will include a race where competitors control an avatar via a brain interface.

There will also be races for competitors wearing prosthetic limbs and exo-skeletons.

Hosted by the Swiss National Competence Center of Research, the competition is intended to spur interest in human performance-enhancing technology.

The brain-computer interface race is designed for competitors who are paralysed from the neck down. They will control an avatar in a computer racing game via a headset that connects the brain to a computer.

There will also be races for those wearing arm or leg prosthetics, an exoskeleton race and a wheelchair race.

The assistive devices worn by the athletes, who will be known as pilots, can either be ones that are already commercially available or prototypes from research labs.

There will be two medals for each competition, one for the pilot and one for the company that developed the device.

Bionic limbs and exoskeletons are becoming much more technically advanced, offering those wearing them much more realistic movements.

Prof Hugh Herr, from the Massachusetts Institute of Technology, showed off some of the prosthetics that his team have been working on at the Ted (Technology, Entertainment and Design) conference in Vancouver last week.

He is currently in negotiations with health care professionals to get the bionic limbs more widely available to those who need them.

Often, though, there is a disconnect between technology and patients, said Prof Robert Riener, the event organiser, from ETH Zurich in Switzerland.

"The idea is that we want to push development of assistive technologies towards devices that patients can really use in everyday life," he told the BBC.

"Some of the current technologies look very fancy but are a long way from being practical and user-friendly," he added.

The other main aim of the games is to allow people to compete who have never had the opportunity before.

"We allow technology that has previously been excluded from the Paralympics. By making it a public event we want to get rid of the borders between patients, society and the technology community," Prof Riener said.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Hybrid power plants can help industry go green: Affordable solar option for power plants



Hybrid cars, powered by a mixture of gas and electricity, have become a practical way to "go green" on the roads. Now researchers at Tel Aviv University are applying the term "hybrid" to power plants as well.

Most power plants, explains Prof. Avi Kribus of TAU's School of Mechanical Engineering and its innovative new Renewable Energy Center, create power using fuel. Solar thermal power plants -- which use the high temperatures and pressures generated by sunlight to drive turbines -- are currently the industry's environmentally friendly alternative. But it's an expensive option, especially when it comes to equipment made from expensive metals and the high-accuracy solar concentrator technology used to harvest solar energy.

Now, a new technology Prof. Kribus has developed combines the use of conventional fuel with the lower pressures and temperatures of steam produced by solar power, allowing plants to be hybrid, replacing 25 to 50 percent of their fuel use with green energy. His method, which will be reported in a future issue of the Solar Energy Journal, presents a potentially cost-effective and realistic way to integrate solar technology into today's power plants.

Taking down the temperature for savings

In a solar thermal power plant, sunlight is harvested to create hot high-pressure steam, approximately 400 to 500 degrees centigrade. This solar-produced steam is then used to rotate the turbines that generate electricity.

Though the environmental benefits over traditional power plants are undeniable, Prof. Kribus cautions that it is somewhat unrealistic economically for the current industry. "It's complex solar technology," he explains. The materials alone, which include pipes made from expensive metals designed to handle high pressures and temperatures, as well as fields of large mirrors needed to harvest and concentrate enough light, make the venture too costly to be widely implemented.

Instead, with his graduate student Maya Livshits, Prof. Kribus is developing an alternative technology, called a steam-injection gas turbine. "We combine a gas turbine, which works on hot air and not steam, and inject the solar-produced steam into the process," he explains. "We still need to burn fuel to heat the air, but we add steam from low-temperature solar energy, approximately 200 degrees centigrade." This hybrid cycle is not only highly efficient in terms of energy production, but the lowered pressure and heat requirements allow the solar part of the technology to use more cost-effective materials, such as common metals and low-cost solar collectors.
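
As a rough illustration of where the quoted 25 to 50 percent fuel replacement could come from, one can compare the enthalpy carried by the injected solar steam with the heat released by the fuel. All numbers below are illustrative assumptions, not plant data:

```python
# Rough energy bookkeeping for a solar-hybrid steam-injection gas turbine.
fuel_heat = 10.0   # MW, heat released by burning fuel (assumed)
m_steam   = 2.0    # kg/s, solar-produced steam injected (assumed)
dh_steam  = 2.8    # MJ/kg, enthalpy to turn water into ~200 C steam (approx.)

solar_heat  = m_steam * dh_steam                   # MW
solar_share = solar_heat / (solar_heat + fuel_heat)
print(f"solar share of total heat input ~ {solar_share:.0%}")
# ~36% here, inside the 25-50% fuel-replacement range quoted above.
```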

A bridge to green energy

The hybrid fuel and solar power system may not be entirely green, says Prof. Kribus, but it does offer a more realistic option for the short and medium term. Electricity from solar thermal power plants currently costs twice as much as electricity from traditional power plants, he notes. If this doesn't change, the technology may never be widely adopted. The researchers hope that a hybrid plant will have a comparable cost to a fuel-based power plant, making the option of replacing a large fraction of fuel with solar energy competitive and viable.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Selecting light by direction, at all wavelengths

A device capable of filtering light coming from a specific direction has been built by a group of MIT researchers by stacking many ultrathin layers of different materials. The result could find numerous technological applications in the field of energy production, particularly solar thermophotovoltaics, as well as in electronic detectors and in the construction of telescope optics.

The new optical device, described in an article published in the journal “Science” by a group of MIT researchers led by Yichen Shen, promises important spin-offs in the fields of photovoltaic energy, electronic detectors and telescopes. It is a special filter that transmits only light arriving from one specific direction, while reflecting all light arriving from any other. The phenomenon occurs at all wavelengths, representing a considerable advance over the devices built so far.

Light is an electromagnetic wave, that is, an oscillation of the electromagnetic field that propagates through space. Like any other wave it is characterized by a wavelength, which for visible light lies between 380 and 760 nanometers. But light is also characterized by another parameter that has no counterpart in matter waves: the direction of polarization, defined as the direction in space along which the electric field associated with the light oscillates.

With suitable materials it is possible to produce filters that transmit only light with a well-defined polarization direction, known as polarizing filters, used for example in sunglass lenses. Other filters are designed to pass only specific wavelengths and not others. For many applications it would be useful to have a filter that passed only light coming from a particular direction, across all, or nearly all, the wavelengths of the visible spectrum. That goal has now been reached by the MIT researchers with a device consisting of a stack of alternating ultrathin layers of two materials, whose thickness was controlled with precision.

SEGUE SECONDA PARTE

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

SECONDA PARTE

Figure caption: The new filter built by the MIT researchers: light can pass only if it strikes the filter at a particular angle (white ray); at any other angle of incidence (red ray) the light is reflected (courtesy Weishun Xu and Yuhao Zhang).

“When two materials are coupled, reflection phenomena occur along their interface,” explains Marin Soljacic, who took part in the research. “At these interfaces, however, there is a 'magic angle', called the Brewster angle, at which no reflection occurs at all.”

Since the amount of light reflected at each interface is small, by combining many layers with the same properties it is possible to produce a material in which most of the light is reflected, except for one particular direction of incidence and one particular polarization direction. With about 80 alternating layers, light arriving from most angles can be reflected over a wide range of wavelengths, i.e. almost the entire visible spectrum.
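
The Brewster angle invoked above is elementary to compute for a single interface: theta_B = arctan(n2/n1), the angle at which p-polarized light passes without reflection. A minimal sketch with assumed refractive indices (the actual indices of the MIT stack are not given here):

```python
import math

# Brewster angle for an interface between two dielectrics:
# theta_B = arctan(n2 / n1); at this angle p-polarized light is fully
# transmitted. Example indices are assumed, not taken from the paper.
n1, n2 = 1.0, 1.5   # e.g. air into a glass-like layer
theta_B = math.degrees(math.atan(n2 / n1))
print(f"Brewster angle ~ {theta_B:.1f} deg")
```

In a stack, light arriving at the Brewster-like angle is transmitted at every interface, while light at any other angle loses a little to reflection at each of the ~80 interfaces, so its total reflectance approaches unity.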

According to the authors, the result opens the door to a large number of interesting applications in the field of energy production, particularly solar thermophotovoltaics, a technology in which solar energy is used to heat a material that in turn emits light of a particular wavelength. This light can then be used to drive a photovoltaic cell optimized to produce electricity when illuminated at that wavelength. For this approach it is essential to limit the light and heat losses due to reflection, and for this purpose the material built by the MIT researchers could make a big difference compared with other conventional devices.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

The third-generation biorefinery for greener chemistry



ENEA is among the partners of the BIT3G project, “Third-generation biorefinery”, for the development of biofuel production processes with low environmental impact. Novamont coordinates the project; the other partners are CNR, CRA, the University of Perugia, Agrinewtech, Filarete Servizi and Matrica.

The goal is to build a biorefinery integrated into its territory which, starting from the identification of areas of no agricultural interest and from the study of non-food crops (e.g. dryland crops), and respecting local biodiversity, makes it possible to use biomass to obtain high added-value products through sustainable technological processes.

ENEA's activities concern the technological aspects of the pretreatment of the ligno-cellulosic feedstock, and the development and optimization of production processes for products obtained from biomass.

Funded by the Italian Ministry of Education, Universities and Research (MIUR), the BIT3G project is one of the four strategic research and development projects of the National Technology Cluster for Green Chemistry, which aims to relaunch Italian chemistry under the banner of environmental, social and economic sustainability, stimulating research and investment in new technologies.

Technology clusters are a model of aggregation, with a high level of internationalization, between companies and public research organizations.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

ENEA is a partner in the PELGRIMM project for the development of fuels for Generation IV nuclear reactors



PELGRIMM (“PELlets versus GRanulates: Irradiation, Manufacturing & Modelling”) is a cooperative research project funded by the European Commission's Seventh Framework Programme, aimed at developing nuclear fuels containing minor actinides for Generation IV sodium-cooled fast-neutron reactors.

Generation IV reactors are designed for optimal use of uranium resources, minimization of long-term radiotoxicity through transmutation of the minor actinides, proliferation resistance, high physical protection, improved safety and reliability, and economic competitiveness. A considerable international effort in theoretical, experimental and industrial research is devoted to Generation IV reactors, particularly in the fields of fuels and structural materials. ENEA contributes to these goals on behalf of Italy.

PELGRIMM is coordinated by CEA (France), with the participation of twelve European organizations that are key to this kind of research: national laboratories (CEA, ENEA, KIT, NRG, PSI, SCK-CEN), international laboratories (JRC-ITU, JRC-IE), a university (KTH), nuclear industries (AREVA, EDF), an international education and training organization (ENEN) and an innovation-management consultancy (LGI Consulting).

ENEA's activities concern the modelling and simulation of fuel behaviour under irradiation, and preliminary safety assessments for the neutronic and thermal-hydraulic aspects.

Incorporating minor actinides into the uranium oxide or mixed uranium-plutonium oxide fuel of future reactors means extracting energy from them and obtaining products that decay in much shorter times, thus reducing the amount of high-level radioactive waste. Moreover, producing the fuel in granular form involves lower handling risks than the production of traditional pellets.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Bacteria, biodeterioration and biorestoration: new challenges for the safeguarding of cultural heritage



The new frontiers of microbiology in the cultural heritage sector and the creation of a biorestoration market were at the centre of the conference “Bacteria, biodeterioration and biorestoration: new challenges”, held today in Rome at the National Etruscan Museum of Villa Giulia.

The event, organized by the Friends of the Painted Tombs of Tarquinia, was attended by ENEA researchers Anna Rosa Sprocati and Chiara Alisi, who reviewed the research carried out by the Environmental Microbiology and Microbial Biotechnology group of ENEA's UTPRA technical unit, where biocleaning procedures have been developed based on specific bacteria selected for cleaning different materials, such as paper, marble and wall paintings, and for removing different types of deposits, such as glues, casein, waxes and resins, gypsum, carbonates, apatites, environmental pollutants and mixed deposits.

In line with the most innovative strategies for the conservation of cultural heritage, ENEA's research explores the microbial world to select microorganisms, and their metabolic products, capable of meeting the challenge of replacing toxic products with selectively acting agents that are not aggressive towards works of art, harmless to the health of operators, environmentally compatible and inexpensive.

Among ENEA's biorestoration activities, the biocleaning of deposits from the wall paintings of the loggias of the Casina Farnese, in the archaeological site of the Palatine Hill in Rome, deserves mention: it led to the development of an ENEA patent.



Marco La Rosa ha detto...

DA DOTT. COTELLESSA

The procedure of Dr. Giuseppe Cotellessa's patent may prove useful for the ENEA patent described in this comment.



New ENEA patent for combustion diagnostics in combustors



ENEA has filed a new patent concerning a system for diagnosing the state of combustion inside combustors.

The ODC (Optical Diagnostics of Combustion) diagnostic methodology, already developed at ENEA in embryonic form in two previous patents, is based on the analysis of the optical emissions of a flame and makes it possible to identify functional anomalies of both a fluid-dynamic and a thermoacoustic nature: by analysing the acquired signal it detects the precursors of these events, thus allowing timely intervention in the process to restore stable combustion conditions.
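
As an illustration of the kind of signal processing such a diagnostic system might perform, the sketch below watches the power spectrum of a (synthetic) flame-emission signal for energy growing in a thermoacoustic band. The sampling rate, monitored band and threshold are assumptions for illustration; the actual ODC processing is not described in this text:

```python
import numpy as np

# Synthetic photodetector signal: broadband noise plus a 210 Hz tone
# standing in for an incipient thermoacoustic oscillation.
fs = 10_000                                   # Hz, sampling rate (assumed)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(1)
signal = rng.normal(0.0, 1.0, t.size) + 0.8 * np.sin(2 * np.pi * 210 * t)

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)

band = (freqs > 150) & (freqs < 300)          # monitored band (assumed)
band_power = spectrum[band].sum() / spectrum.sum()
print(f"fraction of power in 150-300 Hz band: {band_power:.1%}")
if band_power > 0.05:                         # arbitrary alarm threshold
    print("warning: possible precursor of thermoacoustic instability")
```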

The ODC system therefore makes it possible to:

analyse and gain a better understanding of the combustion process;
solve instability problems, increasing the safety of the plant;
increase the efficiency of the plant, greatly increasing its operational flexibility.

As a diagnostic, monitoring and potential combustion-control support system, ODC can find commercial application in all sectors based on combustion processes that need to equip their plants with new technologies aimed at reducing operating costs and risks and at improving performance, in particular:

in power generation;
in the waste-to-energy treatment of municipal solid waste;
in aeronautical and space propulsion; and in all those industrial sectors (steel, cement, glass, etc.) where stable, controlled combustion represents an unquestionable added value.
A further segment of interest is research, which includes all the public or private bodies carrying out research in the combustion field that may be interested in the product as a “tool” for understanding kinetic and thermo-fluid-dynamic mechanisms at levels of analysis beyond those of the systems currently available on the market.

The inventors of the patent are Emanuele Giulietti, Caterino Stringola, Eugenio Giacomazzi and Mirko Nobili.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Georgia Institute of Technology, Joint BioEnergy Institute synthesize pinene



Researchers at the Georgia Institute of Technology and the Joint BioEnergy Institute have engineered a bacterium to synthesize pinene, a hydrocarbon produced by trees that could potentially replace high-energy fuels, such as JP-10, in missiles and other aerospace applications. With improvements in process efficiency, the biofuel could supplement limited supplies of petroleum-based JP-10, and might also facilitate the development of a new generation of more powerful engines. However, this process inhibition will be challenging to address:

“We found that the enzyme was being inhibited by the substrate, and that the inhibition was concentration-dependent,” said assistant professor Pamela Peralta-Yahya. “Now we need either an enzyme that is not inhibited at high substrate concentrations, or we need a pathway that is able to maintain low substrate concentrations throughout the run. Both of these are difficult, but not insurmountable, problems.”
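
One plausible way to model the concentration-dependent inhibition Peralta-Yahya describes is classic substrate-inhibition kinetics, v = Vmax*S/(Km + S*(1 + S/Ki)): the rate peaks near sqrt(Km*Ki) and falls at high substrate. A minimal sketch with illustrative parameters (not measured values):

```python
# Substrate-inhibition kinetics: v = Vmax*S / (Km + S*(1 + S/Ki)).
Vmax, Km, Ki = 1.0, 0.5, 5.0   # arbitrary illustrative units

def rate(S):
    return Vmax * S / (Km + S * (1 + S / Ki))

for S in (0.5, 2.0, 10.0, 50.0):
    print(f"S = {S:5.1f} -> v = {rate(S):.3f}")
# The rate rises, peaks near sqrt(Km*Ki) ~ 1.6, then falls at high
# substrate, matching enzyme inhibition at high substrate concentrations.
```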

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Siluria Technologies unveils next step in natural gas-to-liquid fuels



Fuels based on Siluria’s technology projected to “dramatically reduce the cost, complexity and overall emissions” of transportation fuels.

In California, Siluria Technologies debuted its development unit for producing liquid fuels from natural gas based on Siluria’s proprietary oxidative coupling of methane (OCM) and ethylene-to-liquid (ETL) technologies.

Earlier this year, Siluria announced that it will build an OCM demonstration plant at Braskem’s site in La Porte, Texas. Braskem is one of the leading producers of ethylene and plastics in the Americas. Siluria and Braskem have also entered into a relationship to explore commercialization of this technology. The OCM demonstration plant will begin operations later this year.

Siluria’s Hayward ETL facility and the La Porte OCM demonstration plant are the last scale-up steps prior to full commercialization of Siluria’s technology platform, which is now planned for the 2017 time frame.

Siluria plans to deploy its technology in a range of commercial settings, including existing ethylene-producing plants, ethylene-consuming sites, upstream gas monetization, natural gas midstream plants and world-scale deployments. Siluria's process dramatically reduces the cost, complexity and emissions associated with the production of these higher-value products across the energy spectrum.

Siluria's OCM and ETL technologies are a means of transforming methane, the principal ingredient in natural gas and renewable methane, into gasoline, diesel, jet fuel and other liquid fuels. Unlike the high-temperature, high-pressure cracking processes employed today to produce fuels and chemicals, Siluria's process employs catalytic processes to create longer-chain, higher-value materials, thereby dramatically reducing operating costs and capital.

At commercial scale, Siluria's process will enable refiners and fuel manufacturers to produce transportation fuels that cost considerably less than today's petroleum-based fuels, while reducing overall emissions of NOx, sulfur and particulate matter. Fuels made with Siluria's processes are also compatible with existing vehicles, pipelines and other infrastructure and can be integrated into global supply chains.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

On Demand Solar Power



A thin hybrid silicon/organic solar cell developed at the University of Michigan goes beyond basic black: these transparent devices are fabricated in various colors to serve as decorative energy generators. Different thicknesses of amorphous Si yield different colors: 6 nm for blue, 11 nm for green and 31 nm for red. The Si layer is encased between semi-transparent electrodes in an arrangement that transmits certain light wavelengths. A cell resembling an American flag posted a 2% efficiency, underscoring the trade-off between visual appeal and utility. But the opportunity to widen the scope of PV placement through color could offset efficiency concerns.

A new look at how to harvest the energy of the sun is taking shape with a team from MIT. They've found a means not only to create electricity, but also to store it for later use on demand, by adding a two-layer absorber-emitter device between the PV cell and the light source.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Transcending Graphene



Graphene may be the single-atom-thick miracle material for electronics, but another ultrathin offering holds greater promise for PV applications. Tungsten diselenide absorbs 5% of incident solar radiation and converts one-tenth of this into power. Researchers at the Vienna University of Technology, Austria, envision using stacked layers as transparent, flexible cells for use in building facades or displays.
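
Taken together, the two figures quoted above imply a single-layer efficiency of about 0.5%, as this quick check shows:

absorbed = 0.05   # tungsten diselenide absorbs 5% of incident sunlight
converted = 0.10  # one-tenth of the absorbed light becomes electrical power
print(f"Implied single-layer efficiency: {absorbed * converted:.1%}")  # 0.5%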

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

US Navy tests robotic fire-fighters



Fire-fighting robots designed to withstand intense heat are to be tested by the US Navy this summer.

The Shipboard Autonomous Fire-fighting Robot (SAFFiR) has been built by engineers at Virginia Tech and other US universities.

The robots are expected to perform a variety of tasks - balancing, turning valves, picking up and dragging a fire hose and jetting water on the fire.

They also have a vision system to search for survivors.

"The human-sized autonomous robot is capable of finding and suppressing shipboard fires and working seamlessly with human fire-fighters," says the Office of Naval Research's website.

Such a machine should be "able to withstand higher heat for longer periods than human fire-fighters," it adds.

Two versions of the robot, made by researchers at Virginia Tech and the universities of California, Los Angeles and Pennsylvania, will be tested on board the decommissioned USS Shadwell.

The ship is regularly set on fire to test new equipment.

One robot will be about 5 ft (1.5 m) tall, while the other will be slightly taller and more advanced.



Synthetic soldiers



Robots are increasingly finding their way into the military. The Pentagon's Darpa (Defense Advanced Research Projects Agency) has a range of battlefield robots and is also working on ways to enhance soldiers' abilities with exoskeletons and uniforms made of smart materials.

This week it announced a new unit devoted to researching the intersection between biology and engineering.

It will look at creating man-made super-materials, renewable fuels and solar cells.

But it has led some commentators to ask if, longer term, the military will also try to create artificial life.

"It makes you think: Why bother with mechanical robots when you can engineer fake human replicants to fight your battles?" asked Meghan Neal. a journalist at Motherboard - a website dedicated to future technologies.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Robots clean solar panels in Israel without using water



At Ketura Sun, a large commercial solar field in Israel, the solar panels are being cleaned in a unique way: by robots. And that's not even the most unique part. These robots don't use any water in the cleaning process, making them a great match for the Negev desert where the solar plant is located. Even better, the robots could go a long way toward making solar power plants less dependent on water.

According to Gizmag, the Ecoppia E4 robots are "mounted on a frame that moves laterally along the panels and the robots themselves move up and down the panels. They use a rotating brush made up of soft microfiber in conjunction with air blowers to remove what Ecoppia says is 99 percent of dust build-up." No water required.

Other solar panel cleaning robots have been developed, even some that don't use water, but those are not being commercially used yet.

Keeping solar panels clean is a major necessity, because dust-covered panels don't produce as much energy (up to 35 percent less), but non-automated cleaning requires a lot of manpower, time and money. The robots make it so that the panels are automatically cleaned nightly and are always operating at maximum output.
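
To get a rough sense of what is at stake, here is back-of-the-envelope arithmetic with a hypothetical plant size; only the 35% figure comes from the article:

daily_output_clean_kwh = 40_000  # assumed clean output of a desert PV field
max_soiling_loss = 0.35          # "up to 35 percent less", per the article

lost_per_day_kwh = daily_output_clean_kwh * max_soiling_loss
print(f"Worst-case soiling loss: {lost_per_day_kwh:,.0f} kWh/day, "
      f"or {lost_per_day_kwh * 365:,.0f} kWh/year if never cleaned")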

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

In a cooperative, energy goes social

Unity is strength, and so is saving, even when it comes to energy. Energy cooperatives are arriving in Italy: here is everything you need to know, and how to join.

Energy goes social. The idea behind energy cooperatives is very simple: joining forces to save money. It is the "social" philosophy already behind successful experiments in several fields. Now it is the turn of energy, produced from the sun and therefore clean. In this case, the people who join the project form a "cooperative" and together buy a photovoltaic plant that is already in operation and able to produce the energy needed to cover their consumption. In proportion to the shares they hold, they use the renewable energy produced for the entire working life of the photovoltaic plant they purchased.

A virtuous model. A new model of energy investment and a solution that benefits both the environment and the electricity bill, the energy cooperative project is real. In Europe the model is already established: in Belgium, the Ecopower cooperative has over 45,000 members, 26 MW of installed plant and 95 million kilowatt-hours of green energy produced per year; in Germany, more than 80,000 families take part in cooperatives for the self-production of renewable energy. In Italy, too, the trend is concrete and growing. In most cases these are residents of the same municipality who join together and buy photovoltaic plants located in their area. A virtuous example is the Energyland cooperative of Verona, which counts about one hundred member families. From 29 July 2011 to date, the cooperative has produced over three million kilowatt-hours of renewable energy.
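
A quick arithmetic check on the Ecopower figures quoted above (reading them as a capacity factor is our interpretation, not the article's):

members = 45_000
installed_mw = 26
annual_kwh = 95_000_000

print(f"Output per member: {annual_kwh / members:,.0f} kWh/year")  # ~2,111
print(f"Capacity factor: {annual_kwh / (installed_mw * 1000 * 8760):.1%}")
# ~41.7%, consistent with a portfolio dominated by wind rather than PV alone.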

How to join an energy cooperative. Joining an energy cooperative project is simple. Once the number of members needed to form the cooperative is reached (it varies from case to case with the size of the plant), each user chooses the share to purchase according to their own energy needs. At the end of the plant's life (about 17 years), the owners can decide whether to sign a new contract or leave the project.

Solar Share, an energy cooperative without territorial limits. The latest development in the sector is Solar Share, a photovoltaic cooperative project launched by LifeGate, a communication network on environmental issues, and by ForGreen, a company that develops solutions for businesses in the green economy. Stefano Corti, general manager of LifeGate, explains: "Solar Share is the first Italian project that opens membership of an energy cooperative to everyone, regardless of location. People from different cities will be able to join together to share energy produced from the sun, without territorial limits."

PART TWO FOLLOWS

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Joining Solar Share, step by step. To join the Solar Share project, you fill in a pre-membership form on the website and send a copy of your latest energy bill and of an identity document. Once the number of pre-memberships needed to take over the plant is reached, the cooperative is formed and the purchase of shares can proceed. Solar Share is open to households and small businesses whose meters do not exceed 12 kilowatts of contracted power. The number of shares that can be purchased equals the meter's maximum power, rounded up. For example, with a meter supplying up to 3 kilowatts of electricity, a maximum of 3 shares can be purchased; with a meter supplying up to 4.5 kilowatts, a maximum of 5 shares. Each member is entitled to receive a set quantity of electricity per year, proportional to the shares held. Anyone who exceeds it must settle their position within the set deadlines.
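
The share-counting rule above is a simple ceiling function; a minimal sketch (the 12 kW cutoff and both examples come straight from the text):

import math

def max_quotas(contracted_kw):
    # Maximum purchasable shares: contracted meter power in kW, rounded up.
    if contracted_kw > 12:
        raise ValueError("Solar Share is limited to meters up to 12 kW")
    return math.ceil(contracted_kw)

print(max_quotas(3))    # -> 3 shares
print(max_quotas(4.5))  # -> 5 shares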

How to leave your old supplier. The member owes the old operator no explanation (there is, in other words, no red tape to deal with): the new supplier notifies the old one of the switch request. Continuity of supply is guaranteed during the switch: normally, completing a change of supplier takes less than 60 days. The cooperative owns the plant and sells the energy produced to an operator on the electricity market. Members receive the bill monthly or every two months, depending on the market offer, at home or by e-mail. The cooperative pays it by deducting the amount from each member's available energy balance.

Who benefits, by how much and why. The advantages of joining an energy cooperative are numerous. If the energy purchased is not consumed within the year, the surplus is refunded through an annual settlement. Buying shares in an energy cooperative is recommended for those who cannot put a photovoltaic system on their roof, or whose location is unsuitable for installing panels. Purchasing shares is quick, immediate and convenient because it requires no administrative or bureaucratic procedures. Finally, it is a choice open to those unwilling to bear the management costs of a plant, which include insurance and maintenance.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Challenge: Integrating Electronic and Mechanical Design



While they quantify trends, the key findings of many market surveys hold relatively few surprises. Still, there is the occasional eye-opening fact or two. Such was the case when Control Design released results of its survey to identify usage preferences and trends in motion, drives, and motor technology. The predictable numbers: engineers say they primarily use servo motors (69%), prefer digital drives (67%), and rely on closed-loop over open-loop systems. Unexpectedly, after decades of wrestling with mechatronics, engineers still identified integrating electronic and mechanical components as their lingering and biggest challenge in motion system design.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Pitch Control Synchs Turbine Blades



Stopping a wind turbine requires rotating all blades on their axes to the zero-pitch or "feathered" position, synchronously. Various technologies exist to address the problem, but all suffer the same disadvantage: each axis depends on the load supplied by the blade. Moog says its Pitch Axis Servo technology overcomes the problem and delivers triple the startup torque of current systems.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Pompeii: satellites and sensors, the "great eye" over the excavations



Satellites monitoring from the sky, and a network of ground sensors to keep watch over collapses, landslides, site conditions and the state of health of paintings and frescoes. With the agreement signed by culture minister Dario Franceschini, the project financed by Finmeccanica is getting under way at Pompeii. Built on the expertise of Telespazio and Selex Es, both companies of the group, the system will be fully operational between September and December 2014, with Finmeccanica support guaranteed for 3 years and a cost of 1.7 million euros, borne entirely by the holding company.

"This is not a sponsorship but a genuine act of liberality," stresses the minister, who notes that the experience could be "extended to other sites" and renews his appeal to entrepreneurs and "other private groups willing to lend a concrete hand". "Let them come forward, and let them not hide behind the alibi of bureaucratic obstacles," he says.

For Pompeii, meanwhile, the presentation of the agreement with Finmeccanica is an opportunity to take stock of the Great Project financed with 105 million euros from the EU. "There are 7 worksites open, the latest opened a few days ago, plus another six tenders awarded and one being awarded," explains general Gianni Nistri, director general of the project. Nistri points out that so far "tenders worth 40 million euros have been launched". Time is tight, because the agreements with the EU require the worksites to close by the end of 2015. Franceschini is optimistic: "Pompeii is years behind, but the team we have fielded and the work we are doing will let us face the final checks with our heads held high. Judge us at the end."

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Sensor Technology Automates Feeding Control in Single Use Bioreactors

Bioprocessing dates back over 100 years; beer manufacturing, in fact, is a bioprocess operation, and the original bioreactors could be as simple as oak casks. The more recent history of bioreactors is dominated by either glass reactor vessels, which are typically used in process development and sized at 2 to 25 liters, or stainless steel vessels, which are used in pilot plants and manufacturing.

Stainless steel reactors can range from a few hundred liters to 10,000 liters or more. Back in the 1980s, control of these reactors was mainly via off-line sampling and analog controls. By the 1990s, the vessels were much the same; however, control had become digitally based. Practitioners were converting a number of off-line measurements, pH for example, to in-situ, real-time measurement.

Today, we see an industry where the process and media are much more complex, development times are extending, and the rate of drug discovery is increasing. At the same time, stainless steel facilities require high capital investment and significant build and commissioning times, and are inflexible to changes in output demand. They do have the advantage of re-usability but, in many cases, this is more than offset by the cost of cleaning, sterilization and maintenance.

In an effort to decrease the cost base and increase flexibility, a paradigm shift is underway toward single-use, disposable technologies, which require significantly less up-front investment, are highly modular and are faster to implement. The 'new' bioreactor is a sterilized, disposable plastic bag. Single-use reactors are currently sized up to 2,000 liters and require no on-going maintenance. Growth in single-use manufacturing is forecast at 55% per annum over the next three years, in a market valued at $2.8B by 2016. The paradigm shift from re-usable stainless steel and glass to single-use is a significant response to the industry's challenges.
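
To see what the 55% figure implies, here is the compound-growth arithmetic; the 2013 base value is back-solved from the $2.8B 2016 figure rather than taken from the article:

target_2016_bn = 2.8
growth = 0.55
base_2013_bn = target_2016_bn / (1 + growth) ** 3
print(f"Implied 2013 market size: ${base_2013_bn:.2f}B")  # ~$0.75B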

As the industry moves to disposable reactors for manufacturing it is also seeking to reduce its process development costs and timescales. This is leading to the adoption of highly parallel mini-reactors, which can be less than 250 ml and look similar to Tic-Tac boxes. These systems allow process development engineers to run multiple concurrent process 'trials' as they seek to optimize their process. These fundamental changes in both process development and manufacturing highlight a number of control and measurement issues.

Firstly, C&M sensors now need to fit into much smaller reactor vessels. Secondly, they need to be at a 'disposable' price point and, thirdly, they must be usable across the entire range of reactor vessels. Some of the existing technologies may make the transition to this new operating model. Others may be more limited in their use.

PART TWO FOLLOWS

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Finally, as well as the physical requirements imposed on sensors by this paradigm shift, the industry is looking for sensors and systems which offer new measurement parameters to complement existing techniques and allow users better process control, and ultimately higher process yields and shorter development times. That is where a micro-optical sensor is proving to be of great value.

Working with solid state silicon and fiber optic technology, Stratophase, for example, has developed a micro optical sensor that is truly optimized for real-time, in-situ bioprocess control and monitoring across the full reactor range. A patented manufacturing process produces optical structures that route light around a small silicon chip. By immersing the sensor in the process media the refractive index of the media is measured directly. This real time measurement responds to the metabolic rate, any nutrient additions and indicates the end point of the process.

Stratophase micro-optical sensor capable of performing real-time, in-situ bioprocess control and monitoring across the full reactor range.

The micro-optical sensor is part of the Ranger System, which is composed of two components: the Ranger Manager unit providing controls, and the Ranger Probe, a real-time sensor residing within the process media. Communication between the two devices is via noise-immune fiber optic cable. The Manager contains a laser light source, detectors, data reduction components, communication modules, and user interface components. The system features standard outputs such as OPC and the 4 mA to 20 mA current loop to enable communication with process-control systems.
A two-part paradigm, the Ranger System employs the Ranger Manager for control functions and the Ranger Probe, a real-time sensor.

The micro-optical sensor and its control system enable process development teams to rapidly optimize their reactor feeding strategy which is then used in manufacturing independent of the reactor style. As a small silicon device it meets key parameters of size and cost and provides a new, high value, measurement parameter to the industry.
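
The 4 mA to 20 mA output mentioned above is the standard industrial current loop. Here is a minimal sketch of how a refractive-index reading could be mapped onto it; the RI span used (1.3330 to 1.3600, typical of aqueous media) is an assumption for illustration, not a Stratophase specification:

def to_milliamps(ri, ri_min=1.3330, ri_max=1.3600):
    # Clamp to the calibrated span, then scale linearly onto 4-20 mA.
    ri = min(max(ri, ri_min), ri_max)
    return 4.0 + 16.0 * (ri - ri_min) / (ri_max - ri_min)

print(f"{to_milliamps(1.3330):.2f} mA")  # 4.00 mA at the span minimum
print(f"{to_milliamps(1.3465):.2f} mA")  # 12.00 mA at mid-span
print(f"{to_milliamps(1.3600):.2f} mA")  # 20.00 mA at the span maximum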

Conclusion

The biotechnology industry is implementing significant changes in the type and usage of bioreactor vessels. The deployment of single-use systems is advancing rapidly. This presents both a challenge and an opportunity for new technologies to emerge which are usable in all bioreactor form factors.

Additionally, increasing complexity, higher-value products, and regulatory directives demand advances in process control. Processing equipment is leading the way with smaller, massively parallel reactors and single-use systems. Control and measurement technologies must advance rapidly to track these trends and offer new capabilities that directly contribute to the profitability and cost-effectiveness of the industry.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

GE using medical X-rays to inspect undersea pipelines



Using X-rays and other forms of radiation has been a standard tool for testing pipelines for decades, but until now it's been largely confined to factories and land-based pipelines instead of the deep seabed. That’s changing as GE adapts its medical X-ray systems to work in the crushing pressures of the deep oceans, as part of a remote-controlled submersible rig for examining pipelines in place.

Pipelines are a vital part of the modern economy. Carrying oil, natural gas, and even water, there are a surprisingly large number of them running for thousands of miles under the oceans of the world. The problem is, they need periodic inspection to make sure there aren’t any structural flaws, that corrosion hasn’t taken hold, and that valves are still working properly.

With land-based pipelines, this is a pretty straightforward, albeit time-consuming task. In the bathypelagic zone of the deep oceans, however, it's another matter. At a depth of 10,000 ft (3,048 m), the pressure is 300 atmospheres or 4,400 lb/sq in (309 kg/sq cm), and the temperature drops to 4 °C (40 °F). That's not only hard on the pipelines, but also on any testing gear sent down to inspect them, and especially on electronics. The result is a logistical nightmare with costs many times that of inspecting land pipelines.
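
Those depth and pressure figures check out against the hydrostatic formula P = ρgh, assuming a seawater density of 1025 kg/m³:

rho, g, depth_m = 1025.0, 9.81, 3048.0   # density, gravity, 10,000 ft depth
pressure_pa = rho * g * depth_m
print(f"{pressure_pa / 1e6:.1f} MPa = {pressure_pa / 101325:.0f} atm "
      f"= {pressure_pa / 6894.76:.0f} psi")
# ~30.6 MPa, ~302 atm, ~4,445 psi, matching the article's 300 atm / 4,400 psi.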

One way of testing the pipelines is to use radioactive isotopes to beam gamma rays through the pipe sections. This is relatively simple and the radiation source is self-generating, but the shielded casings or “bombs” holding the isotopes are heavy, bulky, and difficult to handle. X-ray machines are more compact, lighter, and give better resolution, but they need power and the electronics have to be guarded against pressure and seawater, yet kept cool enough to operate properly.

For engineers from GE Healthcare, GE Oil & Gas, BP, and marine engineering firm Oceaneering, their task was essentially one of taking GE’s medical X-ray detector, disassembling it, and re-engineering it so it could work inside a marinized pressure housing. Of particular importance was protecting the Digital Detector Array (DDA), which produces the radiographic image. This is a glass screen about the size of a computer monitor, and it's very fragile, hence the need for a casing to protect it from pressure and contact with seawater.

The Digital Detector Array (DDA) (Image: Oceaneering)

According to Oceaneering, the medical X-ray detector provides marine engineers with better image contrast, the ability to estimate pipeline wall thickness, real-time data transmission, wider X-ray exposures, and the ability to handle most pipe sizes, which means the pipes don’t need to be stripped of their protective coatings for testing.

The device is designed to fit inside a handling machine, which is attached to a deep-sea submersible rig. This rig latches around the pipeline and slides along it as it takes incremental X-ray images. It looks for signs of erosion or corrosion, foreign objects, blockages, or valve problems.

“This is not what we normally do,” says mechanical engineer Karen Southwick at GE Healthcare. “X-rays are giving us an insight. You don’t know something’s wrong and then you see it. Whether it’s a small spill or a catastrophic one, this is hopefully preventing that. Ideally, we will be able to have data for every pipeline that’s in the water.”

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Indian Government Aims For 26 Million Solar Water Pumps


The Indian government is aiming to swap out 26 million fossil-fuel-powered groundwater pumps for solar-powered ones.

The pumps are used by farmers throughout the country to pull in water for irrigation, and currently rely on diesel generators or India's fossil-fuel-reliant electrical grid for power. Pashupathy Gopalan, the regional head of SunEdison, told Bloomberg that 8 million diesel pumps already in use could be replaced right now. And India's Ministry of New and Renewable Energy estimates that another 700,000 replaceable diesel pumps are bought in India every year.

“The potential is huge,” said Tarun Kapoor, the joint secretary at the ministry. “Irrigation pumps may be the single largest application for solar in the country.”

The program works by subsidizing the swap, and operates in different capacities in India’s various states, sometimes subsidizing the solar pumps up to 86 percent. Thanks to that aid, and the dramatic collapse in prices for solar power, the pumps pay themselves off in one to four years, according to Ajay Goel, the chief executive officer of Tata Power Solar Systems Ltd., a panel maker and contractor. And Stephan Grinzinger, the head of sales for a German solar water pump maker, told Bloomberg the economics will only get better: diesel prices will rise and spike during farming season, and economies of scale will help the swap program.
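
Illustrative payback arithmetic (every number except the 86% subsidy ceiling is a placeholder; the article itself only states a one-to-four-year range):

pump_cost = 500_000           # hypothetical pump price, in local currency units
subsidy = 0.86                # maximum subsidy cited in the article
annual_diesel_saved = 40_000  # hypothetical avoided diesel spend per year

farmer_outlay = pump_cost * (1 - subsidy)
print(f"Farmer pays {farmer_outlay:,.0f}; payback in "
      f"{farmer_outlay / annual_diesel_saved:.1f} years")  # ~1.8 years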

Two-thirds of India’s electricity is generated by coal, with natural gas and hydroelectric making up most of the rest. But the monsoon season is growing more erratic — likely due to climate change — making power from the hydroelectric dams less reliable as well. Coal is growing in economic cost for India, so power plants often sit idle, and the coal that is easy to reach would require displacing major population centers.

The national grid that relies on those fuels has seen few updates since it was constructed in the 1960s. It's also under growing stress from India's rising middle class, which is adopting air conditioning and running water in massive numbers, all in a country prone to heat waves, again thanks in part to climate change. As backup, many Indian residents and businesses rely on diesel generators, which leaves them vulnerable to the fuel market and contributes to fossil fuel emissions.

Even when the grid is working, around 300 million of India’s 1.2 billion inhabitants don’t have access to it. When it’s not, rolling blackouts are common. Many farmers are able to draw only four hours of power a day from the grid, and that often at night. Heat waves in 2013 were accompanied by widespread blackouts, and a two-day grid failure in 2012 left over 600 million Indians without power.

Ironically, thanks to the kind of distributed and sustainable generation the swap program represents, many of India's rural poor actually fared much better during the blackout than the grid-dependent middle class. This is one of the strengths of solar in particular, even before climate change is considered: a more decentralized power system, based around "microgrids" and individual power generation, rather than a centralized system reliant on the good function of large, singular power providers. In India especially, sunlight is most plentiful at the times when demand tends to peak. That leaves the power system more adaptable, less prone to central failures, and thus more hospitable to those still struggling to overcome poverty.

Beyond India’s pump swap program, other efforts in south Asia and northern Africa are already underway to bypass grid expansion entirely, and bring solar power and microgrids directly to poor people.



Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Modern Polymeric Materials Offer Options for Equipment Repair



Continual development of polymer technology has enabled the creation of specialized coatings, which can offer excellent resistance to erosion, corrosion, and cavitation in hydroelectric equipment and pumps at any generating plant. Polymeric materials can also increase efficiency and extend runtimes.

Currently accounting for over 16% of global electricity production, and with an expected growth rate of 3% per year for the next quarter century, hydroelectric power generation continues to grow as the front runner in renewable energy, even though growth in the U.S. is expected to be minimal.

In recent years, maintenance of existing hydroelectric assets has become increasingly important to ensure a consistent supply of power. Low water levels, due to factors such as drought and higher local demand for water (see "Water Issues Challenge Power Generators" in the July 2013 issue of POWER, online at powermag.com), have resulted in decreased production at high-profile hydroelectric stations, such as the Hoover Dam. There the problem has become so severe that the resulting drop in pressure difference has caused increased cavitation damage to turbine runners and a 20% decrease in production levels.

Ensuring that turbine efficiency and uptime are at their maximum is key to achieving optimum production. However, as with any fluid-flow equipment, the effects of erosion and corrosion will detract from this. If left unchecked, rates of erosion, and of cavitation damage in particular, increase exponentially and cause severe metal loss. Unbalancing and vibration of turbine runners can result, requiring lengthy shutdowns for repair work to shafts and bearings. Loss of surface smoothness also results in increased turbulent flow and lower production rates.

Traditional Repair Techniques

The recommended procedure for determining inspection and repair frequency for hydroelectric runners and turbines, including stay vanes and wicket gates, is to inspect the equipment at set intervals following installation to ascertain the rate of damage, including erosion, corrosion, and cavitation. Once the rate of damage is known, procedures are put in place to repair the damage once the depth of metal loss reaches predetermined levels.
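
In code form, this inspect-then-repair logic amounts to a simple projection; all of the numbers below are hypothetical:

wear_rate_mm_per_yr = 0.8  # assumed, derived from successive inspections
allowable_loss_mm = 3.0    # assumed predetermined repair threshold
measured_loss_mm = 1.9     # assumed depth found at the latest inspection

years_remaining = (allowable_loss_mm - measured_loss_mm) / wear_rate_mm_per_yr
print(f"Schedule repair within {years_remaining:.1f} years")  # ~1.4 years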

Once a maintenance routine is put in place, repairs are carried out in accordance with the recommended procedure. The procedure is often to replace the lost metal using conventional metal replacement techniques. Large areas of pitting are repaired by welding plates or sheets of new metal in place as an erosion wear layer, whereas areas of lighter damage are repaired by weld overlay, which is then ground back to the correct tolerance. The procedure is repeated at the next service interval, as dictated by the rate of in-service deterioration.

Limitations of Traditional Repairs

The traditional repair procedure is not without problems though. The most basic flaw is the replacement of the material that is being lost with more of the same material—a like-for-like repair. Reintroducing the same base material simply allows the problems to reoccur and does not identify the root cause of the issue and work to limit its effects. Continued metal loss will result in continued shutdowns. As previously discussed, metal loss will in some cases result in vibration due to imbalance, and this can cause damage to bearings and shafts.

One of the major drawbacks of using hot work to replace lost metal is the procedure involved in implementing the repair. According to the Facilities Instructions, Standards, & Techniques Turbine Repair manual, “Extensive weld repairs can result in runner blade distortion, acceleration of further cavitation damage, and possible reduction of turbine efficiency. Also, extensive repair can cause residual stressing in the runner resulting in structural cracking at areas of high stress.”

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

This Plane Will Circle the World Using Only the Power of the Sun

You've probably heard about the ambitious, almost impossible-sounding project to fly a solar-powered plane around the world without refueling. But now, about a year before the voyage is scheduled to begin, you get your first look at the plane itself. It's unlike any plane you've seen before.

Behold: the Solar Impulse 2. With a wingspan of 236 feet, it's broader than a 747 but contains a cockpit large enough for just one person. The wings are completely covered in 17,248 solar cells, each as thin as a human hair. The fuselage and components, meanwhile, are made using an innovative, mass produced carbon fiber technique, and the whole thing is covered in carbon fiber sheets that are three times lighter than paper. The plane is powered by four ultra-light electric motors that are over 90 percent more efficient than standard thermal motors.

So it's light. In total, it weighs just 5,000 pounds. To use the same point of comparison, a 747 weighs a staggering 875,000 pounds. Of course, megajets can carry a lot more cargo, but that's not the point. The Solar Impulse 2 is designed for maximum energy efficiency.

Next March, the plane will take off from the Persian Gulf, and fly over India and China before beginning its marathon journey across the Pacific. It will fly at an altitude of 28,000 feet during the day, when the sun is pumping those solar cells full of juice, but will drop down to 16,000 feet at night to conserve energy. While the plane will touch down to change pilots from time to time, it must make it all the way across both the Pacific and the Atlantic without any breaks. For the pilot, this means five to six days in the air (the Solar Impulse 2 only goes about 40 miles per hour) practically without sleeping. This will not be easy, but they've been testing different techniques to ensure that the pilots, Bertrand Piccard and André Borschberg, can endure it.
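
The day/night altitude cycle effectively banks gravitational potential energy: climbing on solar power stores energy that can be spent gliding after dark. Using the article's mass and altitude figures and textbook mechanics:

mass_kg = 5000 * 0.4536                  # ~2,268 kg (the 5,000 lb airframe)
delta_h_m = (28_000 - 16_000) * 0.3048   # the 12,000 ft day-to-night descent
energy_j = mass_kg * 9.81 * delta_h_m    # potential energy: E = m*g*h
print(f"{energy_j / 1e6:.0f} MJ = {energy_j / 3.6e6:.1f} kWh banked per climb")
# ~81 MJ, about 22.6 kWh: a useful complement to the batteries after sunset.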

PART TWO FOLLOWS

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

PART TWO

The cockpit itself is designed with these long hauls in mind. In order to cut down on weight, some sacrifices had to be made. For instance, there's no heat or air conditioning, a real bummer since the air temperature outside the plane will yo-yo from -40 degrees to 100 degrees Fahrenheit. There is a rigid, highly insulating foam that should help shield the pilots a bit, but it will inevitably be an uncomfortable ride from time to time. There's also no beverage service on the flight. The pilots will eat specially designed food and drink from a straw. When it's time to go to the bathroom, they'll just go. There's a toilet built into the specially designed reclining chair.

Again, the biggest challenge on the journey is those days-long jaunts across the ocean. The pilots will get to take cat naps every few hours but will be trained to wake up, fully alert, at any sign of trouble. Because of weight constraints, there won't be a full-fledged autopilot on board, so any time the wings dip more than 5 degrees, the sleeve of the pilots' custom-made flight suits will vibrate so they can course-correct.
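
The bank-angle alert reduces to a simple threshold check; the sensor readings below are invented for illustration:

BANK_LIMIT_DEG = 5.0  # the 5-degree wing-dip limit cited in the article

def sleeve_alert(bank_angle_deg):
    # Vibrate the suit sleeve whenever the bank angle exceeds the limit.
    return abs(bank_angle_deg) > BANK_LIMIT_DEG

for angle in (1.2, -4.9, 6.3, -7.8):
    print(f"bank {angle:+.1f} deg -> vibrate: {sleeve_alert(angle)}")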

But hopefully they won't. Already several years in the making, the Solar Impulse project brings together some of the most prominent Swiss and European companies in the world, from Omega watches to Schindler elevators. Even Google's thrown some chips in the pot, offering up the Hangouts platform during the project. If this had happened half a century ago, you'd expect some crazy genius like Howard Hughes to be behind it, but it is, in fact, a well thought-out, corporate-funded bonanza that everyone expects will produce innovations that can later be brought to market. You know, in case you want your own giant, solar-powered plane with a toilet in the cockpit.

In all seriousness, the innovations already made on everything from carbon fiber to solar cells are pretty damn exciting. When this thing finally flies next year, it'll show the world what's really possible with solar-powered vehicles. In the meantime, the suspense is just killing us.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

New DeLUX Integrating Sphere Design



StellarNet's new IS12 is designed to conduct NIST-traceable total watt flux measurements. Each sphere now features dual baffles, an integrated auxiliary lamp, adaptable sockets, AC-DC voltage compatibility, and a barium sulfate high-reflectivity coating for cutting-edge performance at an affordable price.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Setting the Stage for Smart Windows



A small solar start-up in Colorado develops smart windows that use thermochromic filter technology to regulate light, heat, and glare. The company uses Linkam Scientific Instruments' thermoelectrically cooled stage, imaging station, and software to analyze the liquid crystal coatings used in the smart windows. Specifically, they use the imaging system to characterize the liquid crystal material and to find the right mix of materials to ensure the appropriate level of tinting in the window.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

What are Super Superconductors Made Of?



Scientists are looking into the inner workings of new materials by using ultra-fast laser spectroscopy. They used the lasers to study the minute and sometimes messy dynamics at work in superconducting materials as they transition from the magnetic state to the cooled superconducting state. The spectrometer works like a high-speed camera, taking many images of the material over time, so the researchers can better understand the material's behavior.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Probes Promise New View of Neurons



A multidisciplinary collaboration of scientists at the Lawrence Berkeley National Laboratory is working on ultra-bright nanoprobes expected to enable optical imaging of brain neurons. Upconverting nanoparticles, which absorb two or more photons at lower energies and emit a single photon at a higher energy, can be excited using near-infrared light when doped with ytterbium and erbium, causing them to luminesce and making them useful for imaging biological cells. The team is working on making the probes smaller, less than 5 nm, and is adjusting the amounts of ytterbium and erbium to find the optimal concentration.
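
The energy bookkeeping behind upconversion is easy to verify. The 980 nm excitation and 540 nm green emission used here are typical values for ytterbium/erbium-doped particles, assumed for illustration:

h, c = 6.626e-34, 3.0e8  # Planck's constant (J*s) and speed of light (m/s)

def photon_ev(wavelength_nm):
    # Photon energy E = h*c/lambda, converted from joules to electronvolts.
    return h * c / (wavelength_nm * 1e-9) / 1.602e-19

absorbed_ev = 2 * photon_ev(980)  # two near-infrared photons absorbed
emitted_ev = photon_ev(540)       # one visible (green) photon emitted
print(f"absorbed: {absorbed_ev:.2f} eV, emitted: {emitted_ev:.2f} eV")
# 2 x 1.27 eV in, 2.30 eV out: the emitted photon carries more energy than
# either absorbed photon, but less than their sum, as energy conservation demands.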

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Routine inspection? The DM2700M is the ideal microscope



This new materials microscope with LED illumination is designed for routine inspection tasks in metallography, earth science, forensic investigation, and materials quality control and research. It offers users state-of-the-art universal white-light LED illumination with high-quality Leica optics.

The ultra-bright, high-power LED illumination provides users a constant color temperature of 4500 K for brightfield, darkfield, interference contrast and polarized light. In addition, users can experience improved surface-inspection detail with the built-in oblique illumination technology. Real-color imaging can be obtained at all brightness intensity levels, and a lower cost of ownership is realized with the long lifetime and low power consumption of LED illumination technology.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Filter helps recover 80% of gold in mobile phone scrap



Mobile phone scrap can contain precious metals, such as gold and copper. VTT Technical Research Centre of Finland has developed a biological filter made of mushroom mycelium mats enabling recovery of as much as 80% of the gold in electronic scrap. Extraction of copper from circuit board waste, on the other hand, can be enhanced significantly by flotation of the crushed and sieved material.

Although research into biological methods is active, they are still rarely used in metal recovery chains. In the European "Value from Waste" project, VTT developed both biological and mechanical pre-treatment methods for more efficient recovery of precious metals from electronic waste. Other methods developed by the researchers included recovery of gold from dissolved materials by biosorption and extraction, using as few harmful chemicals as possible.

Fungi catch gold and filter out impurities

VTT has developed a method that harnesses biosorbents, such as fungal and algae biomass, for the recovery of precious metals converted to a solution. In VTT tests, more than 80% of the gold in the solution adhered to the biomass, compared with only 10–20% of the harmful process chemicals.

The uniqueness of the method lies in the structure of the biomass. Different filament structures can be formed, for example, into biological filters, which makes further industrial processing of precious metals easier.

Gold also separates well in liquid-liquid extraction

The project developed a method with high extraction capacity for gold recovery, using the newest environmentally friendly extraction reagents. In VTT experiments, it was possible to recover more than 90% of the metal dissolved from a circuit board into solution, with the help of a functional ionic liquid. The method facilitates extraction of the desired components away from impurities.



The new pre-treatment methods developed by VTT allow separation of most plastics and ceramics from the waste. In VTT experiments, cell phones were crushed and the particles sieved and separated, magnetically and by eddy current, into a circuit-board fraction. Treating this once more by crushing, sieving and flotation resulted in a fraction with a high concentration of valuable metals for solution-extraction experiments. Flotation raised the copper content of the circuit-board fraction from 25% to 45%, while the gold content increased by a factor of 1.5.
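
Those flotation numbers are internally consistent, as a simple mass balance on a hypothetical 100 kg of feed shows:

feed_kg = 100.0
cu_feed, cu_conc = 0.25, 0.45  # copper grade before and after flotation

# If essentially all of the copper reports to the concentrate:
conc_kg = feed_kg * cu_feed / cu_conc
print(f"Concentrate: {conc_kg:.1f} kg, upgrade ceiling: {feed_kg / conc_kg:.2f}x")
# ~55.6 kg and a 1.8x ceiling; the reported 1.5x rise in gold grade then
# implies that roughly 1.5 / 1.8 = 83% of the gold reported to the concentrate.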

"Value from Waste" project

The growth of cleantech industry, the rise in the world market prices of metals, and concentration of metal production in China have resulted in a situation in which extraction of several metals from waste streams has become advisable even in Finland. Ever stricter recycling and utilisation rates for electronic waste are also pushing the development of recycling technologies. The purpose of the EU project "Value from Waste" was to develop recovery processes on a more sustainable basis, to clean materials of impurities that reduce opportunities for further use, and to increase the amount of recovered materials.

The methods developed in the project included mechanical pre-treatment, solution extraction, use of biological methods, and optimisation of treatment chains. The new treatment methods will enable the metal refining industry to use cleaner electronic waste in larger amounts.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Solar Uses Less Silver



Solar power provides just a small percentage of total energy usage in Europe, and some think that's due to cost. What if solar panels were rendered more cost-effective by using fewer precious metals in their manufacture? Swiss researchers are working on panels that use more copper and less silver.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Protecting Critters from Power Lines



Ameren Illinois is just one of many utilities instituting programs to help reduce the number of animals electrocuted on power lines. Not only is it a tragedy to see wildlife killed needlessly, but steeper fines are also being imposed on utilities for the deaths. Midwest Energy News shares what steps Ameren and others are taking to make things safer.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

This Foundation Really Sucks



As Dong Energy builds its Borkum Riffgrund 1 wind farm, the company aims to try out a new installation technology: suction buckets. By using a vacuum-assisted installation method rather than conventional monopiles to place turbines in the water, the company can use higher-wattage turbines for the farm. The technique may spread to other projects if it's successful.

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

NASA Looks to Go Beyond Batteries for Space Exploration



NASA is seeking proposals for the development of new, more capable, energy storage technologies to replace the battery technology that has long powered America's space program.

The core technologies solicited in the Wednesday call for proposals will advance energy storage solutions for the space program and for other government agencies, such as the Department of Energy's Advanced Research Projects Agency-Energy (ARPA-E), through ongoing collaboration between NASA and industry.

"NASA is focusing on creating new advanced technologies that could lead to entirely new approaches for the energy needs of the agency's future Earth and space missions," said Michael Gazarik, associate administrator for space technology at NASA Headquarters in Washington. "Over the next 18 months, NASA's Space Technology Mission Directorate will make significant new investments that address several high priority challenges for achieving safe and affordable deep-space exploration. One of these challenges, advanced energy storage, offers new technology solutions that will address exploration and science needs while adding in an important and substantive way to America's innovation economy."

NASA's solicitation has two category areas: "High Specific Energy System Level Concepts," which will focus on cell chemistry and system-level battery technologies, such as packaging and cell integration; and "Very High Specific Energy Devices," which will focus on energy storage technologies that can go beyond the current theoretical limits of lithium batteries while maintaining the cycle life and safety characteristics demanded of energy storage systems used in space applications.

Proposals will be accepted from NASA centers and other government agencies, federally funded research and development centers, educational institutions, industry and nonprofit organizations. NASA expects to make approximately four awards for Phase I of the solicitation, ranging in value up to $250,000 each.

Through solicitations and grants, NASA's investments in space technology provide the transformative capabilities to enable new missions, stimulate the economy, contribute to the nation's global competitiveness, and inspire the next generation of scientists, engineers, and explorers.

The Advanced Energy Storage Systems Appendix is managed by the Game Changing Development Program within NASA's Space Technology Mission Directorate (STMD), and is part of STMD's NASA Research Announcement "Space Technology Research, Development, Demonstration, and Infusion 2014" (SpaceTech-REDDI-2014) for research in high priority technology areas of interest to NASA.

The SpaceTech-REDDI-2014-14GCDC1 Advanced Energy Storage Systems Appendix is available through the NASA Solicitation and Proposal Integrated Review and Evaluation System at:

http://go.nasa.gov/ru9LgH

NASA's Langley Research Center in Hampton, Va., manages the Game Changing Development Program for STMD. STMD remains committed to developing the critical technologies required to enable future exploration missions beyond low-Earth orbit. The directorate continues to solicit the help of the best and brightest minds in academia, industry, and government to drive innovation and enable solutions in a myriad of important technology thrust areas. These planned investments are addressing high priority challenges for achieving safe and affordable deep-space exploration.

http://www.nasa.gov/spacetech

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Cree is the first to break the 300-lumens-per-watt barrier



Cree has reached another major milestone in the LED industry, achieving 303 lumens per watt for a white LED. This result, achieved faster than was believed possible, beats the record of 276 lumens per watt set only a little over a year ago.

Cree measured the LED efficacy of 303 lumens per watt at a color temperature of 5150 K and a drive current of 350 mA. The results were obtained at standard ambient temperature.
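
For context, the absolute ceiling on luminous efficacy is 683 lm/W (monochromatic green light at 555 nm), and practical white light sits well below that ceiling; the comparison below is ours, not Cree's:

record_lm_per_w = 303
theoretical_max = 683  # lm/W, monochromatic 555 nm light
print(f"{record_lm_per_w / theoretical_max:.0%} of the absolute limit")  # ~44%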



Marco La Rosa ha detto...

DA DOTT. COTELLESSA

What is Magnetic Refrigeration?



Resisting most attempts to significantly improve efficiency, the refrigerator remains the household's biggest electricity consumer, or close to it. GE, however, is looking to break the efficiency barrier with magnets. They call their technology "magnetocaloric refrigeration" and hope to bring it to market by the decade's end.



Your next fridge could keep cold more efficiently using magnets

The fridge is the most common of common household appliances. Despite improvements in efficiency over the years, fridges remain among the biggest users of electricity in the home, relying on a chemical refrigerant and a compressor to transfer heat from the inside to the outside of the fridge. GE researchers have now developed a new type of refrigeration technology using magnets that is more environmentally friendly and is predicted to be 20 to 30 percent more efficient than current technology ... and it could be in household fridges by the end of the decade.

Magnetic refrigeration is not a new idea. Ever since German physicist Emil Warburg observed in the 1880s that certain materials changed temperature when exposed to a changing magnetic field – known as the magnetocaloric effect – there have been efforts to create refrigerators based on the technique.

Such magnetic refrigeration systems were developed as far back as the 1930s, and researchers at the Los Alamos National Laboratory (LANL) in New Mexico successfully achieved a few degrees of refrigeration in the 1980s. However, the technology has failed to make it into household refrigerators as it relies on superconducting magnets that themselves need to be cooled to extremely low temperatures, making it not cost- or energy-efficient for household use.

GE teams in the US and Germany turned their collective efforts to the task a decade ago and built a cascade from special magnetic materials. Each step of the cascade lowered the temperature slightly, but after five years of work they were only able to achieve cooling of just 2 °F with a prototype that Michael Benedict, design engineer at GE Appliances, describes as a "huge machine."

A breakthrough then came courtesy of the research team's materials scientists, who developed a new type of nickel-manganese alloy for magnets that could function at room temperature. By arranging these magnets in a series of 50 cooling stages, the team has managed to reduce the temperature of a water-based fluid flowing through them by 80 °F, with a device that is, according to Benedict, "about the size of a cart."
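
Simple arithmetic on the cascade (both figures come from the article) shows how modest each stage's contribution is:

stages, total_drop_f = 50, 80.0
print(f"Average drop per stage: {total_drop_f / stages:.1f} degF")  # 1.6 degF
# Compare the earlier prototype, whose whole cascade managed only a 2 degF drop.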

"Nobody in the world has done this type of multi-stage cooling,” said Venkat Venkatakrishnan, a leader of the research team. "We believe we are the first people who shrunk it enough so that it can be transported and shown. We were also the first to go below freezing with the stages."

The team has demonstrated the system for experts from the Department of Energy (DoE), White House staffers and the EPA and is now working to further refine the technology. They hope to achieve a 100° F drop in temperature at low power, with the ultimate goal of replacing current refrigerator technology, possibly before the end of the decade.

"We’ve spent the last 100 years to make the current refrigeration technology more efficient,” said Venkatakrishnan. "Now we are working on technology for the next 100 years."



Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Make the world your office



It could be as simple as negotiating an elaborate deal with a businessman — even when you are sitting in an office on the other side of the world. Or it could be as complex as a hospital’s chief cardiologist making final checks before surgery — for a patient lying in an isolated village 500 miles away. Or it could be as personal as making sure you see your children when they get home from school — even though you are in a different time zone.



In the next few years, a technological leap in video communication is about to change the way we interact, work and live our lives.

Software such as Skype, which boasted 299 million users in June 2013, and Apple's FaceTime have already transformed personal relationships, meaning that families and friends can stay in touch from anywhere in the world.

But now teleconferencing — and its modern appellation, telepresence — is beginning to change our professional lives, too.

Working remotely
The last few years have already seen a sharp rise in “remote working” — employees working from home. According to a 2012 Ipsos/Reuters poll, around one in five people in the world frequently “telecommute” to work, and nearly 10% work remotely from home every day.

Meanwhile, companies expanding around the globe face pressure from shareholders and regulators to minimise staff travel between offices for meetings, enabling more energy-efficient choices and reducing expenses.

A proliferation of better-quality, cost-effective technology addresses this need. The TP3200 video conferencing system, for example, was recently launched by communications technology provider Huawei Enterprise. It is designed especially for group meetings and uses a specialised image-processing device, a so-called co-optical centre camera, to provide the world's first panoramic telepresence system.

“HD (high-definition)-resolution cameras and reliable broadband connectivity have vastly improved the teleconferencing experience,” says Jack He Liang, video conferencing director at the firm. “More people, and more employers, are beginning to see the benefits of video communication.

“Not only can it reduce costs, improve efficiency and save on human resources, but video conferencing also makes it easier to convey your idea. When you can communicate with your eyes and face, both sides [of the meeting] feel more confident.”

Teleconferencing no longer requires participants to submit to a fixed system of technical standards. Instead, many systems now utilise what is called BYOD.

“It’s short for ‘Bring Your Own Device,’” says He Liang. “The system recognises that many workers already own tablets, smartphones and laptops with cameras — and integrates them.”

The result is a step change in how we see, and carry out, our jobs.

PART TWO FOLLOWS

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

So it is little surprise that global businesses are expected to spend $3.75bn (£2.3bn) on videoconferencing technology by 2016. Smart 2020, a study commissioned by the Global e-Sustainability Initiative (a programme launched in 2001 to promote sustainable development in the ICT sector), found that US and UK businesses can save almost $19bn (£11.7bn) as a result of deploying 10,000 telepresence units by 2020.

By cancelling the need for long- and short-haul executive flights between offices, the same study suggested that teleconferencing technology and other virtual tools could reduce global annual greenhouse gas emissions by 15% by the year 2020.

But that is just the start. The next generation of video telepresence systems aims to fit even more seamlessly into our lives — by adding realistic 3D capabilities.

The third dimension
The technology, known as holographic telepresence, is more Star Trek than staff meeting. Instead of a flat screen, a three-dimensional moving image of a user is reproduced at each meeting location.

Currently the effect is not a true hologram. The technology company Musion, based in the UK, adapts an illusion commonly used in theatres and theme parks known as the Pepper's ghost effect. An HD projector illuminates a thin, effectively invisible, sheet-like “foil” from a 45-degree angle, creating a 3D image almost indistinguishable from an actual person. The Musion TelePresence system can now transmit full-sized people and objects in real-time “without any significant delay in communication”, the company says.

The system achieved a Guinness World Record in 2012 by helping Indian politician Narendra Modi deliver a 55-minute campaign speech to audiences in 53 different locations simultaneously. And in the entertainment world, Musion was one of three companies credited for digitally resurrecting rapper Tupac Shakur onstage at the 2012 Coachella festival.

But true holograms might not be far away. Leia Display System is currently working on an alternative technology. The Polish company has built a holographic room, measuring 3m (10ft) by 2.5m (8ft), which uses laser projectors to beam 3D images onto a thin cloud of water vapour — providing not only a giant 3D multi-touch screen, but also the ability to walk through the images and see them from any point of view.

PART THREE FOLLOWS

Marco La Rosa ha detto...

DA DOTT. COTELLESSA

Holographic telepresence like this has obvious potential beyond the business world. Healthcare is already being revolutionised by telemedicine, allowing patients to be treated remotely, whether in isolated areas or on a distant battlefield.

Hologram technology could also drive advances in areas such as education, enhanced movies, television programming, advertising, gaming, 3D mapping, aerospace navigation and robot control.

Still other improvements could be imminent. The Massachusetts Institute of Technology (MIT) in the US is examining how physical surfaces can be manipulated by gestures, with objects being resized, reshaped or moved remotely by people thousands of miles away. This offers the potential for virtual offices, where hundreds or even thousands of people could collaborate on a product without ever touching it.
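
How such remote manipulation might be wired together can be shown with a toy message protocol (a sketch under invented names, not MIT's actual system): one site serializes a gesture, and the other applies it to a shared object's transform.

import json

# Toy remote-manipulation protocol (hypothetical, for illustration only).
def encode_gesture(dx=0.0, dy=0.0, scale=1.0):
    """Serialize a gesture made at the remote site."""
    return json.dumps({"dx": dx, "dy": dy, "scale": scale})

def apply_gesture(obj, message):
    """Apply a received gesture to a local object's position and size."""
    g = json.loads(message)
    obj["x"] += g["dx"]
    obj["y"] += g["dy"]
    obj["w"] *= g["scale"]
    obj["h"] *= g["scale"]
    return obj

shape = {"x": 0.0, "y": 0.0, "w": 2.0, "h": 1.0}
print(apply_gesture(shape, encode_gesture(scale=0.5)))   # resized from afar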

Researchers at the University of Tokyo are also working on adding haptic feedback (vibrations, for example) to holographic projections by using ultrasound waves. A user can touch and interact with a hologram and receive tactile responses as if the holographic object were real.
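
The underlying principle is phased-array focusing: each transducer is driven with a phase offset that cancels its path-length difference to the focal point, so every wave arrives in step and the acoustic pressure peaks there. A minimal sketch (array layout and distances are illustrative assumptions, not the Tokyo group's hardware):

import numpy as np

# Phased-array focusing, the principle behind mid-air ultrasound haptics.
c, f = 343.0, 40_000.0            # speed of sound (m/s), 40 kHz transducers
wavelength = c / f

xs = np.arange(16) * 0.01         # 16 x 16 grid, 1 cm pitch, in the z = 0 plane
grid = np.array([(x, y, 0.0) for x in xs for y in xs])

focus = np.array([0.08, 0.08, 0.20])        # focal point 20 cm above the array
dists = np.linalg.norm(grid - focus, axis=1)

# Drive each element with a phase that cancels its extra path length,
# so all waves arrive at the focus in phase and the pressure adds up.
phases = (-2 * np.pi * dists / wavelength) % (2 * np.pi)
print(phases.round(2)[:4])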

In an IBM survey of 3,000 researchers, respondents named holographic video calls as one of the five technologies they expected to see in place by 2015.

There are obstacles to using these technologies, of course. Not least is the cost of holographic technology: Musion currently rents its system for around $65,000 (£40,000). Live telepresence also needs a fast, direct connection of 10-20 megabits per second (Mbps), as the video streams are of higher quality.
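
That 10-20 Mbps figure is consistent with a quick estimate of compressed HD video (a sketch; the codec ratio is an assumption):

# Why telepresence needs tens of megabits per second (illustrative numbers).
width, height, fps, bits_per_pixel = 1920, 1080, 30, 24
raw_bps = width * height * fps * bits_per_pixel
print(f"Raw 1080p video: {raw_bps / 1e6:,.0f} Mbps")     # ~1,493 Mbps
# An H.264-class codec at roughly 100:1 compression brings this down to:
print(f"Compressed: {raw_bps / 100 / 1e6:,.1f} Mbps")    # ~14.9 Mbps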

“But thanks to 4G and LTE networks — and 5G coming soon — bandwidth is no longer a problem,” says Huawei Enterprise’s He Liang.

“[Telepresence] will mean that we don’t have to work in an office,” he explains. “That we don’t have to travel miles to be treated by a doctor. That we can interact with our families from a different country. It will begin to change the world we live in.”

Marco La Rosa said...

FROM DR. COTELLESSA

Brilliant Use for a Bottle



Designed as a sustainable replacement for kerosene lighting in low-income areas, the Lightie consists of an LED, a copper-indium-gallium-selenide solar cell, and batteries inserted into a 2 L soft-drink bottle. Gizmag notes the device provides 8 hours of light at 120 lumens after 5-8 hours of sunlight exposure. The South African inventor expects to offer the lamp-in-a-bottle for US $13.
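
The claimed runtime holds up to simple energy arithmetic (a sketch; the LED efficacy and charge figures below are assumptions, not the maker's specifications):

# Energy budget of the Lightie (all efficiency figures are assumptions).
lumens, efficacy = 120.0, 100.0          # lm, lm/W (plausible small LED)
led_power_w = lumens / efficacy          # ~1.2 W draw
energy_wh = led_power_w * 8.0            # 8 h runtime -> ~9.6 Wh stored
panel_w = energy_wh / 6.5                # charged in ~6.5 h of sun -> ~1.5 W
print(f"{energy_wh:.1f} Wh battery, ~{panel_w:.1f} W solar cell (ignoring losses)")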

Marco La Rosa said...

FROM DR. COTELLESSA

Cracking the Code of Artificial Photosynthesis



Artificial photosynthesis offers great promise in producing renewable fuels without accelerating global climate change, but scientists must first develop electrocatalysts that can efficiently and economically carry out the water oxidation reaction critical to the process. Toward that goal, chemists at Lawrence Berkeley National Laboratory have uncovered two intermediate steps in water oxidation using cobalt oxide, an earth-abundant solid catalyst. Vital to this work: a spectroscopic technique known as rapid-scan Fourier transform infrared (FTIR) spectroscopy.
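
For reference, the half-reaction the catalyst must drive is the four-electron oxidation of water:

2 H₂O → O₂ + 4 H⁺ + 4 e⁻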

Marco La Rosa said...

FROM DR. COTELLESSA

Optical Lab on a Chip



Scientists from UCLA and Switzerland's EPFL have designed a stapler-sized device that could quickly analyze up to 170,000 different molecules in a blood sample. Instead of analyzing a biosample by looking at the spectral properties of the sensing platforms, this new technique uses changes in the intensity of the light to do on-chip imaging. The method could simultaneously identify insulin levels, cancer and Alzheimer's markers, and even certain viruses.
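
In code terms, intensity-based sensing reduces to comparing each sensor spot against a baseline image (a toy illustration with made-up numbers, not the UCLA/EPFL device itself):

import numpy as np

# Toy intensity-based readout: a binding event dims one spot on the chip,
# and a plain image sensor catches the change -- no spectrometer needed.
baseline = np.full((4, 4), 100.0)    # transmitted intensity per sensor spot
sample = baseline.copy()
sample[1, 2] -= 7.0                  # a molecule binds, dimming spot (1, 2)

delta = baseline - sample
print(np.argwhere(delta > 3.0))      # -> [[1 2]]: the spot that changed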

Marco La Rosa said...

FROM DR. COTELLESSA

Building-integrated Photovoltaics Break New Ground



Photovoltaics specialist Belectric OPV GmbH and construction paint and insulation system manufacturer Deutsche Amphibolin-Werke SE (DAW) are jointly developing building-integrated photovoltaic (BIPV) products incorporating organic photovoltaics (OPV). The aim of the partners is to develop finished construction elements capable of generating electricity, thereby breaking new ground in the field of BIPVs. In essence, it represents the next key step in generating solar power from buildings.

Marco La Rosa said...

FROM DR. COTELLESSA

Robotic futures: The rise of the hospital robot



The RP-Vita is a robot which can be manoeuvred around hospital wards from afar.

It can make a diagnosis by looking into a patient's eyes and monitor breathing with a digital stethoscope. It also projects the doctor's face and voice so that a patient can interact with them.

The robot is now installed in a handful of hospitals across North America. But can a $60,000 (£36,000)-a-year device ever replace a doctor's bedside manner?

The BBC's North America technology correspondent Richard Taylor was given a demonstration by Colin Angle, founder of the machine's manufacturer iRobot.
