1000 results for Matlab applications
Abstract:
The increasing interest in wireless sensor networks can be readily understood simply by considering what they essentially are: a large number of small, self-powered sensing nodes which gather information or detect special events and communicate wirelessly, with the end goal of handing their processed data to a base station. The sensor nodes are densely deployed inside the area of interest, can be deployed at random, and have cooperative capabilities. Usually these devices are small and inexpensive, so that they can be produced and deployed in large numbers; in turn, their resources in terms of energy, memory, computational speed and bandwidth are severely constrained. Sensing, processing and communication are three key elements whose combination in one tiny device gives rise to a vast number of applications. Sensor networks provide endless opportunities, but at the same time pose formidable challenges, such as the fact that energy is a scarce and usually non-renewable resource. However, recent advances in low-power Very Large Scale Integration, embedded computing, communication hardware and, in general, the convergence of computing and communications are making this emerging technology a reality. Likewise, advances in nanotechnology and Micro-Electro-Mechanical Systems are pushing toward networks of tiny distributed sensors and actuators. There are different kinds of sensors, such as pressure sensors, accelerometers, cameras, thermal sensors and microphones. They monitor conditions at different locations, such as temperature, humidity, vehicular movement, lighting conditions, pressure, soil makeup, noise levels, the presence or absence of certain kinds of objects, mechanical stress levels on attached objects, and momentary characteristics such as the speed, direction and size of an object. The state of Wireless Sensor Networks will be surveyed and the best-known protocols reviewed. As Radio Frequency Identification (RFID) is becoming increasingly widespread and important, it will be examined as well. RFID has a crucial role to play for businesses and individuals alike going forward. The worldwide impact of wireless identification is exerting strong pressure on RFID technology, research and development services, standards development, security and privacy compliance, and more. Its economic value has been proven in some countries, while others are only at the planning or pilot stages; wider usage has yet to take hold through the modernisation of business models and applications. Possible applications of sensor networks are of interest to the most diverse fields. Environmental monitoring, warfare, child education, surveillance, micro-surgery and agriculture are only a few examples.
Some real hardware applications in the United States of America will be reviewed, as it is probably the country that has researched most in this area. Universities such as Berkeley, UCLA (University of California, Los Angeles) and Harvard, and companies such as Intel, are leading this research. But the USA is not alone in using and investigating wireless sensor networks. The University of Southampton, for example, is developing technology to monitor glacier behaviour using sensor networks, contributing to fundamental research in both glaciology and wireless sensor networks. Coalesenses GmbH (Germany) and ETH Zurich are likewise applying wireless sensor networks in many different areas. A Spanish solution will be the one examined most thoroughly, being innovative, adaptable and multipurpose. This study of the sensor focuses mainly on traffic applications, but the compilation of more than 50 different applications published by the sensor's manufacturer should not be forgotten. Currently there are many vehicle surveillance technologies, including loop sensors, video cameras, image sensors, infrared sensors, microwave radar, GPS, etc. Their performance is acceptable but not sufficient, because of their limited coverage and the high costs of implementation and, especially, maintenance. They have shortcomings such as line-of-sight restrictions, low accuracy, strong dependence on environment and weather, inability to operate continuously day and night or to undergo maintenance without interrupting measurements, and high installation and maintenance costs. Consequently, in real traffic applications the received data are insufficient or poor in real-time terms, owing to the limited number of detectors and their cost. With the increase of vehicles in urban road networks, vehicle detection technologies face new requirements. Wireless sensor networks are a state-of-the-art technology and a revolution in remote information sensing and collection applications, with broad prospects in intelligent transportation systems. An application for target tracking and counting using a network of binary sensors has been developed; it allows each sensor to spend much less energy when transmitting information and makes the devices more autonomous, enabling better traffic control. The application focuses on the efficacy of collaborative tracking rather than on the communication protocols used by the sensor nodes. Holiday departure and return traffic is a good example of why it is necessary to keep count of the cars on the roads. To this end, a Matlab simulation for target tracking and counting with a network of binary sensors has been produced; it could, for example, be implemented on the sensor developed by Libelium, the company whose solution is examined in depth here. The promising results obtained indicate that binary proximity sensors can form the basis of a robust architecture for wide-area surveillance and target tracking. When the target paths are smooth enough, with no abrupt changes of trajectory, the ClusterTrack particle filter algorithm gives excellent performance in identifying and tracking different target trajectories. This algorithm could, of course, be used for many other applications, and this line of work could be pursued in future research. It is not surprising that binary proximity sensor networks have attracted a lot of attention lately.
Despite the minimal information a single binary proximity sensor provides, networks of this sensing modality can track many different classes of targets with sufficient accuracy.
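As an illustration of the counting idea behind such a simulation, the following minimal sketch clusters the firing sensors of a binary proximity grid to estimate how many targets are present. It is a stand-in written for this summary, not Libelium's code or the ClusterTrack algorithm; the grid geometry, sensing radius and target positions are all invented, and it needs only core MATLAB (R2016b or later).

    % Binary proximity sensors on a grid: each one reports a single bit,
    % "target within range or not"; targets are counted by clustering
    % the active sensors into connected groups.
    [sx, sy] = meshgrid(0:10:100);        % 11x11 sensor grid (m)
    sensors  = [sx(:), sy(:)];
    targets  = [23 47; 71 18; 55 90];     % unknown ground truth (m)
    radius   = 12;                        % binary sensing range (m)

    % Distance from every sensor to every target -> one bit per sensor.
    d = sqrt((sensors(:,1) - targets(:,1)').^2 + ...
             (sensors(:,2) - targets(:,2)').^2);
    active = any(d < radius, 2);

    % Two active sensors are assumed to see the same target when they
    % lie within twice the sensing range of each other.
    P = sensors(active, :);
    D = sqrt((P(:,1) - P(:,1)').^2 + (P(:,2) - P(:,2)').^2);
    G = graph(D < 2*radius, 'omitselfloops');
    count = max(conncomp(G));             % number of sensor clusters
    fprintf('Estimated targets: %d (true: %d)\n', count, size(targets, 1));

With well-separated targets the cluster count matches the true target count; closely spaced targets merge into one cluster, which is exactly the ambiguity that more sophisticated trackers such as ClusterTrack address.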
Abstract:
The relationship between medicine and engineering is becoming closer than ever, giving birth to a recently emerged discipline, bioengineering, which is the field this project addresses. This young field is gaining great interest due to the fast development of new technologies that, compared with traditional procedures, enable, facilitate and improve medical diagnosis.
In bioengineering, the fastest-growing field is medical imaging, through which images of the inside of the human body can be obtained non-invasively, with no need for surgery. By means of modalities such as magnetic resonance, X-ray, nuclear medicine or ultrasound, images of the human body can be obtained for diagnosis. For those images to be useful within the medical field, they must be processed properly with digital image processing techniques, and it is in this field of digital medical image processing that this project has been developed. Thanks to digital image processing methods for information extraction, improved visualization and highlighting of features of interest, specialists' diagnoses can be eased and improved. In an age when the automation of processes is much sought after to improve work efficiency, automating image processing to extract information more easily is extremely useful. Currently one of the most powerful tools for medical image processing is Matlab, together with its image processing toolbox; that is why this software was chosen to develop the practical part of this project, as its power and versatility simplify the implementation of algorithms. This project is structured in two parts. In the first, the different modalities for obtaining medical images are described, and the different uses of each method, depending on the field of application, are explained; afterwards, the most important digital image processing techniques used in the project are described. In the second part, four Matlab applications are developed to exemplify the development of medical image processing algorithms. These implementations demonstrate the application and usefulness of the concepts explained in the theoretical part, such as segmentation and spatial filtering operations, as well as other more specific concepts. The example applications developed are: obtaining the metastasis percentage of a tissue sample, diagnosing deformities of the spinal column, obtaining the MTF of a gamma camera, and measuring the area of a fibroadenoma in a breast ultrasound image. Finally, for each application, its usefulness in the field of medical imaging, the results obtained, and its implementation in a graphical user interface for ease of use are detailed.
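As a flavour of the first example (the metastasis percentage of a tissue sample), the sketch below segments a greyscale image by Otsu thresholding and reports the segmented fraction. It assumes the Image Processing Toolbox; the file name, the bright-lesion assumption and the speck-size limit are illustrative, and the project's actual algorithms are not reproduced here.

    % Estimate what percentage of a tissue image a segmented region
    % occupies, via Otsu thresholding (Image Processing Toolbox).
    I = im2double(imread('tissue.png'));   % illustrative file name
    if size(I, 3) == 3, I = rgb2gray(I); end
    level = graythresh(I);                 % Otsu threshold
    mask  = imbinarize(I, level);          % assume lesions are the bright class
    mask  = bwareaopen(mask, 50);          % drop specks under 50 px
    pct   = 100 * nnz(mask) / numel(mask);
    fprintf('Segmented area: %.1f%% of the image\n', pct);
    imshowpair(I, mask, 'montage');        % visual sanity check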
Abstract:
Nowadays, a lot of applications use digital images. For example, face recognition is used to detect and tag people in photographs for security control, and many applications can be found in smart cities, such as speed enforcement on roads and highways and traffic-light cameras that detect drivers running a red light. Digital images are also used in medicine, for example in X-rays, scanners, etc. These applications depend on the quality of the image obtained. A good camera is expensive, and the image obtained also depends on external factors such as light. For these applications to work properly, image enhancement is as important as, for example, a good face detection algorithm. Image enhancement can also be used in everyday photography, for pictures taken in poor light conditions, or simply to improve the contrast of an image. There are smartphone applications that let users apply filters or change the brightness, colour or contrast of their pictures. This project compares four different techniques for image enhancement; after one of these techniques is applied, an image makes better use of the whole available dynamic range. Some of the algorithms are designed for greyscale images and others for colour images. Matlab is used to develop the algorithms and present the final results. The algorithms are the Successive Mean Quantization Transform (SMQT), histogram equalization (using both the built-in Matlab function and our own implementation), and the V transform. Finally, as conclusions, we can show that the histogram equalization algorithm is the simplest of all, yields a wide spread of grey levels, and is not suitable for colour images. The V transform algorithm is a good option for colour images; it is linear and requires little computational power. The SMQT algorithm is non-linear, insensitive to gain and bias, and can extract the structure of the data.
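Since the project implements its own histogram equalization alongside Matlab's built-in histeq, a generic textbook version of that algorithm is sketched below for reference; it is a minimal core-MATLAB implementation, not the project's code.

    function J = myhisteq(I)
        % Histogram equalization for an 8-bit greyscale image I:
        % map each grey level through the normalized cumulative
        % histogram so the output uses the full dynamic range.
        counts = histcounts(double(I(:)), 0:256);  % 256-bin histogram
        cdf    = cumsum(counts) / numel(I);        % cumulative distribution
        lut    = uint8(round(255 * cdf));          % grey-level mapping
        J      = lut(double(I) + 1);               % apply lookup table
    end

A quick check is to run J = myhisteq(I) next to the built-in histeq(I) and compare the two output histograms.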
Abstract:
Magnetic fluid hyperthermia (MFH) is considered a promising therapeutic technique for the treatment of cancer, in which magnetic nanoparticles (MNPs) with superparamagnetic behavior generate mild temperatures under an AC magnetic field to selectively destroy abnormal cancer cells while sparing healthy ones. However, the poor heating efficiency of most MNPs and the imprecise experimental determination of the temperature field during treatment are two of the major drawbacks for its clinical advance. Thus, in this work, different MNPs were developed and tested under an AC magnetic field (~1.10 kA/m and 200 kHz), and the heat they generated was assessed by an infrared camera. The resulting thermal images were processed in MATLAB after thermographic calibration of the infrared camera. The results show the potential of this thermal technique for the improvement and advance of MFH as a clinical therapy.
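The thermographic-calibration step mentioned above can be illustrated with a small core-MATLAB sketch: fit a polynomial from raw infrared-camera counts to reference temperatures, then apply it to a frame. All numbers below are invented for illustration; the actual calibration of the camera used in this work is not reproduced.

    % Quadratic calibration from camera digital counts to temperature.
    counts_ref = [4100 5300 6600 8000 9500];  % camera readings (invented)
    temp_ref   = [25   32   39   46   53];    % reference temperatures (C)
    p = polyfit(counts_ref, temp_ref, 2);     % fit calibration curve

    raw  = 4000 + 6000*rand(240, 320);        % stand-in thermal frame
    Tmap = polyval(p, raw);                   % per-pixel temperature (C)
    fprintf('Hottest pixel: %.1f C\n', max(Tmap(:)));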
Abstract:
Scientific studies multiply day by day, and computer programs make human life easier. Scientists have studied the neural structure of the human brain and modelled it on the computer, naming the result the artificial neural network; with it they hope to solve ever more complex problems. The purpose of this study is to estimate the fuel economy of an automobile engine using an artificial neural network (ANN) algorithm. Engine characteristics were simulated using the “Neuro Solution” software, and the same data were used in MATLAB to compare MATLAB's performance on such a problem and demonstrate its validity. Cylinder count, displacement, power, weight, acceleration and vehicle production year are used as input data, and miles per gallon (MPG) as target data. An artificial neural network model was developed with 70% of the data used for training, 15% for testing and 15% for validation. In creating the model, the number of neurons was selected carefully to increase the speed of the network. Since the problem has a nonlinear structure, multiple layers are used in the model.
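A minimal sketch of such a model with MATLAB's Deep Learning Toolbox (formerly the Neural Network Toolbox) is shown below. The synthetic data, the two-hidden-layer size and the input scaling are illustrative stand-ins, not the study's configuration.

    % Feed-forward network mapping six engine/vehicle attributes to MPG,
    % with the 70/15/15 train/validation/test split described above.
    N = 200;                              % stand-in dataset size
    X = rand(6, N);                       % rows: cylinders, displacement,
                                          % power, weight, acceleration, year
    T = 50 - 20*X(2,:) - 15*X(4,:) + 2*randn(1, N);  % synthetic MPG

    net = fitnet([10 10]);                % two hidden layers (nonlinear fit)
    net.divideParam.trainRatio = 0.70;
    net.divideParam.valRatio   = 0.15;
    net.divideParam.testRatio  = 0.15;
    [net, tr] = train(net, X, T);         % Levenberg-Marquardt by default
    mpgPred = net(X(:, tr.testInd));      % predictions on the held-out split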
Abstract:
Electric vehicles (EVs) and hybrid electric vehicles (HEVs) can reduce greenhouse gas emissions, and the switched reluctance motor (SRM) is one of the most promising motors for such applications. This paper presents a novel SRM fault-diagnosis and fault-tolerant operation solution. Building on the traditional asymmetric half-bridge topology for SRM driving, central-tapped windings of the SRM in a modular half-bridge configuration are introduced to provide fault-diagnosis and fault-tolerance functions, which remain idle under normal conditions. Fault diagnosis is achieved by detecting the characteristics of the excitation and demagnetization currents. An SRM fault-tolerant operation strategy is also realized by the proposed topology, which compensates for the missing phase torque under an open-circuit fault and reduces the unbalanced phase current caused by the uncontrolled faulty phase under a short-circuit fault. Furthermore, the current-sensor placement strategy is discussed, with two placement methods aimed at low cost or a modular structure. Simulation results in MATLAB/Simulink and experiments on a 750-W SRM validate the effectiveness of the proposed strategy, which may have significant implications for improving the reliability of EVs/HEVs.
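The diagnosis principle, reading fault signatures from the phase currents, can be suggested with a deliberately simplified sketch: flag an open-circuit phase whose current never rises while the phase is commanded on. The function and threshold below are illustrative stand-ins, not the paper's detector, which works on the excitation and demagnetization current characteristics of its central-tapped topology.

    function faulty = openPhaseCheck(iPhase, excited, iMin)
        % iPhase  : sampled phase current (A)
        % excited : logical vector, true while the phase is commanded on
        % iMin    : minimum current expected from a healthy phase (A)
        % faulty  : true if the current never exceeds iMin while excited
        faulty = max(abs(iPhase(excited))) < iMin;
    end

    % Usage (illustrative): faulty = openPhaseCheck(iA, gateA > 0, 0.5);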
Abstract:
Electric vehicles (EVs) and hybrid EVs are the way forward for green transportation and for establishing a low-carbon economy. This paper presents a split converter-fed four-phase switched reluctance motor (SRM) drive that realizes flexible integrated charging functions (dc and ac sources). The machine features a central-tapped winding node, eight stator slots, and six rotor poles (8/6). In driving mode, the developed topology has the same characteristics as the traditional asymmetric bridge topology but better fault tolerance. The proposed system supports battery energy balancing and on-board dc and ac charging. When connected to an ac power grid, the proposed topology has the merit of a multilevel converter, and charging-current control is achieved by an improved hysteresis control. The energy flow between the two batteries is balanced by hysteresis control based on their state-of-charge conditions. Simulation results in MATLAB/Simulink and experiments on a 150-W prototype SRM validate the effectiveness of the proposed technologies, which may provide a solution to EV charging issues associated with significant infrastructure requirements.
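Plain hysteresis current control, the principle behind the charging-current regulation described (the paper's improved variant is not reproduced here), can be sketched on a simple RL branch: the switch state flips whenever the current leaves a fixed band around its sinusoidal reference. All plant values below are illustrative.

    % Two-level hysteresis current controller on an RL branch.
    R = 0.5; L = 2e-3; Vdc = 100;         % illustrative plant values
    dt = 1e-6; t = 0:dt:0.04;             % 40 ms at 1 MHz time step
    iref = 10*sin(2*pi*50*t);             % 50 Hz reference current (A)
    band = 0.5;                           % hysteresis half-band (A)
    i = zeros(size(t)); s = 0;            % s = switch state (0/1)
    for k = 1:numel(t)-1
        if     i(k) < iref(k) - band, s = 1;   % too low  -> apply +Vdc
        elseif i(k) > iref(k) + band, s = 0;   % too high -> apply -Vdc
        end
        v = (2*s - 1) * Vdc;                   % bridge output voltage
        i(k+1) = i(k) + dt/L * (v - R*i(k));   % forward-Euler RL update
    end
    plot(t, iref, t, i); xlabel('t (s)'); ylabel('i (A)');

Narrowing the band tightens tracking at the price of a higher switching frequency, which is the basic trade-off any improved hysteresis scheme manages.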
Abstract:
The main focus of this research is to design and develop a high-performance linear actuator based on a four-bar mechanism. The present work includes the detailed analysis (kinematics and dynamics), design, implementation and experimental validation of the newly designed actuator. High performance is characterized by the acceleration of the actuator end effector. The principle of the newly designed actuator is to network four-bar rhombus configurations (where some bars are extended to form an X shape) to attain high acceleration. Firstly, a detailed kinematic analysis of the actuator is presented and its kinematic performance is evaluated through MATLAB simulations. The dynamic equation of the actuator is derived using the Lagrangian formulation, and a SIMULINK control model of the actuator is developed from it. In addition, a Bond Graph methodology is presented for the dynamic simulation; the Bond Graph model comprises individual component models of the actuator along with the control, and the required torque was simulated with it. Results indicate that high acceleration (around 20 g) can be achieved with modest (3 N-m or less) torque input. A practical prototype of the actuator was designed in SOLIDWORKS and then produced to verify the proof of concept; the design goal was to achieve a peak acceleration of more than 10 g at the middle point of the travel length as the end effector covers the stroke length (around 1 m). The actuator is primarily designed to operate standalone and later to be used in a 3RPR parallel robot. A DC motor drives the actuator, with a quadrature encoder attached to the motor to control the end effector. The associated control scheme is analyzed and integrated with the physical prototype. In standalone experimentation, around 17 g acceleration was achieved by the end effector (over a stroke from 0.2 m to 0.78 m), and the results are in good agreement with the developed dynamic model. Finally, a Design of Experiments (DOE) based statistical approach is introduced to identify the parametric combination that yields the greatest performance, with data collected using the Bond Graph model. This approach helps design the actuator without much complexity.
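As a taste of the kinematic analysis such a project runs in MATLAB, the sketch below performs a standard four-bar position analysis (Norton's closed-form solution) for a generic crank-rocker; the link lengths are illustrative and do not represent the rhombus/X configuration of the actuator itself.

    % Rocker angle th4 as a function of crank angle th2 for a four-bar
    % linkage with crank a, coupler b, rocker c and ground d.
    a = 0.04; b = 0.12; c = 0.08; d = 0.10;   % Grashof crank-rocker (m)
    th2 = linspace(0, 2*pi, 361);             % crank angle sweep
    K1 = d/a; K2 = d/c; K3 = (a^2 - b^2 + c^2 + d^2) / (2*a*c);
    A  = cos(th2) - K1 - K2*cos(th2) + K3;
    B  = -2*sin(th2);
    C  = K1 - (K2 + 1)*cos(th2) + K3;
    th4 = 2*atan2(-B - sqrt(B.^2 - 4*A.*C), 2*A);  % open-branch solution
    plot(rad2deg(th2), rad2deg(th4));
    xlabel('\theta_2 (deg)'); ylabel('\theta_4 (deg)');

Differentiating the same loop-closure equations gives the velocity and acceleration analysis, which is where an acceleration-oriented design such as this one lives.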
Abstract:
Highlights of Data Expedition:
• Students explored daily observations of local climate data spanning the past 35 years.
• Topological Data Analysis (TDA) provides cutting-edge tools for studying the geometry of data in arbitrarily high dimensions.
• Using TDA tools, students discovered intrinsic dynamical features of the data and learned how to quantify periodic phenomena in a time series.
• Since nature invariably produces noisy data which rarely has exact periodicity, students also considered the theoretical basis of almost-periodicity and even invented and tested new mathematical definitions of almost-periodic functions.
Summary: The dataset we used for this data expedition comes from the Global Historical Climatology Network. “GHCN (Global Historical Climatology Network)-Daily is an integrated database of daily climate summaries from land surface stations across the globe.” Source: https://www.ncdc.noaa.gov/oa/climate/ghcn-daily/ We focused on the daily maximum and minimum temperatures from January 1, 1980 to April 1, 2015 collected at RDU International Airport. Through a guided series of exercises designed to be performed in Matlab, students explore these time series, initially by direct visualization and basic statistical techniques. Students are then guided through a special sliding-window construction which transforms a time series into a high-dimensional geometric curve. These high-dimensional curves can be visualized by projecting down to lower dimensions (Figure 1); our focus here, however, was to use persistent homology to study the high-dimensional embedding directly. The shape of these curves carries meaningful information, but how one describes the “shape” of data depends on the scale at which the data is considered, and choosing the appropriate scale is rarely obvious. Persistent homology overcomes this obstacle by allowing us to quantitatively study geometric features of the data across multiple scales. Through this data expedition, students are introduced to numerically computing persistent homology using the Rips collapse algorithm and to interpreting the results. In the specific context of sliding-window constructions, 1-dimensional persistent homology can reveal the nature of periodic structure in the original data. I created a special technique to study how these high-dimensional sliding-window curves form loops in order to quantify the periodicity. Students are guided through this construction and learn how to visualize and interpret this information. Climate data is extremely complex (as anyone who has suffered from a bad weather prediction can attest), and numerous variables play a role in determining our daily weather and temperatures. This complexity, coupled with the imperfections of measuring devices, results in very noisy data, which makes the annual seasonal periodicity far from exact. To this end, I have students explore existing theoretical notions of almost-periodicity and test them on the data. They find that some existing definitions are inadequate in this context, so I challenged them to invent new mathematics by proposing and testing their own definitions; the students rose to the challenge and suggested a number of creative definitions. While autocorrelation and spectral methods based on Fourier analysis are often used to explore periodicity, the construction here provides an alternative paradigm for quantifying periodic structure in almost-periodic signals using tools from topological data analysis.
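The sliding-window construction at the heart of the expedition can be sketched in a few lines of core MATLAB: embed the series as rows of a window matrix, then project the resulting point cloud to 2-D with an SVD, where periodicity shows up as a loop. The synthetic signal and the window/delay parameters below are illustrative; the expedition itself runs persistent homology on the full high-dimensional cloud rather than on a projection.

    % Sliding-window (time-delay) embedding of a noisy periodic signal.
    t = linspace(0, 20*pi, 2000);
    x = sin(t) + 0.3*randn(size(t));          % stand-in for temperature data
    d = 30; tau = 4;                          % window dimension and delay
    n = numel(x) - (d-1)*tau;
    SW = zeros(n, d);
    for k = 1:n
        SW(k, :) = x(k : tau : k + (d-1)*tau);  % one window per row
    end
    SW = SW - mean(SW, 1);                    % center the point cloud
    [~, ~, V] = svd(SW, 'econ');
    Y = SW * V(:, 1:2);                       % top two principal axes
    plot(Y(:, 1), Y(:, 2), '.');              % periodicity appears as a loop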
Abstract:
The evolution of cellular systems towards third generation (3G) or IMT-2000 shows a tendency to adopt W-CDMA as the standard access method, as ETSI decisions have shown. However, questions remain about the capacity improvements and the suitability of this access method. One of the aspects worrying developers and researchers planning the third generation is the growing use of the Internet and increasingly bandwidth-hungry applications. This work shows the performance of a W-CDMA system simulated on a PC using coverage maps generated with DC-Cell, a GIS-based planning tool developed by the Technical University of Valencia, Spain. The maps are exported to MATLAB and used in the model. The system studied consists of several microcells in a downtown area. We analyse the interference from users in the same cell and in adjacent cells and its effect on the system, assuming perfect power control in each cell. The traffic generated by the simulator is voice and data. This model allows us to work with more accurate coverage and is a good approach for analysing the multiple access interference (MAI) problem in microcellular systems with irregular coverage. Finally, we compare the results obtained with the performance of a similar system using TDMA.
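The multiple-access-interference analysis in such a study rests on the classic CDMA uplink relation Eb/I0 = PG / (v (N-1)(1+f)) under perfect power control, with other-cell interference folded into the factor f. The sketch below evaluates it for illustrative UMTS-like numbers; it is a back-of-the-envelope stand-in, not the map-based simulator described.

    % Uplink Eb/I0 versus number of same-cell voice users.
    Rb = 12.2e3;  W = 3.84e6;       % voice bit rate and chip rate (Hz)
    PG = W / Rb;                    % processing gain (about 315)
    f  = 0.55;                      % other-cell interference ratio
    v  = 0.5;                       % voice activity factor
    N  = 2:80;                      % users per cell
    EbI0 = PG ./ (v * (N - 1) * (1 + f));
    plot(N, 10*log10(EbI0)); grid on;
    xlabel('Users per cell'); ylabel('E_b/I_0 (dB)');
    % Capacity is the largest N that keeps Eb/I0 above the receiver target.

Irregular microcell coverage changes f from cell to cell, which is exactly what the GIS-generated maps let the simulator capture.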
Abstract:
Solar irradiation is the main input for producing electricity from solar energy: using solar radiation, a photovoltaic (PV) cell produces electricity. When obstacles come between the sun and a PV cell, the cell does not receive enough irradiance to produce full output, so shadowing has a great impact on a photovoltaic cell, and several papers have already shown that a shadow on a PV cell decreases its output. Different kinds of shadow effects are observed: some merely reduce the PV cell output, while others reduce it to zero, so shadows can be classified by their effect on the cell. The effect also depends on whether the PV cells are connected in series or in parallel: in series, when one cell fails the whole string of PV cells stops working, whereas in parallel, if one cell is damaged the others keep working because they operate independently. The arrangement of the PV cells in series or in parallel is chosen according to the output requirements. A Simulink model is built for the series and the parallel connection of two PV cells, and the shadow effect on one of the cells is analyzed by slowly decreasing its irradiance. Simulating the shadow effect in this way gives an idea of the output of a PV system when a shadow falls on its cells; here, the shadow effect on two PV cells in series and in parallel combinations is simulated and analyzed to understand its impact on the output.
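The series-versus-parallel comparison can be suggested with an ideal single-diode cell model in plain MATLAB (the work itself uses Simulink): one cell at full sun, one shaded, and no bypass diodes. All cell parameters below are illustrative.

    % Two-cell P-V curves under partial shading, ideal single-diode model.
    Isc = 3.0; I0 = 1e-9; n = 1.3; Vt = 0.02585;  % per-cell constants
    G   = [1.0, 0.3];               % irradiance: full sun vs shaded
    Iph = Isc * G;                  % photocurrent scales with irradiance

    % Parallel: cells share the voltage, currents add.
    V  = linspace(0, 0.65, 200);
    Ip = max(Iph(1) - I0*(exp(V/(n*Vt)) - 1), 0) + ...
         max(Iph(2) - I0*(exp(V/(n*Vt)) - 1), 0);

    % Series: cells share the current, voltages add; without a bypass
    % diode the string current is capped by the shaded cell.
    I  = linspace(0, 0.999*Iph(2), 200);
    Vs = n*Vt*log((Iph(1) - I)/I0 + 1) + n*Vt*log((Iph(2) - I)/I0 + 1);

    plot(V, V.*Ip, Vs, Vs.*I); grid on;
    xlabel('Voltage (V)'); ylabel('Power (W)');
    legend('parallel (2 cells)', 'series (2 cells)');

The plot makes the headline result visible: in series the shaded cell throttles the whole string, while in parallel the healthy cell keeps delivering its own power.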
Abstract:
Rail transportation has significant importance in the future world. This importance is tightly bound to accessible, sustainable, efficient and safe railway systems. Precise positioning in railway applications is essential for increasing railway traffic, train-track control, collision avoidance, train management and autonomous train driving; hence, precise train positioning is a safety-critical application. Nowadays, positioning in railway applications depends heavily on a cellular system called GSM-R, a railway-specific version of the Global System for Mobile Communications (GSM). However, GSM-R is a relatively outdated technology and does not provide the capacity and precision demanded by future railway networks. One option for positioning is mounting Global Navigation Satellite System (GNSS) receivers on trains as a low-cost solution; nevertheless, GNSS cannot provide continuous service because of signal interruption in harsh environments, tunnels, etc. Another option is to exploit cellular positioning methods. The most recent cellular technology, 5G, provides the high network capacity, low latency, high accuracy and high availability suitable for train positioning. In this thesis, an approach to 5G-based positioning for railway systems is discussed and simulated, using the Observed Time Difference of Arrival (OTDOA) method and the 5G Positioning Reference Signal (PRS). Simulations are run in MATLAB, extending existing 5G positioning code with Non-Line-of-Sight (NLOS) link detection and base-station exclusion algorithms. A performance analysis for different configurations is completed. Results show that efficient NLOS detection improves positioning accuracy and that a base-station exclusion algorithm increases it further.
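The geometric core of OTDOA, solving hyperbolic range-difference equations in a least-squares sense, can be sketched as a Gauss-Newton iteration over invented base-station geometry and timing noise. The thesis's actual pipeline adds 5G PRS processing and the NLOS detection and base-station exclusion steps on top of this.

    % OTDOA position fix from time-difference-of-arrival measurements.
    c  = 299792458;                          % speed of light (m/s)
    bs = [0 0; 4000 0; 0 4000; 4000 4000];   % base stations (m)
    p  = [1300; 2100];                       % true train position (m)
    r  = sqrt(sum((bs - p').^2, 2));         % true ranges
    tdoa = (r(2:end) - r(1))/c + 3e-9*randn(3, 1);  % vs reference BS 1
    rd   = c * tdoa;                         % measured range differences

    x = [2000; 2000];                        % initial guess
    for it = 1:10                            % Gauss-Newton iterations
        d = sqrt(sum((bs - x').^2, 2));      % ranges from current guess
        h = d(2:end) - d(1);                 % modeled range differences
        u = (x' - bs) ./ d;                  % unit vectors BS -> guess
        J = u(2:end, :) - u(1, :);           % Jacobian of h w.r.t. x
        x = x + J \ (rd - h);                % least-squares update
    end
    fprintf('Estimate: (%.1f, %.1f) m, true: (%.1f, %.1f) m\n', x, p);

An NLOS link biases its range difference upward; detecting and excluding the affected base station before the solve is what the thesis's algorithms contribute.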
Abstract:
In this thesis, the study and simulation of two advanced sensorless speed control techniques for a surface PMSM are presented. The aim is to implement a sensorless control algorithm for a submarine auxiliary propulsion system. This experimental activity is the result of a collaboration with L3Harris Calzoni, a leading company in A&D naval handling systems for the military field. A Simulink model of the whole electric drive has been developed. Given the satisfactory simulation results, the sensorless control system has been implemented in C code for the STM32 environment. Finally, several tests on a real brushless machine have been carried out with the motor connected to a mechanical load to reproduce the real scenario of the final application. All the experimental results have been recorded through graphical interface software developed at Calzoni.
Abstract:
Protocols for the generation of dendritic cells (DCs) that use serum to supplement culture media lead to reactions caused by animal proteins and to disease transmission. Several types of serum-free media (SFM), based on good manufacturing practice (GMP), have recently been used and seem to be a viable option. The aim of this study was to evaluate the differentiation, maturation, and function of DCs from Acute Myeloid Leukemia (AML) patients generated in SFM and in medium supplemented with autologous serum (AS). DCs were analyzed for phenotype, viability, and functionality. The results showed that viable DCs could be generated under all the conditions tested. In patients, the X-VIVO 15 medium was more efficient than the other media tested in generating DCs that produce IL-12p70 (p=0.05). Moreover, the presence of AS led to a significant increase in IL-10 production by DCs compared with the CellGro (p=0.05) and X-VIVO 15 (p=0.05) media, in both patients and donors. We conclude that SFM was efficient in the production of DCs for immunotherapy in AML patients; however, the use of AS appears to interfere with the functional capacity of the generated DCs.
Abstract:
The goal of this cross-sectional observational study was to quantify pattern-shift visual evoked potentials (VEP) and the thickness and volume of retinal layers using optical coherence tomography (OCT) in a cohort of Parkinson's disease (PD) patients and age-matched controls. Forty-three PD patients and 38 controls were enrolled. All participants underwent a detailed neurological and ophthalmologic evaluation. Idiopathic PD cases were included; cases with glaucoma or increased intra-ocular pressure were excluded. Patients were assessed by VEP and high-resolution Fourier-domain OCT, which quantified the inner and outer thicknesses of the retinal layers. VEP latencies and the thicknesses of the retinal layers were the main outcome measures. The mean ages, with standard deviation (SD), of the PD patients and controls were 63.1 (7.5) and 62.4 (7.2) years, respectively. The patients were predominantly in the initial Hoehn-Yahr (HY) disease stages (34.8% in stage 1 or 1.5, and 55.8% in stage 2). The VEP latencies and the thicknesses as well as the volumes of the retinal inner and outer layers were similar between the groups. A negative correlation between retinal thickness and age was noted in both groups. The thickness of the retinal nerve fibre layer (RNFL) was 102.7 μm in PD patients vs. 104.2 μm in controls. The thicknesses of the retinal layers, the VEP, and the RNFL of PD patients were similar to those of the controls. Despite the use of a representative cohort of PD patients and high-resolution OCT, further studies are required to establish the validity of OCT and VEP measurements as anatomic and functional biomarkers for evaluating the retinal and visual pathways in PD patients.