854 results for "based inspection and conditional monitoring"


Relevance: 100.00%

Abstract:

The main aim of the research project "On the Contribution of Schools to Children's Overall Indoor Air Exposure" is to study associations between adverse health effects, namely allergy, asthma, and respiratory symptoms, and the indoor air pollutants to which children are exposed in primary schools and homes. Specifically, this investigation reports on the design of the study and the methods used for data collection within the research project, and discusses factors that need to be considered when designing such a study. Further, preliminary findings are presented concerning descriptors of selected characteristics in schools and homes, the study population, and the clinical examination. The research project was designed in two phases. In the first phase, 20 public primary schools were selected, and a detailed inspection and indoor air quality (IAQ) measurements were conducted, covering volatile organic compounds (VOCs), aldehydes, particulate matter (PM2.5, PM10), carbon dioxide (CO2), carbon monoxide (CO), bacteria, fungi, temperature, and relative humidity. A questionnaire survey of 1600 children aged 8-9 years was undertaken, and lung function, exhaled nitric oxide (eNO), and tear film stability tests were performed. The questionnaire focused on the children's health and on the environment in their schools and homes. One thousand and ninety-nine questionnaires were returned. In the second phase, a subsample of 68 children was enrolled for further studies, including a walk-through inspection with a checklist and an extensive set of IAQ measurements in their homes. The acquired data are relevant for assessing children's environmental exposures and health status.
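
The abstract itself reports no analysis code, but a minimal sketch of the kind of per-school descriptive summary such an IAQ campaign produces might look as follows; all measurement values, column layout, and guideline thresholds here are illustrative assumptions, not data or limits from the study.

```python
# Illustrative sketch: summarizing classroom IAQ measurements per school
# and flagging samples above commonly cited indicative guideline levels.
# All numbers and thresholds are assumptions for demonstration only.
import statistics

# Hypothetical measurements: (school_id, co2_ppm, pm25_ugm3)
measurements = [
    ("S01", 950, 12.0), ("S01", 1480, 18.5),
    ("S02", 760, 9.3),  ("S02", 1120, 25.1),
]

CO2_GUIDELINE_PPM = 1000    # assumed indicative threshold
PM25_GUIDELINE_UGM3 = 25    # assumed indicative threshold

by_school = {}
for school, co2, pm25 in measurements:
    by_school.setdefault(school, []).append((co2, pm25))

for school, rows in sorted(by_school.items()):
    co2_vals = [r[0] for r in rows]
    pm25_vals = [r[1] for r in rows]
    exceed = sum(1 for c, p in rows
                 if c > CO2_GUIDELINE_PPM or p > PM25_GUIDELINE_UGM3)
    print(f"{school}: mean CO2={statistics.mean(co2_vals):.0f} ppm, "
          f"mean PM2.5={statistics.mean(pm25_vals):.1f} ug/m3, "
          f"{exceed}/{len(rows)} samples above a guideline")
```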

Relevance: 100.00%

Abstract:

Humans have a remarkable ability to extract information from visual data acquired by sight. Through a learning process that starts at birth and continues throughout life, image interpretation becomes almost instinctive. At a glance, one can easily describe a scene with reasonable precision, naming its main components. Usually, this is done by extracting low-level features such as edges, shapes, and textures, and associating them with high-level meanings; in this way, a semantic description of the scene is produced. An example of this is the human capacity to recognize and describe other people's physical and behavioral characteristics, or biometrics. Soft biometrics also represent inherent characteristics of the human body and behavior, but they do not allow unique identification of a person. The field of computer vision aims to develop methods capable of performing visual interpretation with performance similar to that of humans. This thesis proposes computer vision methods that allow high-level information to be extracted from images in the form of soft biometrics. The problem is approached in two ways: with unsupervised and with supervised learning methods. The first seeks to group images via automatic feature-extraction learning, combining convolution techniques, evolutionary computation, and clustering; the images employed in this approach contain faces and people. The second approach employs convolutional neural networks, which can operate on raw images, learning both the feature extraction and the classification processes; here, images are classified according to gender and clothing, the latter divided into the upper and lower parts of the human body. The first approach, when tested on different image datasets, obtained an accuracy of approximately 80% for faces versus non-faces and 70% for people versus non-people. The second, tested on images and videos, obtained an accuracy of about 70% for gender, 80% for upper-body clothing, and 90% for lower-body clothing. The results of these case studies show that the proposed methods are promising, enabling automatic high-level annotation of images. This opens possibilities for applications in diverse areas, such as content-based image and video retrieval and automatic video surveillance, reducing human effort in the tasks of manual annotation and monitoring.
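
As an illustration of the supervised approach described above, a minimal convolutional network for a binary attribute such as gender could be sketched as follows in PyTorch; the architecture, input size, and class count are generic assumptions, not the networks actually used in the thesis.

```python
# Minimal sketch of a CNN classifier operating on raw images, of the
# kind described for gender/clothing classification. Illustrative only.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Two conv blocks learn the feature extraction directly from pixels.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 32x32 -> 16x16
        )
        # A linear head performs the classification.
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = SmallCNN(num_classes=2)        # e.g., a binary gender label
dummy = torch.randn(4, 3, 64, 64)      # batch of 4 RGB 64x64 images
print(model(dummy).shape)              # torch.Size([4, 2])
```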

Relevance: 100.00%

Abstract:

We present ideas about creating a next-generation Intrusion Detection System (IDS) based on the latest immunological theories. The central challenge in computer security is determining the difference between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats, in conjunction with ever larger IT systems, urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems (AIS): the Human Immune System (HIS) can detect and defend against harmful and previously unseen invaders, so can we not build a similar Intrusion Detection System for our computers? Presumably, such systems would then have the same beneficial properties as the HIS, such as error tolerance, adaptation, and self-monitoring. Current AIS have been successful on test systems, but the algorithms rely on self-nonself discrimination, as stipulated in classical immunology. However, immunologists are increasingly finding fault with traditional self-nonself thinking, and a new 'Danger Theory' (DT) is emerging. This theory suggests that the immune system reacts to threats based on the correlation of various (danger) signals, and it provides a method of 'grounding' the immune response, i.e. linking it directly to the attacker. Little is currently understood about the precise nature and correlation of these signals, and the theory is a topic of hot debate. The aim of this research is to investigate this correlation and to translate the DT into the realm of computer security, thereby creating AIS that are no longer limited by self-nonself discrimination. It should be noted that we do not intend to defend this controversial theory per se, although as a deliverable this project will add to the body of knowledge in the area; rather, we are interested in its merits for scaling up AIS applications by overcoming self-nonself discrimination problems.
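
To make the Danger Theory idea concrete, here is a toy sketch of a detector that raises an alert only when several danger signals from the same process correlate within a short window, thereby "grounding" the alert in that process; the signal names, weights, window length, and threshold are illustrative assumptions, not part of the research described.

```python
# Toy Danger Theory sketch: alert on correlated danger signals and
# attribute the alert to the emitting process. Illustrative values only.
from collections import defaultdict

WINDOW_S = 5.0     # correlation window (assumed)
THRESHOLD = 2.0    # combined-signal score needed for an alert (assumed)
WEIGHTS = {"cpu_spike": 0.8, "unexpected_outbound": 1.0, "file_churn": 0.7}

# Hypothetical event stream: (timestamp_s, process, signal)
events = [
    (1.0, "procA", "cpu_spike"),
    (2.5, "procA", "unexpected_outbound"),
    (3.0, "procA", "file_churn"),
    (9.0, "procB", "cpu_spike"),    # isolated signal: no alert
]

by_process = defaultdict(list)
for t, proc, sig in events:
    by_process[proc].append((t, sig))

for proc, sigs in by_process.items():
    sigs.sort()
    for t0, _ in sigs:
        window = [s for t, s in sigs if t0 <= t < t0 + WINDOW_S]
        score = sum(WEIGHTS.get(s, 0.0) for s in set(window))
        if score >= THRESHOLD:
            print(f"ALERT: danger signals {sorted(set(window))} "
                  f"correlated for {proc} (score {score:.1f})")
            break   # one grounded alert per process suffices here
```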

Relevance: 100.00%

Abstract:

Health monitoring has become widespread in recent years. Applications range from exercise, food intake, and weight tracking to specific scenarios such as monitoring people who suffer from chronic diseases. Increasingly, there is also a need to monitor the health of newborn babies and even fetuses. Congenital Heart Defects (CHDs) are the main cause of death among babies, and the causes of most of these defects remain unknown. Hence, there is a need to study what causes these anomalies; by monitoring the fetus daily, there is a better chance of identifying defects at earlier stages. By analyzing the collected data, doctors can find patterns and devise solutions, thus saving lives. In many countries, the most common fetal monitor is ultrasound, and its use is regulated. In Sweden, a normal pregnancy involves only one ultrasound scan during the entire pregnancy. There is no strong evidence that ultrasound harms the fetus, but many doctors suggest using it as little as possible. There is therefore demand for a non-ultrasound device that is as accurate, or better, at detecting the fetal heart rate (FHR) without harming the baby. The problems discussed in this thesis are how fetal health can be monitored accurately and non-invasively at home, and how a fetal health monitoring system for home use could be designed. The first part of the research investigates the technologies currently used in fetal monitoring, along with the techniques and parameters for monitoring the fetus. The second part is a qualitative study conducted in Sweden between April and May 2016. The data for the qualitative study were collected through interviews with 21 people: 10 mothers or mothers-to-be and 11 obstetricians, gynecologists, or midwives. The questions concerned the Swedish pregnancy protocol, the use of technology in medicine and in particular during pregnancy, and the use of an ECG-based monitoring device. The results show that there is still room for improvement in the algorithms for extracting the fetal ECG, and the survey was very helpful in understanding the need for a fetal home monitor. Parents are open to new technologies, especially if they do not affect the baby's growth. Doctors view ECG as a promising alternative to ultrasound; midwives, on the other hand, are satisfied with the current system. The remote monitoring feature is very desirable to everyone, should such a system be used in the future.
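
As one illustration of the kind of fetal ECG extraction algorithm the thesis refers to, the classical adaptive-noise-cancellation approach can be sketched as follows: an LMS filter cancels the maternal ECG (measured on a chest lead) from an abdominal lead, leaving a fetal residual. The synthetic signals and LMS settings below are assumptions for demonstration, not the methods evaluated in the thesis.

```python
# LMS adaptive noise cancellation sketch for fetal ECG extraction.
# Signals are crude synthetic stand-ins, not real recordings.
import numpy as np

rng = np.random.default_rng(0)
fs, seconds = 500, 4
t = np.arange(fs * seconds) / fs

def spike_train(rate_hz, width=0.02):
    """Crude ECG-like impulse train at a given beat rate."""
    sig = np.zeros_like(t)
    for beat in np.arange(0, seconds, 1.0 / rate_hz):
        sig += np.exp(-((t - beat) ** 2) / (2 * width ** 2))
    return sig

maternal = spike_train(1.2)                 # ~72 bpm
fetal = 0.3 * spike_train(2.3)              # ~138 bpm, much weaker
chest = maternal + 0.01 * rng.standard_normal(t.size)       # reference
abdomen = 0.8 * maternal + fetal + 0.01 * rng.standard_normal(t.size)

n_taps, mu = 8, 0.01
w = np.zeros(n_taps)
residual = np.zeros_like(abdomen)
for n in range(n_taps, t.size):
    x = chest[n - n_taps:n][::-1]           # reference window
    e = abdomen[n] - w @ x                  # error = fetal estimate
    w += 2 * mu * e * x                     # LMS weight update
    residual[n] = e

print("maternal power removed: "
      f"{1 - np.var(residual) / np.var(abdomen):.0%} (illustrative)")
```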

Relevance: 100.00%

Abstract:

Background: The Medical Education Partnership Initiative has helped to mitigate the digital divide in Africa. The aim of this study was to assess the level of access, attitudes, and training concerning meaningful use of electronic resources and evidence-based medicine (EBM) among medical students at an African medical school. Methods: The study involved medical students at the University of Zimbabwe College of Health Sciences, Harare. The needs assessment tool consisted of a 21-question, paper-based, voluntary, and anonymous survey. Results: A total of 61/67 students (91%) responded to the survey; 60% of them were third-year medical students. Among the respondents, 85% had access to digital medical resources, but 54% still preferred printed medical textbooks. Although 25% of respondents had received training in EBM, only 7% found it adequate. Ninety-eight percent of the participants had received no formal training in journal club presentation or analytical reading of medical literature, but 77% of them showed interest in learning these skills. Conclusion: Lack of training in EBM, journal club presentation, and analytical reading skills has limited the impact of upgraded technology on the level of knowledge. This impact can be boosted by developing a curriculum covering the skills necessary for using EBM.


Relevance: 100.00%

Abstract:

In this article we demonstrate why and how monitoring is essential for making improvements, using education and health in Portugal and their recent achievements as examples. Without knowing where we stand, we can never determine where to go. That is why monitoring is a mandatory input to planning, and why the PDCA (Plan-Do-Check-Act) cycle is so important.

Relevance: 100.00%

Abstract:

The project goal was to determine plant operations and maintenance workers' levels of exposure to mercury during routine and non-routine (i.e., turnaround and inspection) maintenance events in eight gas processing plants. The project team prepared sampling and analysis plans tailored to each plant's process design and scheduled maintenance events. Occupational exposure sampling and monitoring efforts focused on measuring the mercury vapor concentration in worker breathing-zone air during specific maintenance events, including pipe scraping, process filter replacement, and process vessel inspection. Similar exposure groups were identified, and worker breathing-zone and ambient air samples were collected and analyzed for total mercury. Occupational exposure measurement techniques included portable field monitoring instruments, standard passive and active monitoring methods, and an emerging passive absorption technology. Process sampling campaigns focused on inlet gas streams, mercury removal unit outlets, treated gas, acid gas, and sales gas. The results were used to identify process areas with an increased potential for mercury exposure during maintenance events. The sampling methods used for the determination of total mercury in gas-phase streams were based on US EPA Methods 30B, 1631, and 1669. The results of four six-week sampling campaigns were evaluated, and conclusions and recommendations were made. The author's role in this project included directing all field phases of the project and developing and implementing the sampling strategy. Additionally, the author participated in developing and implementing the Quality Assurance Project Plan, the Data Quality Objectives, and the identification of Similar Exposure Groups. All field-generated data were reviewed by the author, along with laboratory reports, in order to generate conclusions and recommendations.
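
The core arithmetic behind reporting breathing-zone results is a time-weighted average (TWA) compared against an occupational exposure limit; a minimal sketch follows, with sample values and a limit that are illustrative assumptions rather than project data (check the applicable OEL for mercury vapor in your jurisdiction before using any threshold).

```python
# 8-hour TWA from task-length breathing-zone samples, compared against
# an assumed exposure limit. All numbers are hypothetical.
samples = [        # (duration_h, concentration_ug_m3) per task
    (1.5, 12.0),   # process filter replacement
    (2.0, 30.0),   # vessel inspection entry
    (4.5, 2.0),    # other routine duties
]

total_h = sum(d for d, _ in samples)
twa = sum(d * c for d, c in samples) / 8.0   # 8-h TWA, unsampled time = 0
OEL_UG_M3 = 25.0                             # assumed limit for the example

print(f"sampled {total_h:.1f} h, 8-h TWA = {twa:.1f} ug/m3 "
      f"({twa / OEL_UG_M3:.0%} of the assumed OEL)")
```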

Relevance: 100.00%

Abstract:

By employing interpretive policy analysis, this thesis aims to assess, measure, and explain the policy capacity of government and non-government organizations involved in reclaiming Alberta's oil sands. Using this type of analysis to assess policy capacity is a novel approach to understanding reclamation policy; this research therefore makes a unique contribution to the literature on reclamation policy. The oil sands region in northeast Alberta, Canada is an area of interest for several reasons, primarily the vast reserves of bitumen and the environmental cost associated with developing this resource. An increase in global oil demand has created an incentive for industry to seek out and develop new reserves. Alberta's oil sands are among the largest remaining reserves in the world, and there is significant interest in increasing production in this region. Furthermore, tensions in several oil-exporting nations in the Middle East remain unresolved, which has garnered additional support for a supply-side solution to North American oil demand, a solution that relies on developing reserves in both the United States and Canada. These compounding factors have contributed to increased development in the oil sands of northeastern Alberta. In essence, a rapid expansion of oil sands operations is ongoing and is the source of significant disturbance across the region. This disturbance, and the promises of reclamation, is the subject of contentious debate among stakeholders and remains highly visible in the media. If oil sands operations are to retain their social license to operate, it is critical that reclamation efforts be effective. One concern expressed by non-governmental organizations (NGOs) is the current monitoring and enforcement of regulatory programs in the oil sands: Alberta's NGOs have suggested that the data made available to them originate from industrial sources and are generally unchecked by government. To discern the overall status of reclamation in the oil sands, this study explores several factors essential to policy capacity: work environment, training, employee attitudes, perceived capacity, policy tools, evidence-based work, and networking. Data were collected through key informant interviews with senior policy professionals in government and non-government agencies in Alberta. The agencies of interest in this research are the Canadian Association of Petroleum Producers (CAPP); Alberta Environment and Sustainable Resource Development (AESRD); the Alberta Energy Regulator (AER); the Cumulative Environmental Management Association (CEMA); the Alberta Environmental Monitoring, Evaluation, and Reporting Agency (AEMERA); and the Wood Buffalo Environmental Association (WBEA). The aim of this research is to explain how and why reclamation policy is conducted in Alberta's oil sands, illuminating government capacity, NGO capacity, and the interaction between these two agency types. Beyond answering the research questions, a further goal of this project is to show that interpretive analysis of policy capacity can be used to measure and predict policy effectiveness. The oil sands of Alberta are the focus of this project; future projects could, however, apply the approach to any government policy scenario that uses evidence-based approaches.

Relevance: 100.00%

Abstract:

Intracochlear trauma from the surgical insertion of bulky electrode arrays, and inadequate pitch perception, are areas of concern with current hand-assembled commercial cochlear implants. Parylene thin-film arrays with higher electrode densities and lower profiles are a potential solution, but they lack rigidity and hence depend on manually fabricated, permanently attached, bulky backing devices based on polyethylene terephthalate (PET) tubing. As a solution, we investigated a new backing device with two sub-systems. The first sub-system is a thin poly(lactic acid) (PLA) stiffener embedded in the parylene array. The second sub-system is an attaching and detaching mechanism, utilizing a poly(N-vinylpyrrolidone)-block-poly(d,l-lactide) (PVP-b-PDLLA) copolymer-based biodegradable and water-soluble adhesive, which allows the PET insertion tool to be retracted after implantation. As a proof of concept of sub-system one, a microfabrication process for patterning PLA stiffeners embedded in parylene was developed. Conventional hot embossing, mechanical micromachining, and standard cleanroom processes were integrated to pattern fully released, discrete stiffeners coated with parylene. The released embedded stiffeners were thermoformed to demonstrate that imparting perimodiolar shapes to stiffener-embedded arrays is possible. The developed process, when integrated with the array fabrication process, will allow fabrication of stiffener-embedded arrays in a single process. As a proof of concept of sub-system two, the feasibility of the attaching and detaching mechanism was demonstrated by adhering 1x- and 1.5x-scale PET tube-based insertion tools to PLA stiffeners embedded in parylene using the copolymer adhesive. The attached devices survived qualitative adhesion tests, thermoforming, and flexing. The viability of the detaching mechanism was tested by aging the assemblies in vitro in phosphate buffer solution. The average detachment times, 2.6 minutes and 10 minutes for the 1x- and 1.5x-scale devices respectively, were found to be clinically relevant with respect to reported array insertion times during surgical implantation. Eventually, stiffener-embedded arrays would no longer need to be permanently attached to current insertion tools, which are left behind after implantation and congest the scala tympani chamber of the cochlea. Finally, a simulation-based approach for accelerated failure analysis of the PLA stiffeners and characterization of the PVP-b-PDLLA copolymer adhesive was explored. The residual functional life of embedded PLA stiffeners exposed to body fluid, and thereby subjected to degradation and erosion, was estimated by simulating PLA stiffeners with different parylene-coating failure types and, for a given failure type, different PLA types. To characterize the PVP-b-PDLLA copolymer adhesive, several formulations were simulated and compared based on the insertion tool detachment times predicted from the dissolution, degradation, and erosion behavior of the simulated formulations. The results indicate that these simulation-based approaches could reduce the number of time-consuming and expensive in-vitro tests that must be conducted.
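
As a rough illustration of the simulation-based characterization described above, a first-order dissolution model can map a rate constant to a predicted detachment time; the rate constants below were chosen purely to reproduce the reported average detachment times and are illustrative assumptions, not fitted adhesive parameters.

```python
# Toy dissolution model: the adhesive layer loses mass first-order,
# m(t) = m0 * exp(-k t); detachment occurs when the remaining fraction
# falls below the level assumed to hold the insertion tool.
import math

def detachment_time_min(k_per_min: float, detach_fraction: float) -> float:
    """Solve m(t)/m0 = detach_fraction for t under first-order decay."""
    return -math.log(detach_fraction) / k_per_min

# k values picked to roughly reproduce the reported 2.6 and 10 min averages.
for scale, k in [("1x device", 0.89), ("1.5x device", 0.23)]:
    t = detachment_time_min(k_per_min=k, detach_fraction=0.1)
    print(f"{scale}: predicted detachment after {t:.1f} min")
```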

Relevance: 100.00%

Abstract:

Low-frequency electromagnetic compatibility (EMC) is an increasingly important aspect in the design of practical systems, needed to ensure the functional safety and reliability of complex products. The opportunities for using numerical techniques to predict and analyze a system's EMC are therefore of considerable interest in many industries. As the first phase of the study, a proper model, including all details of the component, was required; therefore, advances in EMC modeling were reviewed, classifying analytical and numerical models. The selected approach was finite element (FE) modeling coupled with the distributed network method, used to model the converter's components and obtain the frequency-domain behavioral model of the converter. The method can reveal the behavior of parasitic elements and higher resonances, which have a critical impact on electromagnetic interference (EMI) studies. For the EMC and signature studies of machine drives, equivalent-source modeling was investigated. Considering the details of the multi-machine environment, including actual models, innovations in equivalent-source modeling were introduced that decrease the simulation time dramatically. Several models were designed in this study; the voltage-current cube model and the wire model gave the best results. A genetic-algorithm-based particle swarm optimization (GA-based PSO) method was used as the optimization process. Superposition and suppression of the fields when coupling the components were also studied and verified. The simulation time of the equivalent model is 80-100 times lower than that of the detailed model. All tests were verified experimentally. As an application of the EMC and signature study, fault diagnosis and condition monitoring of an induction motor drive were developed using radiated fields. In addition to experimental tests, 3-D FE analysis was coupled with circuit-based software to implement incipient fault cases. Identification was implemented using an artificial neural network (ANN) for seventy different fault cases, and the simulation results were verified experimentally. Finally, identification of the types of power components was implemented. The results show that it is possible to identify the type of component, as well as the faulty component, by comparing the amplitudes of their stray-field harmonics. Identification using stray fields is nondestructive and can be used for setups that cannot go offline and be dismantled.
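
For readers unfamiliar with the optimization step, a minimal particle swarm optimization loop of the kind referred to above looks like the following sketch; the toy objective stands in for the field-mismatch cost, and the swarm settings are generic textbook values, not the GA-tuned parameters of the study.

```python
# Minimal PSO sketch: minimize a toy cost standing in for the mismatch
# between measured fields and an equivalent-source model's prediction.
import numpy as np

rng = np.random.default_rng(1)

def cost(x: np.ndarray) -> np.ndarray:
    """Toy objective: squared distance to an assumed optimum."""
    return np.sum((x - 0.5) ** 2, axis=1)

n_particles, dim, iters = 30, 4, 100
w, c1, c2 = 0.7, 1.5, 1.5                 # inertia, cognitive, social

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), cost(pos)
g = pbest[np.argmin(pbest_cost)]          # global best position

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
    pos += vel
    c = cost(pos)
    improved = c < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
    g = pbest[np.argmin(pbest_cost)]

print("best cost:", pbest_cost.min())     # should approach 0
```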

Relevance: 100.00%

Abstract:

Axle bearing damage, with possible catastrophic failures, can cause severe disruptions or even dangerous derailments, potentially causing loss of human life and leading to significant costs for railway infrastructure managers and rolling stock operators. The axle bearing damage process consequently has safety and economic implications for the exploitation of railway systems, and it has therefore been the object of intense attention by railway authorities, as evidenced by the selection of this topic by the European Commission in calls for research proposals. The EU-funded MAXBE project (interoperable monitoring, diagnosis and maintenance strategies for axle bearings; http://www.maxbeproject.eu/) appears in this context: its main goal is to develop and demonstrate innovative and efficient technologies for the onboard and wayside condition monitoring of axle bearings. The project focuses on detecting axle bearing failure modes at an early stage, by combining new and existing monitoring techniques, and on characterizing the axle bearing degradation process. The consortium comprises 18 partners from 8 member states, representing operators, railway administrations, axle bearing manufacturers, key players in the railway community, and experts in the field of monitoring, maintenance, and rolling stock. The University of Porto coordinated the project, which kicked off in November 2012 and was completed in October 2015. Both onboard and wayside systems are explored in the project, since the requirements for the onboard equipment and the range of working temperatures of the axle bearing for the wayside systems need to be defined. The developed monitoring systems employ strain gauges, high-frequency accelerometers, temperature sensors, and acoustic emission. To obtain a robust technology supporting the decision making of the responsible stakeholders, synchronized measurements from the onboard and wayside monitoring systems are integrated into a single platform. Extensive laboratory tests were also performed to correlate the in-situ measurements with the status of the axle bearing life. With the MAXBE project concept it will be possible to detect axle bearing failures at an early stage; to create conditions for the operational and technical integration of axle bearing monitoring and maintenance in different European railway networks; and to contribute to the standardization of requirements for axle bearing monitoring, diagnosis, and maintenance. The developed condition monitoring systems were demonstrated in Portugal on the Northern Railway Line, which carries freight and passenger traffic at speeds of up to 220 km/h, on a tram line in Belgium, and in the UK. Within the project, a tool for optimal maintenance scheduling and a smart diagnostic tool were also developed. This paper presents a synthesis of the most relevant results attained in the project. The success of the project and the developed solutions have a positive impact on the reliability, availability, maintainability, and safety of rolling stock and infrastructure, with the main focus on axle bearing health.
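
A minimal sketch of the kind of indicator a wayside accelerometer system can compute is shown below: the RMS level of a vibration record compared against a baseline alarm level. The synthetic signal, defect rate, and threshold are illustrative assumptions, not MAXBE data or limits.

```python
# Simple vibration-based bearing indicator: RMS of a high-frequency
# accelerometer record versus an assumed healthy-fleet alarm level.
import numpy as np

rng = np.random.default_rng(2)
fs, seconds = 10_000, 1.0
t = np.arange(int(fs * seconds)) / fs

def bearing_signal(fault_amp: float) -> np.ndarray:
    """Broadband noise plus periodic impacts at an assumed defect rate."""
    sig = 0.2 * rng.standard_normal(t.size)
    impacts = (np.sin(2 * np.pi * 87.0 * t) > 0.999).astype(float)
    return sig + fault_amp * impacts

ALARM_RMS = 0.30    # assumed alarm level from a healthy baseline

for label, amp in [("healthy", 0.0), ("spalled race", 3.0)]:
    x = bearing_signal(amp)
    rms = np.sqrt(np.mean(x ** 2))
    status = "ALARM" if rms > ALARM_RMS else "ok"
    print(f"{label}: RMS = {rms:.2f} g -> {status}")
```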

Relevance: 100.00%

Abstract:

Background: This study aimed to describe the developmental trajectories of registered nurses' capability beliefs during their first 3 years of practice. The focus was on three core competencies for health professionals: patient-centered care, teamwork, and evidence-based practice. Methods: A national cohort of registered nurses (n = 1,205) was recruited during their nursing education and subsequently surveyed yearly during their first 3 years of working life. The survey included 16 items on capability beliefs, divided into three subscales assessing patient-centered care, teamwork, and evidence-based practice, and the data were analyzed with linear latent growth modeling. Results: The nurses' capability beliefs for patient-centered care increased over the first 3 years of working life, their capability beliefs for evidence-based practice were stable over the 3 years, and their capability beliefs for teamwork showed a downward trend. Linking evidence to action: Through collaboration between nursing education and clinical practice, the transition to working life could be supported and competence development in newly graduated nurses could be enhanced to help them master the core competencies. Future research should focus on determining which factors impact the development of capability beliefs in new nurses and how these factors can be developed, by testing interventions.
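
A linear latent growth model can be approximated as a mixed-effects model with random intercepts and slopes over time; the sketch below fits such a model to synthetic data (not the cohort's) with statsmodels, rather than the SEM software typically used for latent growth modeling.

```python
# Random-intercept/random-slope growth model on synthetic longitudinal
# data, mimicking yearly capability-belief scores. Illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_nurses, years = 200, [0, 1, 2]

rows = []
for i in range(n_nurses):
    intercept = 4.0 + 0.5 * rng.standard_normal()   # person-level start
    slope = -0.15 + 0.1 * rng.standard_normal()     # downward trend
    for yr in years:
        rows.append({"nurse": i, "year": yr,
                     "belief": intercept + slope * yr
                               + 0.3 * rng.standard_normal()})
df = pd.DataFrame(rows)

# Random intercept and random slope for year, per nurse.
model = smf.mixedlm("belief ~ year", df, groups=df["nurse"],
                    re_formula="~year")
fit = model.fit()
print(fit.params[["Intercept", "year"]])  # mean start level, yearly change
```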

Relevance: 100.00%

Abstract:

In Europe, concern about the status of marine ecosystems has increased, and the Marine Strategy Framework Directive has as its main goal the achievement of Good Environmental Status (GES) of EU marine waters by 2020. Molecular tools are seen as promising, emerging approaches for improving ecosystem monitoring and have led ecology into a new era, representing perhaps the greatest source of innovation in marine monitoring techniques. Benthic nematodes are considered ideal organisms for use as biological indicators of natural and anthropogenic disturbance in aquatic ecosystems, underpinning monitoring programmes on the ecological quality of marine ecosystems and thus very useful for assessing the GES of the marine environment. Directed terminal restriction fragment length polymorphism (dT-RFLP) allows the diversity of nematode communities to be assessed and also supports the study of ecosystem functioning; combined with relative real-time PCR (qPCR), it provides a high-throughput, semi-quantitative characterization of nematode communities. These characteristics make the two molecular tools good descriptors for assessing good environmental status. The main aim of this study is to develop and optimize dT-RFLP and qPCR for the Mira estuary (SW coast, Portugal). A molecular phylogenetic analysis of marine and estuarine nematodes is being performed, combining morphological and molecular analyses to evaluate the diversity of free-living marine nematodes in the Mira estuary. After morphological identification, barcodes of the 18S rDNA and COI genes are being determined for each morphologically identified nematode species. So far we have generated 40 new sequences belonging to 32 different genera and 17 families, and the study has shown a good degree of concordance between traditional morphology-based identification and DNA sequences. These results will improve the assessment of marine nematode diversity and contribute to a more robust nematode taxonomy. The DNA sequences are being used to develop a dT-RFLP assay able to process large numbers of samples (hundreds to thousands) easily, rather than the low throughput typical of classical taxonomic or molecular analyses. A preliminary study showed that the digest enzymes used in dT-RFLP for terrestrial assemblages separated marine nematodes poorly at the taxonomic level needed for functional group analysis. A new digest combination was therefore designed using the software tool DRAT (Directed Terminal Restriction Analysis Tool) to distinguish marine nematode taxa. Several solutions were provided by DRAT and tested empirically to select the one that cuts most efficiently; a combination of three enzymes in a single digest proved the best solution for separating the different clusters. In parallel, another tool is being developed to estimate population size (qPCR): an improved qPCR estimation of gene copy number using an artificial reference is being developed for marine nematode communities to quantify abundance. Once developed, both methodologies will be validated by determining the spatial and temporal variability of benthic nematode assemblages across different environments. The application of these high-throughput molecular approaches to benthic nematodes will improve sample throughput and make their implementation as indicators of the ecological status of marine ecosystems more efficient and faster.
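
The enzyme-selection step that DRAT automates amounts to an in-silico digest: locating the first recognition site of a candidate enzyme in each sequence and recording the terminal fragment length, so that taxa resolve to distinct fragment sizes. A minimal sketch follows; the sequences are hypothetical toy strings, and only the HhaI-like recognition site GCGC is a real enzyme motif.

```python
# In-silico terminal restriction digest sketch: predict the terminal
# fragment length for each (toy) sequence under one candidate enzyme.
from typing import Optional

SITE = "GCGC"   # candidate enzyme recognition site (HhaI-like)

toy_sequences = {   # hypothetical, truncated "18S" fragments
    "Taxon_A": "ATGGCGCTTACGGATCCATTGCGCAA",
    "Taxon_B": "ATGCATTTGGAGCGCTTACAGG",
    "Taxon_C": "TTTTTTTTTTTTTTTTTTTTTT",      # no site: uncut
}

def terminal_fragment(seq: str, site: str) -> Optional[int]:
    """Length from the labeled 5' end through the first cut site, if any."""
    i = seq.find(site)
    return None if i == -1 else i + len(site)

for name, seq in toy_sequences.items():
    frag = terminal_fragment(seq, SITE)
    print(f"{name}: {'uncut' if frag is None else f'{frag} bp'}")
```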

Relevance: 100.00%

Abstract:

The functional and structural performance of a 5 cm synthetic small-diameter vascular graft (SDVG), produced by the copolymerization of polyvinyl alcohol hydrogel with low-molecular-weight dextran (PVA/Dx graft) and associated with mesenchymal stem cell (MSC)-based therapies and anticoagulant treatment with heparin, clopidogrel, and warfarin, was tested in the ovine model over a 24-week healing period. The results were compared with those obtained with standard expanded polytetrafluoroethylene grafts (ePTFE grafts). Blood flow, vessel and graft diameter measurements, graft appearance and patency rate (PR), and thrombus, stenosis, and collateral vessel formation were evaluated by B-mode ultrasound and audio and color-flow Doppler. The morphology of the grafts and regenerated vessels was evaluated by scanning electron microscopy (SEM) and histopathological and immunohistochemical analysis. All PVA/Dx grafts maintained a similar or higher PR, and their systolic/diastolic laminar blood flow velocities were similar to those of ePTFE grafts. CD14 (macrophage) and α-actin (smooth muscle) staining gave similar results in the PVA/Dx/MSC and ePTFE graft groups. The fibrosis layer was thinner, and endothelial cells were detected only at the graft-artery transitions where the MSCs had been added. In conclusion, the PVA/Dx graft is an excellent scaffold candidate for vascular reconstruction, including mechanically challenging clinical applications such as SDVGs, especially when associated with MSC-based therapies, which promote greater endothelialization and less fibrosis of the vascular prosthesis, as well as higher PR values.