927 results for Weak and Strong Solutions


Relevance:

100.00%

Publisher:

Abstract:

The phenomenon of terrorism is one of the most asymmetrical, amorphous and hybrid threats to international security. At the beginning of the 21st century, terrorism grew to pandemic proportions. Ensuring the freedom and security of individuals and nations has become a priority postulate. Terrorism defies all legal and analytic-descriptive standards. An immanent feature of terrorism is its constant mutation into ever more malicious forms of violence. One of the most alarming changes is a tendency to debase the essence of law, the state and human rights. Ensuring safety in widely accessible public places and in private life forces the creation of various institutions, methods and forms of monitoring people. However, civil liberties cannot be limited arbitrarily. The present article stresses that a rational and informed approach to human rights should serve as a reference point for legislative and executive bodies. Selected individual applications to the European Court of Human Rights are presented, focusing on those from which standards for the protection of human rights in the face of pathological social phenomena, terrorism in particular, can be reconstructed and refined. Strasbourg standards may prove helpful in selecting and constructing new legal and legislative solutions and in unifying and correlating prophylactic and preventive actions.

Relevance:

100.00%

Publisher:

Abstract:

Magnetic nanoparticles (MNPs) are known for the unique properties conferred by their small size and have found wide application in food safety analyses. However, their high surface energy and strong magnetization often lead to aggregation, compromising their functions. In this study, iron oxide magnetic particles (MPs) spanning the nano to micro size range were synthesized, from which particles with less aggregation and excellent magnetic properties were obtained. MPs were synthesized via three different hydrothermal procedures, using poly(acrylic acid) (PAA) of different molecular weights (Mw) as the stabilizer. The particle size, morphology, and magnetic properties of the MPs from these synthesis procedures were characterized and compared. Among the three syntheses, one-step hydrothermal synthesis demonstrated the highest yield and the most efficient magnetic collection of the resulting PAA-coated magnetic microparticles (PAA-MMPs, >100 nm). The iron oxide content of these PAA-MMPs was around 90%, and the saturation magnetization ranged from 70.3 emu/g down to 57.0 emu/g, depending on the Mw of the PAA used. In this approach, the particles prepared using PAA with an Mw of 100K g/mol exhibited superparamagnetic behavior, with ~65% lower coercivity and remanence compared to the others. They were therefore less susceptible to aggregation and remained remarkably water-dispersible even after one month of storage. Three applications of the PAA-MMPs from one-step hydrothermal synthesis were explored: immobilization of food proteins and enzymes, antibody conjugation for pathogen capture, and magnetic hydrogel film fabrication. These studies demonstrated their versatile functions as well as their potential applications in the food science area.

Relevance:

100.00%

Publisher:

Abstract:

Various tools exist for monitoring the concentration of pollutants in aquatic ecosystems; today such studies are based on biological monitoring and biomarkers. The aim of this study was to measure the levels of acetylcholinesterase (AChE), glutathione S-transferase (GST) and catalase as biomarkers of heavy metal contamination in the pearl oyster Pinctada radiata, and to examine their mechanisms in aquatic ecosystems. The heavy metals lead, cadmium and nickel were measured in soft tissue at the studied stations over four seasons. Samples were collected seasonally by scuba diving at the Lavan, Hendurabi and Nakhilo stations (in the northern Persian Gulf) from spring 2013 to winter of the same year. The pearl oysters were sorted by shell size; shells were separated from the soft tissues, which were transferred to the laboratory for analysis of heavy metals and enzymes. The MOOPAM standard method was used for measuring heavy metal concentrations; tissue glutathione S-transferase was analyzed by the method recommended by Habig et al. (1974), acetylcholinesterase by the Ellman method, and catalase in the supernatant obtained from homogenized soft tissue by the method of Aebi (1974). The results showed that lead concentrations differed significantly among stations: the concentration of lead at Lavan was significantly higher than at the other two stations. This could be due to the movement of tankers, boats and floating refueling stations, which release considerable amounts of oil-containing wastewater and petroleum into the water; precipitation, industrial discharges and land-disposed sewage sludge, which carries large concentrations of lead, are also increasing lead levels in the region.
Comparison of the results of this study with the relevant standards and with similar studies at the regional and international level showed that the heavy metal concentrations were in all cases significantly below all associated standards and guide values, and far lower than results reported elsewhere in the world. Given that nickel is one of the indicators of oil pollution in the study area, these low values suggest that oil emissions have been relatively low. Acetylcholinesterase concentrations showed no significant differences among stations, between large and small size classes, or across seasons. Variations of catalase and glutathione S-transferase were broadly similar to each other, and station and season had significant effects on the concentrations of these enzymes. The effects and interactions among the various parameters indicate that seasonal changes in antioxidant enzymes are related (assuming constant salinity and oxygen) to age, reproductive cycle, food availability and water temperature. Antioxidant enzymes increased with rising temperature in the warm season; higher temperature and abundant food in the environment may both increase antioxidant enzyme levels. Elevated enzyme concentrations may indicate that higher enzyme levels are required to eliminate ROS and maintain a healthy state. Catalase activity increases at the time of gonad maturation and during the spawning season, and this study likewise found catalase to be significantly higher in the warm season. Owing to the low heavy metal pollution in the study area, contaminant levels in shellfish tissue were below international standards, and no strong correlation was observed between heavy metal contamination in pearl oyster tissue and enzyme levels.
Therefore, we can say that the pearl oyster remains in a healthy condition and its enzyme levels are normal.

Relevance:

100.00%

Publisher:

Abstract:

Accurate estimation of road pavement geometry and layer material properties through the use of proper nondestructive testing and sensor technologies is essential for evaluating a pavement's structural condition and determining options for maintenance and rehabilitation. For these purposes, the pavement deflection basins produced by the nondestructive Falling Weight Deflectometer (FWD) test are commonly used. The FWD test drops weights on the pavement to simulate traffic loads and measures the resulting deflection basins. Backcalculation of pavement geometry and layer properties from FWD deflections is a difficult inverse problem, and solving it with conventional mathematical methods is often challenging due to its ill-posed nature. In this dissertation, a hybrid algorithm was developed to seek robust and fast solutions to this inverse problem. The algorithm is based on soft computing techniques, mainly Artificial Neural Networks (ANNs) and Genetic Algorithms (GAs), together with numerical analysis techniques to properly simulate the geomechanical system. A widely used layered pavement analysis program, ILLI-PAVE, was employed in the analyses of flexible pavements of various types, including full-depth asphalt and conventional flexible pavements built on either lime-stabilized soils or untreated subgrade. Nonlinear properties of the subgrade soil and the base course aggregate as transportation geomaterials were also considered. A computer program, the Soft Computing Based System Identifier (SOFTSYS), was developed. In SOFTSYS, ANNs are used as surrogate models to provide faster approximations of the nonlinear finite element program ILLI-PAVE. The deflections obtained from FWD tests in the field were matched with the predictions obtained from the numerical simulations to develop SOFTSYS models. 
The solution to the inverse problem for multi-layered pavements is computationally hard to achieve and is often not feasible due to field variability and the quality of the collected data. The primary difficulty in the analysis arises from the substantial increase in the degree of non-uniqueness of the mapping from the pavement layer parameters to the FWD deflections. The insensitivity of some layer properties lowered SOFTSYS model performance. Still, SOFTSYS models were shown to work effectively with the synthetic data obtained from ILLI-PAVE finite element solutions. In general, SOFTSYS solutions very closely matched the ILLI-PAVE mechanistic pavement analysis results. For SOFTSYS validation, field-collected FWD data were successfully used to predict pavement layer thicknesses and layer moduli of in-service flexible pavements. Some of the most promising SOFTSYS results indicated average absolute errors on the order of 2%, 7%, and 4% for Hot Mix Asphalt (HMA) thickness estimation of full-depth asphalt pavements, full-depth pavements on lime-stabilized soils, and conventional flexible pavements, respectively. The field validations of SOFTSYS also produced meaningful results: the thickness data obtained from Ground Penetrating Radar testing matched reasonably well with predictions from SOFTSYS models. The differences observed in the HMA and lime-stabilized soil layer thicknesses were attributed to deflection data variability from FWD tests. The backcalculated asphalt concrete layer thickness results matched better for full-depth asphalt pavements built on lime-stabilized soils than for conventional flexible pavements. Overall, SOFTSYS was capable of producing reliable thickness estimates despite the variability of field-constructed asphalt layer thicknesses.
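
The surrogate-and-search loop described above can be sketched in miniature. Everything below is a hypothetical stand-in (not ILLI-PAVE or SOFTSYS): a toy forward model maps a layer modulus and thickness to a four-sensor deflection basin, and a simple genetic algorithm searches for the parameters whose simulated basin matches a "measured" one; in SOFTSYS, a trained ANN surrogate would replace the forward call.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_deflections(modulus, thickness, offsets=np.array([0.0, 0.3, 0.6, 0.9])):
    """Toy stand-in for the forward pavement model: maps a layer modulus (MPa)
    and thickness (m) to a 4-sensor deflection basin (mm). Purely illustrative."""
    return 50.0 / (modulus * (thickness + offsets + 0.1))

target = forward_deflections(300.0, 0.25)   # the "measured" FWD basin

def fitness(pop):
    """Mean squared mismatch between simulated and measured basins."""
    return np.array([np.mean((forward_deflections(m, h) - target) ** 2) for m, h in pop])

# simple real-coded genetic algorithm over (modulus, thickness)
pop = np.column_stack([rng.uniform(50, 1000, 80), rng.uniform(0.05, 0.6, 80)])
for _ in range(150):
    order = np.argsort(fitness(pop))
    parents = pop[order[:20]]                                          # truncation selection
    idx = rng.integers(0, 20, size=(80, 2))
    w = rng.random((80, 1))
    children = w * parents[idx[:, 0]] + (1 - w) * parents[idx[:, 1]]   # blend crossover
    children += rng.normal(0.0, [5.0, 0.005], size=children.shape)     # mutation
    children[:, 0] = children[:, 0].clip(50, 1000)
    children[:, 1] = children[:, 1].clip(0.05, 0.6)
    children[0] = parents[0]                                           # elitism: keep the best
    pop = children

best = pop[np.argmin(fitness(pop))]
print(best)   # should land near the true (300.0, 0.25)
```

On this two-parameter toy the basin determines the parameters essentially uniquely; the non-uniqueness discussed in the abstract is exactly what makes the real, multi-layered problem so much harder.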

Relevance:

100.00%

Publisher:

Abstract:

This thesis identifies and defines the new African sovereignty. It establishes a modern sovereignty in Africa hatched from the changing nature of sovereignty in which countries come together at various levels or grades of partial surrender of national sovereignty in order to work closer together for their mutual advantage and benefit. To this end, the narrative zooms in on the central issues within the realms of money matters whereby a new model of monetary sovereignty and monetary solutions is designed in an attempt to ease the recurring tensions and challenges of modern national sovereignty in the continent of Africa. As such, this discussion will offer a historical journey through the constitution of sovereignty, to the birth of the nation state and international public law. It develops the theory of the changing nature of sovereignty within the modern state and opens new lines of inquiry for Africa. In this regard, it draws from juxtaposing and mixing elements of regional and global financial integration as well as retaining national financial sovereignty features to form this new design which I dub continental sovereignty. At its core, the thesis will deal with the legal aspects that stem from the co-mingling of legal systems of nation states and communities at the regional and global levels within the context of financial integration. The argument is that the rule of law remains sacrosanct in monetary management. Effective financial integration is the result of properly structured and managed legal frameworks with robust laws and institutions whether at a national, regional or global level. However, the thesis reveals that in order to avoid undermining the progress of Africa’s financial integration project, any solution for Africa must be immersed within a broader global solution where development issues are addressed and resolved and Africa can form a more central part in all relevant international discussion fora. 
The work will expound these issues by applying them within a regional and global context, with the state of affairs in Africa forming the nucleus. This application consequently presents the six key themes of the thesis which will be considered therein. They are: a.) regional advantage: which exploits the possibilities of deeper and further financial integration between smaller communal arrangements; b.) regional risk and exposure: the extent to which this deeper form of financial integration can spiral out of control if effected too quickly and too ambitiously; c.) global advantage: which considers the merits of global financial integration and the influence exerted by financial laws on the global financial architecture; d.) global risk and exposure: which considers the challenges of global financial integration, especially against the background of the Global Financial Crisis of 2007-2008; e.) the African challenge: which considers the extent to which this analysis impacts the African economic and financial integration agenda; and f.) the development challenge: which examines the extent to which global development issues impact the African solution (continental sovereignty) and the need for any solution for the continent to be roped into a broader global solution within which Africa can form an important part. Even though the thesis adopts an optimistic undertone about the progress made so far, it unearths the African problem of multiple national sovereignties and multiple overlapping regional sovereignties, characterised as the 'spaghetti bowl' dilemma. As such, the unique contribution to knowledge on financial integration in Africa can be echoed in these words: Africa's financial integration agenda has had little success in authenticating a systematic and dependable legal framework for monetary management. Efforts made have been incomplete, substandard, and not carefully followed through, as reflected particularly in the impuissant nature of the judicial enforcement mechanisms. 
Thus, the thesis argues that any meaningful answer to the problems dogging the continent is, inter alia, deeply entrenched within a new form of cooperative monetary sovereignty. In other words, the thesis does not prescribe the creation of new laws; rather, it advocates the effective enforcement of existing laws.

Relevance:

100.00%

Publisher:

Abstract:

This thesis proposes a generic visual perception architecture for robotic clothes perception and manipulation. The proposed architecture is fully integrated with a stereo vision system and a dual-arm robot and is able to perform a number of autonomous laundering tasks. Clothes perception and manipulation is a novel research topic in robotics and has experienced rapid development in recent years. Compared to the task of perceiving and manipulating rigid objects, clothes perception and manipulation poses a greater challenge. This can be attributed to two reasons: firstly, deformable clothing requires precise (high-acuity) visual perception and dexterous manipulation; secondly, as clothing approximates a non-rigid 2-manifold in 3-space that can adopt a quasi-infinite configuration space, the potential variability in the appearance of clothing items makes them difficult for a machine to understand, identify uniquely, and interact with. From an applications perspective, and as part of the EU CloPeMa project, the integrated visual perception architecture refines a pre-existing clothing manipulation pipeline by completing pre-wash clothes (category) sorting (using single-shot or interactive perception for garment categorisation and manipulation) and post-wash dual-arm flattening. To the best of the author's knowledge, the autonomous clothing perception and manipulation solutions investigated in this thesis were first proposed and reported by the author. All of the robot demonstrations reported in this work follow a perception-manipulation methodology in which visual and tactile feedback (in the form of surface wrinkledness captured by the high-accuracy depth sensor, i.e. the CloPeMa stereo head, or the predictive confidence modelled by Gaussian Processes) serves as the halting criterion in the flattening and sorting tasks, respectively. 
From a scientific perspective, the proposed visual perception architecture addresses the above challenges by parsing and grouping 3D clothing configurations hierarchically from low-level curvatures, through mid-level surface shape representations (providing topological descriptions and 3D texture representations), to high-level semantic structures and statistical descriptions. A range of visual features such as the Shape Index, Surface Topologies Analysis and Local Binary Patterns have been adapted within this work to parse clothing surfaces and textures, and several novel features have been devised, including B-Spline Patches with Locality-Constrained Linear coding and the Topology Spatial Distance, to describe and quantify generic landmarks (wrinkles and folds). The essence of the proposed architecture is 3D generic surface parsing and interpretation, which is critical to underpinning a number of laundering tasks and has the potential to be extended to other rigid and non-rigid object perception and manipulation tasks. The experimental results presented in this thesis demonstrate that: firstly, the proposed grasping approach achieves on-average 84.7% accuracy; secondly, the proposed flattening approach is able to flatten towels, t-shirts and pants (shorts) within 9 iterations on average; thirdly, the proposed clothes recognition pipeline can recognise clothes categories from highly wrinkled configurations and advances the state of the art by 36% in terms of classification accuracy, achieving an 83.2% true-positive classification rate when discriminating between five categories of clothes; finally, the Gaussian Process based interactive perception approach exhibits a substantial improvement over single-shot perception. Accordingly, this thesis has advanced the state of the art of robot clothes perception and manipulation.
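
Of the visual features listed above, the Shape Index (Koenderink's descriptor) has a compact closed form. A minimal sketch, assuming a depth map on a regular grid and plain finite differences (a real pipeline such as the one in the thesis would first smooth the sensor data):

```python
import numpy as np

def shape_index(z, spacing=1.0):
    """Koenderink shape index of a depth map z(x, y), with values in [-1, 1].

    Under this graph/sign convention, bowls map to +1, domes to -1 and
    saddles to 0. Finite differences only; smooth real depth data first.
    """
    zy, zx = np.gradient(z, spacing)
    zyy, zyx = np.gradient(zy, spacing)
    zxy, zxx = np.gradient(zx, spacing)
    g = 1.0 + zx**2 + zy**2
    # mean (H) and Gaussian (K) curvature of the graph z = f(x, y)
    H = ((1 + zx**2) * zyy - 2 * zx * zy * zxy + (1 + zy**2) * zxx) / (2 * g**1.5)
    K = (zxx * zyy - zxy * zyx) / g**2
    disc = np.sqrt(np.maximum(H**2 - K, 0.0))    # guard tiny negative discriminants
    k1, k2 = H + disc, H - disc                  # principal curvatures, k1 >= k2
    # arctan2 keeps umbilic points (k1 == k2) well defined
    return (2.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)

x, y = np.meshgrid(np.linspace(-1, 1, 41), np.linspace(-1, 1, 41))
S = shape_index(0.5 * (x**2 + y**2), spacing=0.05)
print(S[20, 20])   # centre of a bowl: close to +1
```

Per-pixel shape index maps of this kind are what surface-parsing stages typically threshold to label ridge-like (wrinkle) versus saddle or plane regions.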

Relevance:

100.00%

Publisher:

Abstract:

A new procedure was developed in this study, based on a system equipped with a cellulose membrane and a tetraethylenepentamine hexaacetate chelator (MD-TEPHA), for in situ characterization of the lability of metal species in aquatic systems. To this end, the MD-TEPHA system was prepared by adding the TEPHA chelator to cellulose bags pre-purified with 1.0 mol L-1 HCl and NaOH solutions. After the MD-TEPHA system was sealed, it was examined in the laboratory to evaluate the influence of complexation time (0-24 h), pH (3.0, 4.0, 5.0, 6.0 and 7.0), metal ions (Cu, Cd, Fe, Mn and Ni) and concentration of organic matter (15, 30 and 60 mg L-1) on the relative lability of metal species toward the TEPHA chelator. The results showed that Fe and Cu were complexed more slowly by the TEPHA chelator in the MD-TEPHA system than were Cd, Ni and Mn at all pH values used. It was also found that pH strongly influences the process of metal complexation by the MD-TEPHA system. At all pH levels, Cd, Mn and Ni showed greater complexation with the TEPHA chelator (recoveries of about 75-95%) than did Cu and Fe. Time also affects the lability of metal species complexed by aquatic humic substances (AHS): while Cd, Ni and Mn showed faster kinetics, reaching equilibrium after about 100 min, Cu and Fe approached equilibrium only after 400 min. Increasing the AHS concentration decreases the lability of metal species by shifting the equilibrium toward AHS-metal complexes. Our results indicate that the system under study offers an interesting alternative that can be applied to in situ experiments for differentiating labile and inert metal species in aquatic systems.
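
The contrast in time-to-equilibrium (about 100 min for Cd, Ni and Mn versus about 400 min for Cu and Fe) is what a pseudo-first-order uptake model would produce with rate constants differing roughly fourfold. A sketch with entirely synthetic data (the concentrations, noise level and rate constants below are illustrative, not the paper's measurements):

```python
import numpy as np

rng = np.random.default_rng(1)

def uptake(t, c_eq, k):
    """Pseudo-first-order approach to equilibrium: C(t) = C_eq * (1 - exp(-k*t))."""
    return c_eq * (1.0 - np.exp(-k * t))

t = np.linspace(0, 600, 61)                       # minutes
# synthetic recoveries (%): a "fast" Cd-like metal and a "slow" Cu-like one,
# with rate constants chosen to reach ~95% of equilibrium at 100 and 400 min
fast = uptake(t, 90.0, np.log(20) / 100) + rng.normal(0, 1.0, t.size)
slow = uptake(t, 60.0, np.log(20) / 400) + rng.normal(0, 1.0, t.size)

def fit_k(t, c):
    """Grid-search least squares for the rate constant; crude but dependency-free."""
    c_eq = c[-10:].mean()                         # plateau estimate
    ks = np.linspace(1e-4, 0.1, 2000)
    sse = [np.sum((c - uptake(t, c_eq, k)) ** 2) for k in ks]
    return ks[int(np.argmin(sse))]

k_fast, k_slow = fit_k(t, fast), fit_k(t, slow)
print(k_fast, k_slow)   # the fast metal should show the larger rate constant
```

Fitting k per metal and per pH is one simple way to quantify the "faster versus slower kinetics" comparison the abstract reports.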

Relevance:

100.00%

Publisher:

Abstract:

Optimum fluoride intake plays an essential role in the prevention of dental caries, while fluoride consumption above the recommended level interferes with the normal formation of tooth enamel and bones and may increase the risk of dental and skeletal fluorosis. The knowledge and practices of endemic communities regarding the etiology of fluorosis can help in its mitigation and prevention. The objective of this study was to investigate the knowledge, attitudes and practices of an endemic community concerning fluoride contamination, fluorosis and prevention practices, in order to devise coordinated and targeted prevention mechanisms. Focus group discussions (FGD) and key-informant interviews were conducted in three dietary areas in July 2013 to collect the knowledge, attitudes and practices (KAP) of the endemic community. The results indicated that the health consequences of fluoride-contaminated water are fairly well understood; nevertheless, none of the discussants mentioned the word "fluoride", and the community's knowledge and perception of fluoride ingestion is poor. Health extension workers (HEWs) did not teach about fluoride and its related health consequences. Dental fluorosis was reported to start at early ages and was not commonly perceived as a major problem; however, adolescents worried that they might be singled out when going to other areas. Older people have skeletal fluorosis, which interferes with their day-to-day activities. In severely affected people, the teeth were weak and fragile, creating difficulty in chewing hard foods like unfermented dry flat bread, sugar cane and toasted grains. People prefer rainwater to borehole water because of the unpleasant taste of the latter. The endemic communities do not have sufficient knowledge and skills regarding potential sources of fluoride intake, the debilitating effects of high fluoride ingestion, or preventive and mitigatory measures to reduce fluoride intake. 
The effects of fluoride contamination and mitigatory methods should receive sufficient attention from the community, health workers and the governmental bodies concerned. The practice of harvesting and using rainwater should be encouraged, as it reduces fluoride intake. Future studies should focus on communicating information about possible fluoride risks, and on intervention and evaluation studies of defluoridation, rainwater harvesting and other mitigatory techniques.

Relevance:

100.00%

Publisher:

Abstract:

Scientific curiosity, exploration of georesources and environmental concerns are pushing the geoscientific research community toward subsurface investigations of ever-increasing complexity. This review explores various approaches to formulate and solve inverse problems in ways that effectively integrate geological concepts with geophysical and hydrogeological data. Modern geostatistical simulation algorithms can produce multiple subsurface realizations that are in agreement with conceptual geological models and statistical rock physics can be used to map these realizations into physical properties that are sensed by the geophysical or hydrogeological data. The inverse problem consists of finding one or an ensemble of such subsurface realizations that are in agreement with the data. The most general inversion frameworks are presently often computationally intractable when applied to large-scale problems and it is necessary to better understand the implications of simplifying (1) the conceptual geological model (e.g., using model compression); (2) the physical forward problem (e.g., using proxy models); and (3) the algorithm used to solve the inverse problem (e.g., Markov chain Monte Carlo or local optimization methods) to reach practical and robust solutions given today's computer resources and knowledge. We also highlight the need to not only use geophysical and hydrogeological data for parameter estimation purposes, but also to use them to falsify or corroborate alternative geological scenarios.
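
As a minimal illustration of the third simplification, the sketch below runs a random-walk Metropolis (Markov chain Monte Carlo) sampler on a deliberately tiny inverse problem; the one-parameter forward model, prior bounds and noise level are hypothetical stand-ins for an expensive simulation or trained proxy model:

```python
import numpy as np

rng = np.random.default_rng(2)

def forward(porosity):
    """Hypothetical one-parameter forward model (a linear rock-physics
    transform); it stands in for an expensive simulation or a proxy model."""
    return 2.5 - 1.8 * porosity

true_porosity = 0.30
sigma = 0.05
data = forward(true_porosity) + rng.normal(0, sigma, size=20)   # noisy observations

def log_posterior(m):
    if not 0.0 < m < 0.6:                 # uniform prior bounds the model space
        return -np.inf
    return -0.5 * np.sum((data - forward(m)) ** 2) / sigma**2   # Gaussian likelihood

# random-walk Metropolis sampler
m, logp = 0.1, log_posterior(0.1)
chain = []
for _ in range(20000):
    proposal = m + rng.normal(0, 0.02)
    logp_prop = log_posterior(proposal)
    if np.log(rng.random()) < logp_prop - logp:   # Metropolis accept/reject
        m, logp = proposal, logp_prop
    chain.append(m)

samples = np.array(chain[5000:])          # discard burn-in
print(samples.mean(), samples.std())      # posterior mean should sit near 0.30
```

The ensemble of posterior samples, rather than a single best-fit model, is what allows the alternative-scenario falsification the review advocates.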

Relevance:

100.00%

Publisher:

Abstract:

Single-cell functional proteomics assays can connect genomic information to biological function through quantitative and multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past 5 years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single cell functional proteomics, focusing on the development of the single cell barcode chips (SCBCs) with applications in fundamental and translational cancer research.

We begin by discussing a microchip designed to simultaneously quantify a panel of secreted, cytoplasmic and membrane proteins from single cells, which is the prototype for subsequent proteomic microchips of more sophisticated design for preclinical cancer research or clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or defined numbers of cells, within microchambers, each of which is equipped with a large antibody microarray (the barcode), with between a few hundred and ten thousand microchambers included within a single microchip. Functional proteomics assays at single-cell resolution yield unique pieces of information that significantly shape the way we think about cancer research. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, follows.

The SCBC is a powerful tool to resolve the functional heterogeneity of cancer cells. It has the capacity to extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We will demonstrate this point through applying the SCBCs to investigate three isogenic cell lines of glioblastoma multiforme (GBM).

The cancer cell population is highly heterogeneous, with high-amplitude fluctuations at the single-cell level, which in turn grant robustness to the entire population. The concept of a stable population existing in the presence of random fluctuations is reminiscent of many physical systems that are successfully understood using statistical physics. Thus, tools derived from that field can probably be applied to using fluctuations to determine the nature of signaling networks. In the second part of the thesis, we focus on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia: single-cell proteomics assays coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics yield detailed and surprising predictions, which were found to be correct in both cell lines and a primary tumor model.

The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study GBM cancer cell resistance to molecular targeted therapy. Physical approaches to anticipating therapy resistance and to identifying effective therapy combinations are discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to perturbation by targeted inhibitors. Strongly coupled protein-protein interactions constitute most signaling cascades. A physical analogy of such a system is the strongly coupled atom-atom interactions in a crystal lattice. Similar to decomposing the atomic interactions into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can be achieved by diagonalizing protein-protein correlation or covariance matrices to decompose the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e. independent signaling modes). By doing so, two independent signaling modes, one associated with mTOR signaling and a second with ERK/Src signaling, have been resolved, which in turn allow us to anticipate resistance and to design combination therapies that are effective, as well as to identify those therapies and therapy combinations that will be ineffective. We validated our predictions in mouse tumor models and all predictions were borne out.
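
The mode-decomposition step described above amounts to an eigendecomposition of the measured covariance matrix. A minimal sketch on synthetic data (the protein names, loadings and the two latent modes below are invented for illustration, not the thesis's measurements):

```python
import numpy as np

rng = np.random.default_rng(3)

# synthetic single-cell phosphoprotein panel driven by two latent "modes"
n_cells = 500
mode_mtor = rng.normal(0, 1.0, n_cells)      # latent mTOR-like activity
mode_erk = rng.normal(0, 1.0, n_cells)       # latent ERK/Src-like activity
noise = lambda: rng.normal(0, 0.3, n_cells)
panel = {
    "p-mTOR": 1.0 * mode_mtor + noise(),
    "p-S6K":  0.9 * mode_mtor + noise(),
    "p-ERK":  0.7 * mode_erk + noise(),
    "p-Src":  0.6 * mode_erk + noise(),
}
X = np.column_stack(list(panel.values()))

# diagonalize the protein-protein covariance matrix
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)               # ascending order
order = eigvals.argsort()[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# each eigenvector is an "independent signaling mode", i.e. a distinct
# linear combination of the measured proteins
for val, vec in zip(eigvals[:2], eigvecs[:, :2].T):
    print(f"mode (variance {val:.2f}):", dict(zip(panel, np.round(vec, 2))))
```

On this toy panel the two leading eigenvectors load almost exclusively on the mTOR-like and ERK-like protein pairs, mirroring the two resolved signaling modes.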

In the last part, some preliminary results on the clinical translation of single-cell proteomics chips are presented. The successful demonstration of our work on human-derived xenografts provides the rationale for extending our current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could potentially yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians within a clinically relevant time scale. The technical challenges of the clinical translation are presented, and our solutions to address them are discussed as well. A clinical case study then follows, in which some preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor are presented to demonstrate the general protocol and workflow of the proposed clinical studies.

Relevance:

100.00%

Publisher:

Abstract:

Entangled quantum states can be given a separable decomposition if we relax the restriction that the local operators be quantum states. Motivated by the construction of classical simulations and local hidden variable models, we construct `smallest' local sets of operators that achieve this. In other words, given an arbitrary bipartite quantum state we construct convex sets of local operators that allow for a separable decomposition, but that cannot be made smaller while continuing to do so. We then consider two further variants of the problem where the local state spaces are required to contain the local quantum states, and obtain solutions for a variety of cases including a region of pure states around the maximally entangled state. The methods involve calculating certain forms of cross norm. Two of the variants of the problem have a strong relationship to theorems on ensemble decompositions of positive operators, and our results thereby give those theorems an added interpretation. The results generalise those obtained in our previous work on this topic [New J. Phys. 17, 093047 (2015)].
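
For readers who want a computational handle on bipartite entanglement, the sketch below runs the standard Peres-Horodecki (PPT) test on two-qubit Werner states; this is a related, simpler criterion for illustration, not the cross-norm construction of the paper:

```python
import numpy as np

def partial_transpose(rho, dims=(2, 2)):
    """Transpose the second subsystem of a bipartite density matrix."""
    d1, d2 = dims
    return rho.reshape(d1, d2, d1, d2).transpose(0, 3, 2, 1).reshape(d1 * d2, d1 * d2)

# two-qubit Werner state: p |psi-><psi-| + (1 - p) I/4, entangled iff p > 1/3
psi_minus = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
def werner(p):
    return p * np.outer(psi_minus, psi_minus) + (1 - p) / 4 * np.eye(4)

for p in (0.2, 0.5):
    min_eig = np.linalg.eigvalsh(partial_transpose(werner(p))).min()
    print(p, min_eig)   # a negative value certifies entanglement; here min_eig = (1 - 3p)/4
```

For two qubits the PPT test is both necessary and sufficient for separability, which makes the Werner family a convenient sanity check when exploring decompositions of the kind discussed above.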

Relevance:

100.00%

Publisher:

Abstract:

In the context of f(R) gravity theories, we show that the apparent mass of a neutron star as seen by an observer at infinity is numerically calculable but requires careful matching: first at the star's edge, between interior and exterior solutions, neither of which is totally Schwarzschild-like, both presenting instead small oscillations of the curvature scalar R; and second at large radii, where the Newtonian potential is used to identify the mass of the neutron star. We find that for the same equation of state, this mass definition is always larger than its general relativistic counterpart. We exemplify this with quadratic R^2 and Hu-Sawicki-like modifications of the standard General Relativity action. Therefore, the finding of two-solar-mass neutron stars basically imposes no constraint on stable f(R) theories. However, star radii are in general smaller than in General Relativity, which can give an observational handle on such classes of models at the astrophysical level. Both the larger masses and the smaller matter radii are due to much of the apparent effective energy residing in the outer metric for scalar-tensor theories. Finally, because f(R) neutron star masses can be much larger than their General Relativity counterparts, the total energy available for radiating gravitational waves could be of order several solar masses, and thus a merger of these stars constitutes an interesting wave source.
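
The large-radius mass-identification step, reading M off the asymptotic Newtonian potential phi(r) ~ -GM/r, can be sketched numerically. The potential samples and the 1/r^3 contamination below are synthetic stand-ins (not the paper's solutions) for a slowly decaying non-Schwarzschild tail:

```python
import numpy as np

G = 6.674e-11             # m^3 kg^-1 s^-2
M_true = 4.0e30           # roughly two solar masses

# hypothetical large-radius samples of the metric-derived Newtonian potential;
# the 1/r^3 term is a synthetic stand-in for a non-Schwarzschild exterior tail
r = np.linspace(1.0e6, 1.0e7, 200)               # 1000 km out to 10 000 km
phi = -G * M_true / r + 3.0e30 / r**3

# identify M from the asymptotic form phi ~ -G*M/r (least squares in 1/r)
slope = np.sum(phi / r) / np.sum(1.0 / r**2)
M_est = -slope / G
print(M_est / 1.989e30)   # in solar masses
```

Because the contaminating term falls off faster than 1/r, fitting over a sufficiently large radial window recovers the apparent mass to within about a percent here.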

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Purpose: To investigate the efficacy of silver nanoparticles synthesized by a wet chemical method, and to evaluate their antibacterial and anticancer activities. Methods: A wet chemical method was used to synthesize silver nanoparticles (AgNPs) from silver nitrate, trisodium citrate dihydrate (C6H5O7Na3·2H2O) and sodium borohydride (NaBH4) as the reducing agent. The AgNPs and the reaction process were characterized by UV-visible spectrometry, a Zetasizer, transmission electron microscopy (TEM) and scanning electron microscopy (SEM) equipped with energy-dispersive spectroscopy (EDS). The antibacterial and cytotoxic effects of the synthesized nanoparticles were investigated by the agar diffusion method and the MTT assay, respectively. Results: The silver nanoparticles formed were spherical in shape with a mean size of 10.3 nm. They showed good antibacterial properties, killing both Gram-positive and Gram-negative bacteria, and their aqueous suspension displayed cytotoxic activity against the colon adenocarcinoma (HCT-116) cell line. Conclusion: The findings indicate that silver nanoparticles synthesized by the wet chemical method exhibit good cytotoxic activity against the colon adenocarcinoma (HCT-116) cell line and strong antibacterial activity against various strains of bacteria.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The problem of sharing a cost M among n individuals, each identified by some characteristic c_i ∈ R_+, appears in many real situations. Two important proposals for how to share the cost are the egalitarian and the proportional solutions. In many situations a combination of the two distributions provides an interesting approach to the cost-sharing problem. In this paper we obtain a family of (compromise) solutions associated with the Perron eigenvectors of Levinger's transformations of a characteristics matrix A. This family includes both the egalitarian and the proportional solutions, as well as a set of suitable intermediate proposals, which we analyze in some specific contexts, such as claims problems and inventory cost games.
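The interpolation idea can be sketched numerically. The construction below is purely illustrative: it assumes a rank-one characteristics matrix A = c·1^T (the paper's actual matrix may differ) and takes the Perron eigenvector of the Levinger transformation B(t) = (1-t)A + tA^T, so that t = 0 reproduces the proportional shares and t = 1 the egalitarian ones:

```python
import numpy as np

def compromise_shares(c, M, t):
    """Split a cost M among agents with characteristics c via the Perron
    eigenvector of a Levinger transformation B(t) = (1-t)*A + t*A^T.

    Illustrative assumption: A = outer(c, 1), so t=0 yields the
    proportional solution and t=1 the egalitarian one.
    """
    c = np.asarray(c, dtype=float)
    n = len(c)
    A = np.outer(c, np.ones(n))          # rank-one characteristics matrix
    B = (1.0 - t) * A + t * A.T          # Levinger transformation
    eigvals, eigvecs = np.linalg.eig(B)
    v = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    v = np.abs(v)                        # Perron vector is sign-definite
    return M * v / v.sum()               # shares always add up to M

c, M = [1.0, 2.0, 5.0], 80.0
print(compromise_shares(c, M, 0.0))  # proportional: [10. 20. 50.]
print(compromise_shares(c, M, 1.0))  # egalitarian: ~26.67 each
print(compromise_shares(c, M, 0.5))  # an intermediate compromise
```

Intermediate values of t trace out a one-parameter family of allocations between the two classical solutions, mirroring the family of compromise solutions in the abstract.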

Relevância:

100.00% 100.00%

Publicador:

Resumo:

By providing vehicle-to-vehicle and vehicle-to-infrastructure wireless communications, vehicular ad hoc networks (VANETs), also known as "networks on wheels", can greatly enhance traffic safety, traffic efficiency and the driving experience in intelligent transportation systems (ITS). However, the unique features of VANETs, such as high mobility and uneven distribution of vehicular nodes, pose critical challenges to the efficient and reliable implementation of VANETs. This dissertation is motivated by the great application potential of VANETs in the design of efficient in-network data processing and dissemination. Considering the significance of message aggregation, data dissemination and data collection, this dissertation research aims to enhance traffic safety and traffic efficiency, as well as to develop novel commercial applications, based on VANETs, along four lines: 1) accurate and efficient message aggregation to detect on-road safety-relevant events, 2) reliable data dissemination to notify remote vehicles, 3) efficient and reliable spatial data collection from vehicular sensors, and 4) novel promising applications to exploit the commercial potential of VANETs. Specifically, to enable cooperative detection of safety-relevant events on the roads, the structure-less message aggregation (SLMA) scheme is proposed to improve communication efficiency and message accuracy. The scheme of relative-position-based message dissemination (RPB-MD) is proposed to reliably and efficiently disseminate messages to all intended vehicles in the zone of relevance under varying traffic density. Given the large volume of vehicular sensor data available in VANETs, the scheme of compressive-sampling-based data collection (CS-DC) is proposed to efficiently collect spatially relevant data on a large scale, especially in dense traffic.
In addition, building on the novel and efficient solutions proposed for the application-specific issues of data dissemination and data collection, several appealing value-added applications for VANETs are developed to exploit their commercial potential, namely general purpose automatic survey (GPAS), VANET-based ambient ad dissemination (VAAD) and VANET-based vehicle performance monitoring and analysis (VehicleView). Thus, by improving the efficiency and reliability of in-network data processing and dissemination, including message aggregation, data dissemination and data collection, together with the development of novel promising applications, this dissertation helps push VANETs further toward the stage of massive deployment.
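The in-network aggregation idea behind schemes such as SLMA can be sketched in a few lines. The following is a greatly simplified, hypothetical illustration (the fixed grid, field names and averaging rule are assumptions for the sketch, not the actual SLMA algorithm): reports about the same road event are fused into one aggregate before forwarding, which reduces channel load while preserving an accuracy estimate via the support count.

```python
from collections import defaultdict

CELL = 100.0  # grid cell size in metres (an assumed discretisation)

def aggregate(reports):
    """Fuse raw reports (x, y, event_type, severity) into per-cell
    aggregates carrying the mean severity and a support count."""
    buckets = defaultdict(list)
    for x, y, etype, sev in reports:
        key = (int(x // CELL), int(y // CELL), etype)  # same cell + type
        buckets[key].append(sev)
    return {k: (sum(v) / len(v), len(v)) for k, v in buckets.items()}

reports = [
    (120.0, 40.0, "icy_road", 0.8),
    (130.0, 45.0, "icy_road", 0.6),  # same cell -> fused with the above
    (450.0, 40.0, "icy_road", 0.9),  # different cell -> separate aggregate
]
print(aggregate(reports))
```

A vehicle forwarding the fused dictionary instead of the three raw reports transmits one record per distinct event, which is the communication saving that structure-less aggregation targets.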