675 results for Congestion
Abstract:
The dissertation is concerned with the mathematical study of various network problems. First, three real-world networks are considered: (i) the human brain network, (ii) communication networks, and (iii) electric power networks. Although these networks perform very different tasks, they share similar mathematical foundations. The high-level goal is to analyze and/or synthesize each of these systems from a "control and optimization" point of view. After studying these three real-world networks, two abstract network problems, both motivated by power systems, are also explored. The first is "flow optimization over a flow network" and the second is "nonlinear optimization over a generalized weighted graph". The results derived in this dissertation are summarized below.
Brain Networks: Neuroimaging data reveals the coordinated activity of spatially distinct brain regions, which may be represented mathematically as a network of nodes (brain regions) and links (interdependencies). To obtain the brain connectivity network, the graphs associated with the correlation matrix and the inverse covariance matrix—describing marginal and conditional dependencies between brain regions—have been proposed in the literature. A question arises as to whether any of these graphs provides useful information about brain connectivity. Due to the electrical properties of the brain, this problem is investigated in the context of electrical circuits. First, we consider an electric circuit model and show that the inverse covariance matrix of the node voltages reveals the topology of the circuit. Second, we study the problem of finding the topology of the circuit based only on measurements. In this case, assuming that the circuit is hidden inside a black box and only the nodal signals are available for measurement, the aim is to find the topology of the circuit from a limited number of samples. For this purpose, we deploy the graphical lasso technique to estimate a sparse inverse covariance matrix. It is shown that the graphical lasso can recover most of the circuit topology if the exact covariance matrix is well-conditioned, but it may fail to work well when this matrix is ill-conditioned. To deal with ill-conditioned matrices, we propose a small modification to the graphical lasso algorithm and demonstrate its performance. Finally, the technique developed in this work is applied to resting-state fMRI data from a number of healthy subjects.
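For readers unfamiliar with the graphical lasso step, a minimal Python sketch of the idea is given below, using scikit-learn's GraphicalLasso on simulated node signals; this is not the modified algorithm proposed in the dissertation, and the chain-shaped precision matrix, the regularization level and the support threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

# Simulate zero-mean node signals from a known sparse precision matrix
# (a chain "circuit"), then recover its support with the graphical lasso.
rng = np.random.default_rng(0)
n_nodes, n_samples = 6, 500
precision = (np.eye(n_nodes)
             + np.diag([-0.4] * (n_nodes - 1), 1)
             + np.diag([-0.4] * (n_nodes - 1), -1))
covariance = np.linalg.inv(precision)
samples = rng.multivariate_normal(np.zeros(n_nodes), covariance, size=n_samples)

# alpha controls how sparse the estimated inverse covariance matrix is
model = GraphicalLasso(alpha=0.05).fit(samples)
estimated_support = (np.abs(model.precision_) > 1e-2).astype(int)
print(estimated_support)   # nonzero pattern ~ estimated circuit topology
```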
Communication Networks: Congestion control techniques aim to adjust the transmission rates of competing users on the Internet so that network resources are shared efficiently. Despite the progress in the analysis and synthesis of Internet congestion control, almost all existing fluid models of congestion control assume that every link in the path of a flow observes the original source rate. To address this issue, a more accurate model is derived in this work for the behavior of the network under an arbitrary congestion controller, which takes into account the effect of buffering (queueing) on data flows. Using this model, it is proved that the well-known Internet congestion control algorithms may no longer be stable for the common pricing schemes, unless a certain sufficient condition is satisfied. It is also shown that these algorithms are guaranteed to be stable if a new pricing mechanism is used.
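As background on what a fluid model of congestion control looks like, the sketch below integrates the classical (buffer-free) primal model in Python; it is not the more accurate model derived in the dissertation, and the gain, the willingness-to-pay weights and the link price function are arbitrary choices.

```python
import numpy as np

# Classical primal fluid model on a single link shared by two sources:
#   dx_r/dt = k * (w_r - x_r * p(total rate)),   p = congestion price of the link.
k, capacity = 0.1, 10.0
weights = np.array([1.0, 2.0])       # willingness-to-pay of the two sources
rates = np.array([1.0, 1.0])         # initial sending rates
price = lambda y: max(y - capacity, 0.0) / capacity   # a simple static price

dt = 0.01
for _ in range(50_000):              # forward-Euler integration of the ODE
    p = price(rates.sum())
    rates += dt * k * (weights - rates * p)

print("equilibrium rates:", np.round(rates, 2))
```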
Electrical Power Networks: Optimal power flow (OPF) has been one of the most studied problems for power systems since its introduction by Carpentier in 1962. This problem is concerned with finding an optimal operating point of a power network minimizing the total power generation cost subject to network and physical constraints. It is well known that OPF is computationally hard to solve due to the nonlinear interrelation among the optimization variables. The objective is to identify a large class of networks over which every OPF problem can be solved in polynomial time. To this end, a convex relaxation is proposed, which solves the OPF problem exactly for every radial network and every meshed network with a sufficient number of phase shifters, provided power over-delivery is allowed. The concept of “power over-delivery” is equivalent to relaxing the power balance equations to inequality constraints.
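To illustrate the general mechanism of such a convex relaxation, the toy problem below lifts a quadratic program to a matrix variable and drops the rank-one constraint, keeping only positive semidefiniteness; the cost matrix, constraint matrices and bounds are made up and do not encode a real power network or the dissertation's OPF formulation.

```python
import numpy as np
import cvxpy as cp

# Relax  min x^T C x  s.t.  x^T A_k x >= b_k  by writing X = x x^T,
# dropping rank(X) == 1 and keeping only X >= 0 (positive semidefinite).
n = 4
C = np.diag([1.0, 2.0, 1.5, 0.5])                 # toy cost matrix
A = [np.eye(n), np.diag([1.0, 0.0, 1.0, 0.0])]    # toy constraint matrices
b = [1.0, 0.4]

X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0]                            # the relaxation step
constraints += [cp.trace(A_k @ X) >= b_k for A_k, b_k in zip(A, b)]
problem = cp.Problem(cp.Minimize(cp.trace(C @ X)), constraints)
problem.solve()

# Check how close the solution is to rank one; exactness depends on the
# structure of the instance (e.g., radial networks in the OPF setting).
eigvals = np.linalg.eigvalsh(X.value)
print("optimal value:", round(problem.value, 3), "eigenvalues:", np.round(eigvals, 3))
```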
Flow Networks: In this part of the dissertation, the minimum-cost flow problem over an arbitrary flow network is considered. In this problem, each node is associated with some possibly unknown injection, each line has two unknown flows at its ends related to each other via a nonlinear function, and all injections and flows need to satisfy certain box constraints. This problem, named generalized network flow (GNF), is highly non-convex due to its nonlinear equality constraints. Under the assumption of monotonicity and convexity of the flow and cost functions, a convex relaxation is proposed, which always finds the optimal injections. A primary application of this work is in the OPF problem. The results of this work on GNF prove that the relaxation on power balance equations (i.e., load over-delivery) is not needed in practice under a very mild angle assumption.
Generalized Weighted Graphs: Motivated by power optimizations, this part aims to find a global optimization technique for a nonlinear optimization defined over a generalized weighted graph. Every edge of this type of graph is associated with a weight set corresponding to the known parameters of the optimization (e.g., the coefficients). The motivation behind this problem is to investigate how the (hidden) structure of a given real/complex valued optimization makes the problem easy to solve, and indeed the generalized weighted graph is introduced to capture the structure of an optimization. Various sufficient conditions are derived, which relate the polynomial-time solvability of different classes of optimization problems to weak properties of the generalized weighted graph such as its topology and the sign definiteness of its weight sets. As an application, it is proved that a broad class of real and complex optimizations over power networks are polynomial-time solvable due to the passivity of transmission lines and transformers.
Abstract:
This thesis belongs to the growing field of economic networks. In particular, we develop three essays in which we study the problems of bargaining, discrete choice representation, and pricing in the context of networked markets. Despite analyzing very different problems, the three essays share the common feature of using a network representation to describe the market of interest.
In Chapter 1 we present an analysis of bargaining in networked markets. We make two contributions. First, we characterize market equilibria in a bargaining model and find that players' equilibrium payoffs coincide with their degree of centrality in the network, as measured by Bonacich's centrality measure. This characterization allows us to map, in a simple way, network structures into market equilibrium outcomes, so that payoff dispersion in networked markets is driven by players' network positions. Second, we show that the market equilibrium of our model converges to the so-called eigenvector centrality measure. The economic condition for reaching convergence is that the players' discount factor goes to one. In particular, we show how the discount factor, the matching technology, and the network structure interact so that eigenvector centrality emerges as the limiting case of our market equilibrium.
We point out that the eigenvector approach is a way of finding the most central or relevant players in terms of the "global" structure of the network, while paying less attention to patterns that are more "local". Mathematically, eigenvector centrality captures the relevance of players in the bargaining process using the eigenvector associated with the largest eigenvalue of the adjacency matrix of a given network. Thus our result may be viewed as an economic justification of the eigenvector approach in the context of bargaining in networked markets.
As an application, we analyze the special case of seller-buyer networks, showing how our framework may be useful for analyzing price dispersion as a function of sellers' and buyers' network positions.
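The limit described above can be checked numerically on any small graph; the sketch below uses an arbitrary path graph and discount-like parameter values, not the bargaining model itself. The normalized Katz-Bonacich vector b(delta) = (I - delta*A)^(-1) A 1 approaches the Perron (eigenvector centrality) vector of the adjacency matrix A as delta approaches 1/lambda_max.

```python
import numpy as np

# Adjacency matrix of a small path graph (illustrative choice)
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
lam_max = np.linalg.eigvalsh(A)[-1]

def bonacich(A, delta):
    """Katz-Bonacich centrality b(delta) = (I - delta*A)^{-1} A 1, for delta < 1/lam_max."""
    n = A.shape[0]
    return np.linalg.solve(np.eye(n) - delta * A, A @ np.ones(n))

# Eigenvector centrality: normalized Perron eigenvector of A
eig_cent = np.abs(np.linalg.eigh(A)[1][:, -1])
eig_cent /= eig_cent.sum()

# As delta -> 1/lam_max, the normalized Bonacich vector approaches eig_cent
for frac in (0.5, 0.9, 0.99, 0.999):
    b = bonacich(A, frac / lam_max)
    b /= b.sum()
    print(f"delta = {frac:.3f}/lam_max   gap to eigenvector centrality = {np.linalg.norm(b - eig_cent):.4f}")
```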
Finally, in Chapter 3 we study the problem of price competition and free entry in networked markets subject to congestion effects. In many environments, such as communication networks in which network flows are allocated, or transportation networks in which traffic is directed through the underlying road architecture, congestion plays an important role. In particular, we consider a network with multiple origins and a common destination node, where each link is owned by a firm that sets prices in order to maximize profits, whereas users want to minimize the total cost they face, given by the congestion cost plus the prices set by firms. In this environment, we introduce the notion of Markovian traffic equilibrium to establish the existence and uniqueness of a pure-strategy price equilibrium, without assuming that the demand functions are concave or imposing particular functional forms for the latency functions. We derive explicit conditions that guarantee existence and uniqueness of equilibria. Given this result, we apply our framework to study entry decisions and welfare, and establish that in congested markets with free entry, the number of firms exceeds the social optimum.
Abstract:
This thesis describes the design and implementation of a situation awareness application. The application gathers data from sensors including accelerometers for monitoring earthquakes, carbon monoxide sensors for monitoring fires, radiation detectors, and dust sensors. The application also gathers data from Internet sources, including traffic congestion on daily commute routes, hazard information, news relevant to the user of the application, and weather. The application sends the data to a Cloud computing service that aggregates data streams from multiple sites and detects anomalies. Information from the Cloud service is then displayed by the application on a tablet, computer monitor, or television screen. The situation awareness application enables almost all members of a community to remain aware of critical changes in their environments.
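As a rough sketch of the aggregation-and-anomaly-detection step (the abstract does not describe the Cloud service's actual method, so the stream name, window size and threshold rule below are purely hypothetical), one could flag readings that deviate strongly from a stream's recent baseline:

```python
import random
from collections import deque
from statistics import mean, stdev

class StreamAnomalyDetector:
    """Flag a reading that is more than k standard deviations away from the
    recent history of its stream (hypothetical z-score rule)."""
    def __init__(self, window=100, k=4.0):
        self.window, self.k = window, k
        self.history = {}                       # stream id -> recent readings

    def update(self, stream_id, value):
        buf = self.history.setdefault(stream_id, deque(maxlen=self.window))
        anomalous = False
        if len(buf) >= 10:                      # need some history first
            mu, sigma = mean(buf), stdev(buf)
            anomalous = sigma > 0 and abs(value - mu) > self.k * sigma
        buf.append(value)
        return anomalous

random.seed(1)
detector = StreamAnomalyDetector()
for t in range(200):
    reading = random.gauss(0.02, 0.002) if t != 150 else 3.5   # e.g. a CO spike
    if detector.update("site-7/co", reading):
        print("anomaly at t =", t, "value =", reading)
```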
Abstract:
The rapid growth and development of Los Angeles City and County has been one of the phenomena of the present age. The growth of a city from 50,600 to 576,000, an increase of over 1000% in thirty years, is an unprecedented occurrence. It has given rise to a variety of problems of increasing magnitude.
Chief among these are: the supply of food, water and shelter; the development of industry and markets; the prevention and removal of downtown congestion; and the protection of life and property. These, of course, are the problems that any city must face. But in the case of a community which doubles its population every ten years, radical and heroic measures must often be taken.
Abstract:
This project will carry out an analysis of the multipath transport-layer protocol Multipath TCP on a test bench. In order to perform this analysis of the protocol, we will design and build a controlled test environment in which we will simulate different problems and situations in the network and analyze the protocol's responses to these failures.
Abstract:
Testosterone has been increasingly used in aging men for the prevention and treatment of metabolic diseases, improvement of sexual performance, cardiovascular protection, and maintenance of cognition. However, its effects on the prostate with respect to benign and malignant disease remain controversial. The present study evaluated the effect of treatment with two formulations of testosterone on prostate carcinoma induced by N-methyl-N-nitrosourea (NMU), based on histopathological analyses and serum prostate-specific antigen (PSA) measurements. Eighty young, healthy Wistar rats were divided into two groups (40 animals each), treated or not with intraperitoneal NMU. Each group was divided into four equal subgroups and treated for 16 weeks: 1) testosterone cypionate every seven days, intramuscularly; 2) testosterone cypionate every 14 days, intramuscularly; 3) oral testosterone undecanoate daily; 4) mineral oil. After 16 weeks of treatment, PSA levels did not change in any group or subgroup, and no tumors developed in any of them. Therefore, the two distinct formulations of testosterone combined with short-term intraperitoneal NMU did not alter serum PSA levels and did not induce prostate tumor formation in young, healthy Wistar rats. The acinar histopathological alterations found in the prostates were projection, secretion, congestion, and inflammation, and the epithelial alterations were normal epithelium, epithelial reduction, and reduced epithelial height. These findings support further studies to guide the use of testosterone in daily clinical practice without fear of inducing prostate cancer.
Abstract:
Gestational Diabetes Mellitus (GDM) can be defined as carbohydrate intolerance during pregnancy and is estimated to affect 10-22% of all pregnant patients. During pregnancy, several complications may arise for the fetus, such as an elevated risk of spontaneous abortion, congenital abnormalities, and neonatal morbidity and mortality. Morphofunctional alterations may also arise in several organs of the diabetic mother, although this is not well established. The aims were to investigate whether biochemical and histopathological alterations occur in several organs, such as the pituitary, uterus, placenta, and pancreas, of pregnant rats with diabetes mellitus during and at the end of pregnancy, and to compare them; and, in addition, to investigate whether there are alterations in the extracellular matrix (ECM) of the pituitary of these animals. On day 5 of life, female Wistar rats were divided into two groups: one treated with streptozotocin (Diabetic Group/DIAB) at a dose of 90 mg/kg subcutaneously, and another group treated with vehicle (citrate buffer/CTR). At 90 days of age they were mated and subsequently sacrificed on day 11 or day 21 of pregnancy. Maternal glycemia and biochemistry and the number of implantation sites were evaluated. The pancreas, uterus, placenta, and pituitaries were stained with hematoxylin and eosin, and only the pituitaries were stained with Masson and Picrosirius stains for ECM evaluation. Diabetic animals on both day 11 and day 21 showed a reduced number of implantation sites, lower weight, and higher glycemia and total cholesterol compared with control animals, regardless of the day of pregnancy. No difference in triglyceride levels was observed between the non-diabetic and diabetic groups, regardless of the day. However, diabetic animals that completed the gestational period showed higher maternal glycemia than the day-11 diabetic group. Pancreases of day-21 diabetic rats showed intracytoplasmic vacuolization of the islets, insulitis, inflammatory cell migration, vessel wall thickening, and periductal and vascular fibrosis. These alterations were observed with much lower intensity in the day-11 diabetic animals. The placentas of diabetic animals showed congestion at the maternal-fetal interface, cell migration, a higher concentration of maternal and fetal vessels but with irregular shape, necrosis, and vacuolization. The pituitaries of diabetic animals showed aggregated chromophobe cells and increased thickness of red ECM collagen fibers, in contrast to controls, in which fibers appeared green and bundle-shaped. Diabetes produced a complete remodeling of the pituitary. Pregnancies of diabetic animals showed greater damage to the pancreas and placenta, especially at the end of pregnancy. As a consequence of these alterations, the diabetic animals presented hyperglycemia and higher total cholesterol, but lower maternal weight and fewer implantation sites, with no changes in triglycerides. This is the first study to demonstrate tissue remodeling of some ECM elements in the pituitary, such as thickening of the ECM layer and green collagen fibers. The pituitary ECM alterations are probably due to the diabetic process during pregnancy.
Abstract:
The growth of broadband services on mobile communication networks has driven demand for ever faster and higher-quality data. The mobile network technology called LTE (Long Term Evolution), or fourth generation (4G), emerged to meet this demand for wireless access to services such as Internet access, online gaming, VoIP, and video conferencing. LTE is part of the 3GPP release 8 and 9 specifications, operating over an all-IP network and providing transmission rates above 100 Mbps (DL) and 50 Mbps (UL), low latency (10 ms), and compatibility with previous generations of mobile networks, 2G (GSM/EDGE) and 3G (UMTS/HSPA). The TCP protocol, developed to operate over wired networks, performs poorly over wireless channels such as cellular mobile networks, mainly due to selective fading, shadowing, and the high error rates of the air interface. Because all losses are interpreted as being caused by congestion, the protocol's performance is poor. The objective of this dissertation is to evaluate, through simulations, the performance of several TCP variants under the influence of interference on the channels between the mobile terminal (UE, User Equipment) and a remote server. For this purpose, the NS3 software (Network Simulator version 3) was used with the TCP Westwood Plus, New Reno, Reno, and Tahoe variants. The test results show that TCP Westwood Plus performs better than the others. TCP New Reno and Reno performed very similarly because the interference model used has a uniform distribution, so the probability of consecutive bit losses within the same transmission window is low. TCP Tahoe, as expected, showed the worst performance of all, since it lacks the fast recovery mechanism and its congestion window always returns to one segment after a timeout. It was also observed that delay has a strong impact on the performance of the TCP variants, even more so than the bandwidth of the access and backbone links, since in the tested scenario the bottleneck was the air interface. Simulations with air-interface errors, introduced with the NS3 fading script, showed that the RLC AM (acknowledged) mode performs better for file transfer applications in noisy environments than the unacknowledged RLC UM mode.
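The role of fast recovery mentioned above can be illustrated with a toy congestion-window trace in Python; this is a simplification, not the NS3/LTE setup used in the dissertation, and the loss rounds, thresholds and window cap are arbitrary. On a loss detected via duplicate ACKs, Reno halves the window and continues, while Tahoe falls back to one segment and re-enters slow start.

```python
def cwnd_trace(variant, rounds=40, ssthresh=32, loss_rounds=(12, 25)):
    """Toy congestion-window evolution; loss events are fixed in advance."""
    cwnd, trace = 1.0, []
    for r in range(rounds):
        trace.append(cwnd)
        if r in loss_rounds:                    # loss detected via duplicate ACKs
            ssthresh = max(cwnd / 2, 2)
            cwnd = ssthresh if variant == "reno" else 1.0   # Tahoe: restart slow start
        elif cwnd < ssthresh:
            cwnd *= 2                           # slow start
        else:
            cwnd += 1                           # congestion avoidance (additive increase)
        cwnd = min(cwnd, 64)                    # hypothetical receiver-window cap
    return trace

for variant in ("tahoe", "reno"):
    trace = cwnd_trace(variant)
    print(variant, "mean cwnd ~", round(sum(trace) / len(trace), 1))
```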
Abstract:
Nowadays, distributing large volumes of data over corporate TCP/IP networks causes problems such as high network and server utilization, long completion times, and greater sensitivity to failures in the network infrastructure. These problems can be reduced by using peer-to-peer (P2P) networks. The objective of this dissertation is to analyze the performance of the standard BitTorrent protocol in corporate networks, and also to carry out the analysis after a modification of the protocol's standard behavior. In this modification, the tracker identifies the IP address of the peer requesting the swarm's list of IP addresses and returns only those belonging to the same local network, plus the original seeder, with the goal of reducing traffic over wide-area links. In typical corporate scenarios, simulations showed that the modification reduces average bandwidth consumption and average download time compared with standard BitTorrent, besides making the distribution more robust to failures of wide-area links. The simulations also showed that in more complex environments, with many clients, where bandwidth restrictions on wide-area links cause congestion and packet drops, the performance of standard BitTorrent can be similar to a client-server distribution. In this last case, the proposed modification showed consistent improvements in distribution performance.
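A sketch of the tracker-side filtering described above is given below; it is a simplified stand-in, since a real tracker speaks the BitTorrent announce protocol over HTTP, and the /24 "same local network" rule and the addresses used here are assumptions for illustration.

```python
import ipaddress

def filter_swarm(requester_ip, swarm_ips, seeder_ip, site_prefix=24):
    """Return only the peers on the requester's local network (assumed /24 here)
    plus the original seeder, instead of the full swarm list."""
    local_net = ipaddress.ip_network(f"{requester_ip}/{site_prefix}", strict=False)
    local_peers = [ip for ip in swarm_ips
                   if ipaddress.ip_address(ip) in local_net and ip != requester_ip]
    return local_peers + [seeder_ip]

swarm = ["10.1.0.5", "10.1.0.9", "10.2.0.7", "10.3.0.4"]
print(filter_swarm("10.1.0.20", swarm, seeder_ip="10.0.0.2"))
# -> ['10.1.0.5', '10.1.0.9', '10.0.0.2']
```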
Abstract:
Urban mobility is a problem in many urban centers and is aggravated by the growing number of automobiles and their indiscriminate use. This exploratory-descriptive study covers conceptual reviews and an extensive survey of data on the transport function; the automobile, its origin and its symbolism; the context of Brazil and the city of Rio de Janeiro; and dependence on vehicles and the impacts of traffic on society, in order to explain the unsustainability of this mode of transport as it has been used in cities. Among the main impacts caused by automobile dependence are those related to health, with problems ranging from respiratory and circulatory complications to impaired mental health; quality of life and the relationship between travel time and cost; safety, and the technological apparatus of automobiles that protects the user at the expense of the most vulnerable public, such as pedestrians and cyclists; the morphology of the city, which ends up privileging an individual mode of transport and creates new urban forms that demand ever more space for automobiles; climate change due to disproportionate pollution, which affects the biochemical patterns of several ecosystems; and economic losses, estimated by three different methodologies that sought to monetize the cost of congestion. The research proposes several measures to reverse or mitigate the excessive use of automobile transport. This contribution to transport geography studies aims to provide groundwork for advancing the debate on automobile dependence, especially in large cities.
Abstract:
Pile reuse has become an increasingly popular option in foundation design, mainly due to its potential cost and environmental benefits and the problem of underground congestion in urban areas. However, key geotechnical concerns remain regarding the behavior of reused piles and the modeling of foundation systems involving old and new piles to support building loads of the new structure. In this paper, a design and analysis tool for pile reuse projects will be introduced. The tool allows coupling of superstructure stiffness with the foundation model, and includes an optimization algorithm to obtain the best configuration of new piles to work alongside reused piles. Under the concept of Pareto Optimality, multi-objective optimization analyses can also reveal the relationship between material usage and the corresponding foundation performance, providing a series of reuse options at various foundation costs. The components of this analysis tool will be discussed and illustrated through a case history in London, where 110 existing piles are reused at a site to support the proposed new development. The case history reveals the difficulties faced by foundation reuse in urban areas and demonstrates the application of the design tool to tackle these challenges. © ASCE 2011.
Abstract:
Linear alkylbenzene sulfonates (LAS) are widely used in the detergent industry. Because these contaminants enter the water and accumulate in fish, LAS is of great importance in environmental pollution. In the present study, the accumulation of LAS and its histological effects on the gill tissue, liver and kidney of Caspian kutum (Rutilus frisii kutum) were studied. Caspian kutum is the most important and most valuable teleost of the Caspian Sea. Given the release of Caspian kutum into rivers and the Anzali Lagoon and the unrestricted entry of wastewater into the aquatic ecosystem, research on the impact of LAS on Caspian kutum is important. In the present study, fish were exposed to sublethal concentrations of LAS (0.58, 1.16 and 2.32 mg/l) for 192 hours. Control treatments with three replicates at 0, 24, 48, 72, 96 and 192 hours were run. To assess the histological effects of LAS, tissue sections were prepared, stained with hematoxylin-eosin, and examined by light microscopy. To determine the bioaccumulation of LAS, Soxhlet extraction and solid-phase extraction were performed, and the amount of LAS was measured by HPLC with a fluorescence detector. According to the results, the average bioconcentration factor and LAS concentrations in fish reached stable levels after approximately 72 h, representing steady-state BCF values in this species. The steady-state bioconcentration factor of total LAS was 33.96 L/kg, and the values for the homologues C10-n-LAS, C11-n-LAS, C12-n-LAS and C13-n-LAS were 3.84, 6.15, 8.58 and 15.57 L/kg, respectively. In gills exposed to LAS, histopathological alterations including hypertrophy, lifting of the lamellar epithelium, edema, clubbing of lamellae, hyperplasia, lamellar fusion and aneurysm were seen. In liver tissue exposed to the three concentrations of LAS, congestion and dilation of sinusoids, irregularly shaped nuclei, hepatocyte degeneration, vacuolar degeneration and necrosis were observed. In kidney exposed to the three concentrations of LAS, reduction of the interstitial haematopoietic tissue, degeneration of the epithelial cells of the renal tubules, tubular degeneration, necrosis, shrinkage and luminal occlusion were observed. The greatest alterations due to LAS exposure were seen in the gill tissue. None of the control samples showed histological effects of LAS.
Abstract:
The acute toxicity and effects of diazinon on some haematological parameters of kutum (Rutilus frisii kutum, Kamensky, 1901) weighing 613.33 ± 157.06 g were studied under static water-quality conditions at 15 ± 2 °C in winter and spring 2009. The relevant physical and chemical parameters of the water were pH = 7-8.2, hardness = 300 mg/L (CaCO3), DO = 7 ppm and T = 15 ± 2 °C. The first test was primarily to determine the acute toxicity (96-h LC50) of the agricultural toxicant diazinon (60% emulsion) to kutum male brood stocks. For this purpose, 4 treatments were used to test toxicity; each treatment was repeated in 3 tanks with 9 fish per treatment and 180 litres of water capacity. After obtaining the final results, the data were analysed statistically with Probit version 1.5 (USEPA, 1985), and we determined the LC10, LC50 and LC90 values at 24, 48, 72 and 96 hours; the maximum allowable concentration (96-h LC50 divided by 10) (TRC, 1984); and the degree of toxicity. The second stage of testing consisted of four treatments: LC0 = 0 as the control treatment, treatment A with a concentration of LC1 = 0.107 mg/L, treatment B with a concentration of LC5 = 0.157 mg/L, and treatment C with a concentration equal to the MAC value of 0.04 mg/L. Male brood stocks of kutum were treated with these concentrations for 45 days. Experiments were carried out under static conditions based on the standard TRC (1984) method over 45 days. Our results show that long-term exposure to diazinon causes a decrease in the erythrocyte count (RBC), haemoglobin (Hb), haematocrit (PCV), mean corpuscular volume (MCV), mean corpuscular haemoglobin (MCH), mean corpuscular haemoglobin concentration (MCHC), leucocyte count (WBC), lymphocytes, testosterone, iron (Fe), sodium (Na), lactate dehydrogenase (LDH), and cholinesterase (CHeS). In addition, diazinon also causes an increase in prolymphocytes, aspartate aminotransferase (AST), cholesterol, alkaline phosphatase (ALP) and adrenaline (P<0.05). There are no significant effects on monocytes, eosinophils, magnesium (Mg), chloride (Cl), glucose (BS), urea (BUN), uric acid (U.A), triglycerides (TG), calcium (Ca), albumin (Alb), total protein (TP), cortisol, noradrenaline and high-density lipoprotein (HDL) levels in kutum male brood stocks (P>0.05). Pathology results showed that diazinon had no effect on average fish weight and body length, the average weight of the heart, brain, spleen, liver and kidney, or the liver index, but caused a decrease in gonad weight and gonad index. It also caused tissue necrosis, vascular congestion and inflammation in the liver; a sharp reduction in the number of glomeruli, necrosis, vascular congestion and haemorrhage in the kidney; capsule thickening and fibrosis, atrophy, vascular congestion, increased macrophage release, increased haemosiderin deposition and thickening of artery walls in the spleen; atrophy, fibrosis and necrosis in the testis; vascular congestion and an increased distance between the myocardium and fibrous strands in the heart; and neuronal loss, vascular congestion and edema in the brain of kutum male brood stocks.
Abstract:
The impact of phosphamidon, an organophosphorus pesticide, and its metabolites, viz. dimethyl phosphoric acid and 2-chloro 2-diethyl carbamoylmethyl vinyl acid, on the histopathology of a common teleost, Labeo rohita, was studied by exposing the fish to sub-lethal concentrations taken as 1/3 of the LC50, equal to 0.0123 ppm for phosphamidon, 0.0160 ppm for dimethyl phosphoric acid and 0.0167 ppm for 2-chloro 2-diethyl carbamoylmethyl vinyl acid, respectively. The results revealed that hepatocytes in the liver were markedly swollen and exhibited hydropic degeneration. Fusion of primary lamellae and moderate congestion of blood vessels were evident in the gill. The intestine showed degeneration of the mucosa and cellular infiltration in the sub-mucosa. The LC50 values and histopathological photomicrographs suggest that phosphamidon is more toxic than dimethyl phosphoric acid and 2-chloro 2-diethyl carbamoylmethyl vinyl acid.
Abstract:
The objective of this study was to identify challenges in civil and environmental engineering that can potentially be solved using data sensing and analysis research. The challenges were recognized through an extensive literature review across all disciplines of civil and environmental engineering. The literature review included journal articles, reports, expert interviews, and magazine articles. The challenges were ranked by comparing their impact on cost, time, quality, the environment, and safety. The result of this literature review includes challenges such as improving construction safety and productivity, improving roof safety, reducing building energy consumption, solving traffic congestion, managing groundwater, mapping and monitoring the underground, estimating sea conditions, and solving soil erosion problems. These challenges suggest areas where researchers can apply data sensing and analysis research.