827 results for tech trends
Abstract:
Cobalt ferrite (CoFe2O4) is an engineering material used in applications such as magnetic cores, magnetic switches, hyperthermia-based tumor treatment, and contrast agents for magnetic resonance imaging. The utility of ferrite nanoparticles hinges on their size, dispersibility in solution, and synthetic control over coercivity. In this work, we establish correlations between room-temperature co-precipitation conditions and these crucial materials parameters. Furthermore, post-synthesis annealing conditions are correlated with morphology, changes in crystal structure, and magnetic properties. We disclose the synthesis and process conditions helpful in obtaining easily sinterable CoFe2O4 nanoparticles with coercivity (Hc) in the range 5.5-31.9 kA/m and saturation magnetization (Ms) in the range 47.9-84.9 A·m²·kg⁻¹. At a grain size of ~54 ± 2 nm (corresponding to a sintering temperature of 1073 K), multi-domain behavior sets in, indicated by a decrease in Hc. In addition, we observe an increase in lattice constant with grain size, the inverse of what is expected in ferrites. Our results suggest that oxygen deficiency plays a crucial role in explaining this inverse trend. We expect the method disclosed here to be a viable and scalable alternative to thermal-decomposition-based CoFe2O4 synthesis. The magnetic trends reported will aid in the optimization of functional CoFe2O4 nanoparticles.
Abstract:
During the early stages of operation, high-tech startups need to overcome the liability of newness and manage a high degree of uncertainty. Many high-tech startups fail due to an inability to deal with skeptical customers, underdeveloped markets, and limited resources while selling an offering that has no precedent. This paper leverages the principles of effectuation (a logic of entrepreneurial decision-making under uncertainty) to explain the journey from creation to survival of high-tech startups in an emerging economy. Based on the 99tests.com case study, this paper suggests that early-stage high-tech startups in emerging economies can increase their probability of survival by adopting the principles of effectuation.
Abstract:
The rapid emergence of infectious diseases calls for immediate attention to determine practical solutions for intervention strategies. To this end, it becomes necessary to obtain a holistic view of the complex host-pathogen interactome. Advances in omics and related technology have resulted in massive generation of data for the interacting systems at unprecedented levels of detail. Systems-level studies with the aid of mathematical tools contribute to a deeper understanding of biological systems, where intuitive reasoning alone does not suffice. In this review, we discuss different aspects of host-pathogen interactions (HPIs) and the available data resources and tools used to study them. We discuss in detail models of HPIs at various levels of abstraction, along with their applications and limitations. We also present a few case studies that incorporate different modeling approaches and provide significant insights into disease. (c) 2013 Wiley Periodicals, Inc.
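As a flavor of the dynamic models such reviews survey, host-pathogen spread at the population level can be abstracted into a compartmental ODE system such as SIR. The sketch below uses forward-Euler integration; all parameter values are illustrative assumptions, not taken from the review.

```python
# Minimal sketch of a population-level host-pathogen model: a basic SIR
# system integrated with forward Euler. Parameters are illustrative only.
def simulate_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, dt=0.1, days=160):
    """Return (susceptible, infected, recovered) fractions after `days`."""
    s, i, r = s0, i0, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt   # mass-action infection term
        new_rec = gamma * i * dt      # linear recovery term
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

s, i, r = simulate_sir()
print(round(s + i + r, 6))  # 1.0 -- the compartments always sum to one
```

Even this toy model exhibits the qualitative behavior (epidemic peak and burnout) that more detailed HPI models refine with host immunity, pathogen dynamics, and molecular-level detail.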
Abstract:
In this paper, we primarily address one powerful simulation tool developed over the last decades: Large Eddy Simulation (LES), which is well suited to the unsteady, three-dimensional, complex turbulent flows found in industry and the natural environment. The main point of LES is that the large-scale motion is resolved while the small-scale motion is modeled or, in geophysical terminology, parameterized. With a view to devising a subgrid-scale (SGS) model of high quality, we highlight the physical aspects of scale interaction and energy transfer, such as dissipation, backscatter, local and non-local interaction, anisotropy, and resolution requirements. These are the factors from which the advantages and disadvantages of existing SGS models arise. A case study on LES of turbulence in a vegetative canopy is presented to illustrate that the LES model rests on physical arguments. A variety of challenging complex turbulent flows in both industrial and geophysical fields that await study in the near future are then presented. In conclusion, we may say with confidence that the new century will see flourishing turbulence research with the aid of LES combined with other approaches.
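As one concrete example of the SGS closures discussed above, the classic Smagorinsky model sets an eddy viscosity from the filter width and the resolved strain rate, nu_t = (Cs * delta)^2 * |S|. The 1D sketch below is illustrative only; the constant Cs, the grid spacing, and the velocity samples are assumptions, not values from the paper.

```python
# Smagorinsky-type eddy viscosity on a 1D resolved velocity profile.
CS = 0.17      # Smagorinsky constant (typical literature value; assumption)
DELTA = 0.05   # filter width / grid spacing in metres (assumption)

def eddy_viscosity(u, dx):
    """Return nu_t at interior points from resolved velocity samples u."""
    nu_t = []
    for i in range(1, len(u) - 1):
        strain = abs((u[i + 1] - u[i - 1]) / (2 * dx))  # central difference
        nu_t.append((CS * DELTA) ** 2 * strain)
    return nu_t

u = [0.0, 0.5, 1.2, 2.0, 2.5]   # resolved velocity samples (m/s)
print(eddy_viscosity(u, 0.05))
```

The dissipative character of this closure (nu_t >= 0 everywhere) is exactly the limitation the abstract's discussion of backscatter and anisotropy points at: a plain Smagorinsky model cannot return energy from the subgrid scales to the resolved ones.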
Abstract:
Tracking the evolution of research in waste recycling science (WRS) can be valuable for environmental agencies as well as for recycling businesses. Maps of science are visual, easily readable representations of the cognitive structure of a branch of science, a particular area of research, or the global spectrum of scientific production. They are generally built upon evidence collected from reliable sources of information, such as patent and scientific publication databases. This study uses the methodology developed by Rafols et al. (2010) to make a “double overlay map” of WRS upon a basemap reflecting the cognitive structure of all journal-published science for the years 2005 and 2010. The analysis takes into account the cognitive areas where WRS articles are published and the areas from which WRS draws its intellectual nourishment, paying special attention to the growth trends of the key areas. Interpretation of the results leads to the conclusion that extraction of energy from waste will probably be an important research topic in the future, along with developments in general chemistry and chemical engineering oriented toward the recovery of valuable materials from waste. Agricultural and material sciences, together with the combined economics, politics, and geography field, are areas with which WRS shows a relevant and ever-increasing cognitive relationship.
Abstract:
This document describes the analytical methods used to quantify core organic chemicals in tissue and sediment collected as part of NOAA’s National Status and Trends Program (NS&T) for the years 2000-2006. Organic contaminant analytical methods used during the early years of the program are described in NOAA Technical Memoranda NOS ORCA 71 and 130 (Lauenstein and Cantillo, 1993; Lauenstein and Cantillo, 1998) for the years 1984-1992 and 1993-1996, respectively. These reports are available from our website (http://www.ccma.nos.gov). The methods detailed in this document were utilized by the Mussel Watch Project and the Bioeffects Project, which are both part of the NS&T Program. The Mussel Watch Project has been monitoring contaminants in bivalves and sediments since 1986 and is the longest active national contaminant monitoring program operating in U.S. coastal waters. Approximately 280 Mussel Watch sites are sampled on biennial and decadal timescales for bivalve tissue and sediment, respectively. Similarly, the Bioeffects Assessment Project began in 1986 to characterize estuaries and near-coastal environs. Using the sediment quality triad approach, which measures (1) levels of contaminants in sediments, (2) incidence and severity of toxicity, and (3) benthic macrofaunal communities, the Bioeffects Project describes the spatial extent of sediment toxicity. Contaminant assessment is a core function of both projects. These methods, while discussed here in the context of sediment and bivalve tissue, were also used with other matrices, including fish fillet, fish liver, nepheloid layer, and suspended particulate matter. The methods described herein are for the core organic contaminants monitored in the NS&T Program and include polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), butyltins, and organochlorines that have been analyzed consistently over the past 15-20 years. Organic contaminants such as dioxins, perfluoro compounds, and polybrominated diphenyl ethers (PBDEs) were analyzed periodically in special studies of the NS&T Program and will be described in another document. All of the analytical techniques described in this document were used by B&B Laboratories, Inc., an affiliate of TDI-Brook International, Inc. in College Station, Texas, under contract to NOAA. The NS&T Program uses a performance-based system approach to obtain the best possible data quality and comparability, and requires laboratories to demonstrate precision, accuracy, and sensitivity to ensure results-based performance goals and measures. (PDF contains 75 pages)
Abstract:
The spotted seatrout (Cynoscion nebulosus) is considered a key species relative to the implementation of the Comprehensive Everglades Restoration Plan (CERP). One of the goals of the CERP is to increase freshwater flows to Florida Bay. Increased freshwater flows can have potential positive and negative impacts on spotted seatrout populations. At low salinities, the planktonic eggs of spotted seatrout sink to the bottom and are not viable (Alshuth and Gilmore, 1994; Holt and Holt, 2002). On the other hand, increased freshwater flows can alleviate hypersaline conditions, which could result in an expansion of the distribution of the early life stages of spotted seatrout (Thayer et al., 1999; Florida Department of Environmental Protection1). Thus it would be useful to develop a monitoring program that can detect changes in seatrout abundance on time scales short enough to be useful to resource managers. The NOAA Center for Coastal Fisheries and Habitat Research (NOAA) has made sporadic collections of juvenile seatrout using otter trawls since 1984 (see Powell et al., 2004). The results suggest that it might be useful to sample for seatrout in as many as eight different areas or basins (Figure 1): Bradley Key, Sandy Key, Johnson Key, Palm Key, Snake Bight, Central, Whipray, and Crocodile Dragover. Unfortunately, logistical constraints are likely to limit the number of tows to about 40 per month over a period of six months each year. Inasmuch as few seatrout are caught in any given tow and the proportion of tows with zero seatrout is often high, it is important to determine how best to allocate this limited sampling effort among the various basins so that any trends in abundance may be detected with sufficient statistical confidence. (PDF contains 16 pages)
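The allocation problem posed in the last abstract, splitting roughly 40 tows per month among eight basins, is often approached with Neyman-style allocation, where effort is assigned in proportion to each basin's catch variability. The sketch below assumes equal basin weights, and the standard deviations are invented for illustration; only the basin names come from the study.

```python
# Proportional (Neyman-style) allocation of a fixed number of tows.
TOTAL_TOWS = 40

# Hypothetical standard deviations of juvenile seatrout catch per tow.
basin_sd = {
    "Bradley Key": 1.2, "Sandy Key": 0.8, "Johnson Key": 1.5, "Palm Key": 0.5,
    "Snake Bight": 2.0, "Central": 1.0, "Whipray": 0.6, "Crocodile Dragover": 0.4,
}

def allocate(sds, total):
    """Assign `total` tows in proportion to each basin's standard deviation."""
    s = sum(sds.values())
    raw = {b: total * sd / s for b, sd in sds.items()}
    alloc = {b: int(v) for b, v in raw.items()}      # floor each share
    # Hand leftover tows to the largest fractional remainders.
    leftover = total - sum(alloc.values())
    for b in sorted(raw, key=lambda b: raw[b] - alloc[b], reverse=True)[:leftover]:
        alloc[b] += 1
    return alloc

alloc = allocate(basin_sd, TOTAL_TOWS)
print(sum(alloc.values()))  # 40
```

In practice, catches with many zeros would call for variance estimates from a zero-inflated or delta model rather than a plain standard deviation, but the proportional-allocation step itself is unchanged.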
Abstract:
Inputs of toxic chemicals are one of the major types of anthropogenic stress threatening our Nation's coastal and estuarine waters. To assess this threat, the National Oceanic and Atmospheric Administration's (NOAA's) National Status and Trends (NS&T) Program Mussel Watch Project monitors the concentrations of more than 70 toxic chemicals in sediments and in the whole soft parts of mussels and oysters at over 300 sites around the U.S. Twenty of the 25 designated areas that comprise NOAA's National Estuarine Research Reserve System (NERRS) have one or more Mussel Watch monitoring sites. Trace elements and organic contaminants were quantified, including As, Ag, Cd, Cu, Hg, Ni, Pb, Zn, ΣPCBs, ΣPAHs, DDT and its metabolites, and butyltins. The Mussel Watch sites located in or near the 20 Reserves provide information on both status and trends. Generally, the Reserves have trace element and organic contaminant concentrations at or below the median concentration determined across all NS&T Mussel Watch monitoring data. Trends were derived using the Spearman rank correlation coefficient; trends could be assessed for sites at which six or more years of data are available. Generally, no trends were found for trace elements, but when trends were found they were usually decreasing. The same general conclusion holds for organic contaminants, although more decreasing trends were found than for trace elements. The greatest number of decreasing trends was found for tributyltin and its metabolites. (PDF contains 203 pages)
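The trend test named in the last abstract, a Spearman rank correlation of contaminant concentration against sampling year, can be sketched in a few lines. The concentrations below are invented for illustration (they are not NS&T data), and the no-ties ranking is a simplifying assumption.

```python
def rank(values):
    """Ranks 1..n, assuming no tied values (sufficient for this sketch)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for pos, idx in enumerate(order, start=1):
        r[idx] = pos
    return r

def spearman_rho(x, y):
    """Spearman rank correlation via 1 - 6*sum(d^2) / (n*(n^2-1))."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical biennial tributyltin concentrations (ng/g dry weight) at one site.
years = [1989, 1991, 1993, 1995, 1997, 1999, 2001]
conc = [95.0, 80.0, 82.0, 60.0, 45.0, 40.0, 30.0]
print(round(spearman_rho(years, conc), 3))  # -0.964: strongly decreasing
```

A rho near -1 with a significant p-value (from tables or a permutation test) is what would classify this site as showing a decreasing trend; the six-or-more-years rule in the abstract reflects the minimum sample size at which such a test has any power.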