Find Research Output


SELECTED FILTERS: 2014, EI

1. Review on Non-Volatile Memory with High-k Dielectrics: Flash for Generation Beyond 32 nm

Author: Zhao, C; Zhao, CZ; Taylor, S; Chalker, PR

Source: MATERIALS, 2014, Vol. 7

Abstract: Flash memory is the most widely used non-volatile memory device today. In order to keep up with the demand for increased memory capacities, flash memory has been continuously scaled to smaller and smaller dimensions. The main benefits of down-scaling cell size and increasing integration are lower manufacturing cost and higher performance. Charge trapping memory is regarded as one of the most promising flash memory technologies as further down-scaling continues. In addition, increasing research effort has been devoted to implementing high-k dielectrics in charge trapping memory. This paper reviews the state of research on charge trapping memory with high-k dielectrics for performance improvement. The application of high-k dielectrics as the charge trapping layer, blocking layer, and tunneling layer is comprehensively discussed.

2. Light attenuation - a more effective basis for the management of fine suspended sediment than mass concentration?

Author: Davies-Colley, RJ; Ballantine, DJ; Elliott, SH; Swales, A; Hughes, AO; Gall, MP

Source: WATER SCIENCE AND TECHNOLOGY, 2014, Vol. 69

Abstract: Fine sediment continues to be a major diffuse pollution concern with its multiple effects on aquatic ecosystems. Mass concentrations (and loads) of fine sediment are usually measured and modelled, apparently with the assumption that environmental effects of sediment are predictable from mass concentrations. However, some severe impacts of fine sediment may not correlate well with mass concentration, notably those related to light attenuation by suspended particles. Light attenuation per unit mass concentration of suspended particulate matter in waters varies widely with particle size, shape and composition. Data for suspended sediment concentration, turbidity and visual clarity (which is inversely proportional to light beam attenuation) from 77 diverse New Zealand rivers provide valuable insights into the mutual relationships of these quantities. Our analysis of these relationships, both across multiple rivers and within individual rivers, supports the proposition that light attenuation by fine sediment is a more generally meaningful basis for environmental management than sediment mass. Furthermore, optical measurements are considerably more practical, being much cheaper (by about four-fold) to measure than mass concentrations, and amenable to continuous measurement. Mass concentration can be estimated with sufficient precision for many purposes from optical surrogates locally calibrated for particular rivers.
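
The abstract's closing point, that mass concentration can be estimated from a locally calibrated optical surrogate, can be sketched as follows. This is a hypothetical illustration with synthetic data, not the authors' method or dataset; the power-law form and its coefficients are assumptions.

```python
import numpy as np

# Hypothetical sketch: locally calibrating an optical surrogate (turbidity,
# NTU) against suspended sediment concentration (SSC, mg/L) with a
# power-law fit, as is common practice for individual rivers.
rng = np.random.default_rng(0)
turbidity = rng.uniform(1, 200, 100)                # synthetic turbidity readings
ssc_true = 1.8 * turbidity ** 0.95                  # assumed site-specific relation
ssc_obs = ssc_true * rng.lognormal(0.0, 0.05, 100)  # multiplicative measurement noise

# Fit log(SSC) = log(a) + b * log(turbidity) for this "river"
b, log_a = np.polyfit(np.log(turbidity), np.log(ssc_obs), 1)

def ssc_from_turbidity(ntu: float) -> float:
    """Estimate mass concentration from the locally calibrated surrogate."""
    return float(np.exp(log_a) * ntu ** b)
```

A continuous turbidity sensor plus a calibration of this kind is the "cheaper, continuous" pathway to mass estimates that the abstract describes.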

3. Emerging research on swarm intelligence and algorithm optimization

Author: Shi, Yuhui

Source: Emerging Research on Swarm Intelligence and Algorithm Optimization, 2014

Abstract: Throughout time, scientists have looked to nature in order to understand and model solutions for complex real-world problems. In particular, the study of self-organizing entities, such as social insect populations, presents a new opportunity within the field of artificial intelligence. Emerging Research on Swarm Intelligence and Algorithm Optimization discusses current research analyzing how the collective behavior of decentralized systems in the natural world can be applied to intelligent system design. Discussing the application of swarm principles, optimization techniques, and key algorithms being used in the field, this publication serves as an essential reference for academicians, upper-level students, IT developers, and IT theorists. © 2015 by IGI Global. All rights reserved.

4. One-class kernel subspace ensemble for medical image classification

Author: Zhang, YG; Zhang, BL; Coenen, F; Xiao, JM; Lu, WJ

Source: EURASIP JOURNAL ON ADVANCES IN SIGNAL PROCESSING, 2014, Vol. 2014

Abstract: Classification of medical images is an important issue in computer-assisted diagnosis. In this paper, a classification scheme based on a one-class kernel principal component analysis (KPCA) model ensemble is proposed for the classification of medical images. The ensemble consists of one-class KPCA models trained using different image features from each image class, and a proposed product combining rule is used to combine the KPCA models to produce classification confidence scores for assigning an image to each class. The effectiveness of the proposed classification scheme was verified using a breast cancer biopsy image dataset and a 3D optical coherence tomography (OCT) retinal image set. The combination of different image features exploits the complementary strengths of the different feature extractors. The proposed classification scheme obtained promising results on the two medical image sets. The proposed method was also evaluated on the UCI breast cancer dataset (diagnostic), and a competitive result was obtained.
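
The product combining rule mentioned in the abstract can be illustrated in a few lines. The confidence scores below are invented for the example; a real system would derive them from the reconstruction errors of one-class KPCA models, which are not reproduced here.

```python
import numpy as np

# Illustrative sketch only: combining per-feature one-class confidences
# into a class decision with a product rule, as the abstract describes.
def product_rule(scores):
    """scores: array of shape (n_feature_models, n_classes), each entry a
    confidence in [0, 1] from one one-class model. Returns the index of
    the class whose product of confidences is largest."""
    scores = np.asarray(scores, dtype=float)
    combined = scores.prod(axis=0)   # multiply across feature models
    return int(np.argmax(combined))

# Two feature extractors, three classes (made-up confidences)
scores = [[0.9, 0.2, 0.4],
          [0.8, 0.6, 0.1]]
```

The product rule rewards classes that every feature model agrees on: a single low confidence drags the whole product down.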

5. A novel classifier ensemble method with sparsity and diversity

Author: Yin, XC; Huang, KZ; Hao, HW; Iqbal, K; Wang, ZB

Source: NEUROCOMPUTING, 2014, Vol. 134

Abstract: We consider the classifier ensemble problem in this paper. Due to its superior performance to individual classifiers, the classifier ensemble has been intensively studied in the literature. Generally speaking, there are two prevalent research directions: to diversely generate classifier components, and to sparsely combine multiple classifiers. While most current approaches emphasize either sparsity or diversity only, we investigate the classifier ensemble by learning both sparsity and diversity simultaneously. We formulate the classifier ensemble problem with sparsity and/or diversity learning in a general framework. In particular, the classifier ensemble with sparsity and diversity can be represented as a mathematical optimization problem. We then propose a heuristic algorithm, capable of obtaining ensemble classifiers with consideration of both sparsity and diversity. We exploit the genetic algorithm, and optimize sparsity and diversity for classifier selection and combination heuristically and iteratively. As one major contribution, we introduce the concept of diversity contribution ability so as to select proper classifier components and evolve classifier weights eventually. Finally, we compare our proposed method with conventional classifier ensemble methods such as Bagging, least squares combination, sparsity learning, and AdaBoost, extensively on UCI benchmark data sets and the Pascal Large Scale Learning Challenge 2008 webspam data. The experimental results confirm that our approach leads to better performance in many aspects. (C) 2014 Elsevier B.V. All rights reserved.
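
As a rough sketch of the kind of objective a genetic algorithm could optimize here, the following combines ensemble accuracy, a sparsity penalty, and pairwise-disagreement diversity. This is not the paper's exact formulation; the weights `lam` and `gamma`, the disagreement measure, and the majority-vote rule are assumptions for illustration.

```python
import numpy as np

# Hedged sketch of a GA fitness balancing accuracy, sparsity and diversity.
def disagreement(preds):
    """preds: (n_classifiers, n_samples) label matrix. Mean pairwise
    disagreement rate; higher means a more diverse ensemble."""
    preds = np.asarray(preds)
    m = preds.shape[0]
    if m < 2:
        return 0.0
    total = 0.0
    for i in range(m):
        for j in range(i + 1, m):
            total += np.mean(preds[i] != preds[j])
    return 2.0 * total / (m * (m - 1))

def fitness(mask, preds, y, lam=0.1, gamma=0.1):
    """mask: binary selection vector over classifiers (the GA's genome).
    Rewards accuracy and diversity of the selected subset, penalizes size."""
    mask = np.asarray(mask)
    sel = preds[mask.astype(bool)]
    if len(sel) == 0:
        return -np.inf
    vote = (sel.mean(axis=0) > 0.5).astype(int)   # majority vote, binary labels
    acc = np.mean(vote == y)
    return acc - lam * mask.mean() + gamma * disagreement(sel)
```

A GA would then mutate and recombine `mask` vectors, keeping those with the highest fitness, which is the selection-and-combination loop the abstract outlines.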

6. Super transverse diffusion of minority carriers in GaxIn1-xP/GaAs double-junction tandem solar cells

Author: Deng, Z.; Wang, R.X.; Ning, J.Q.; Zheng, C.C.; Xu, S.J.; Xing, Z.; Lu, S.L.; Dong, J.R.; Zhang, B.S.; Yang, H.

Source: Solar Energy, 2014, Vol. 110

Abstract: In this work, remarkable transverse diffusion of minority carriers in the GaxIn1-xP top subcell of a GaxIn1-xP/GaAs double-junction tandem solar cell is revealed by electroluminescence (EL) imaging. As the forward bias is increased, the overall EL intensity rapidly increases, but the topographical distribution of lateral intensity becomes more uneven. By analyzing the relation between the measured EL emission intensity and the diffusion parameters of electrically injected minority carriers, the transverse diffusion length of the minority carriers is determined to be ~93 μm at a forward bias of 2.75 V, which is 30 times larger than that of an unbiased GaxIn1-xP single layer. The possible influence of such super diffusion of charge carriers on the conversion efficiency of tandem solar cells is discussed. © 2014 Elsevier Ltd.

7. Determine the Permittivity of the Plastic Materials

Author: Lim, EG; Wang, Z; Leach, MP; Gray, D; Man, KL; Zhang, N

Source: 2014 INTERNATIONAL SYMPOSIUM ON COMPUTER, CONSUMER AND CONTROL (IS3C 2014), 2014

Abstract: Microwave dielectric measurements are difficult to make; each method involves a compromise between accuracy, experimental simplicity, and complexity of the analysis. It is with these ideas in mind that we present a technique in which a waveguide is completely filled with the dielectric of interest to determine the dielectric constant of plastic materials (e.g. Rexolite, Lacqrene and PTFE). Good agreement between measured values and manufacturer material specifications has been obtained.

8. Methodological approaches for studying the microbial ecology of drinking water distribution systems

Author: Douterelo, I; Boxall, JB; Deines, P; Sekar, R; Fish, KE; Biggs, CA

Source: WATER RESEARCH, 2014, Vol. 65

Abstract: The study of the microbial ecology of drinking water distribution systems (DWDS) has traditionally been based on culturing organisms from bulk water samples. The development and application of molecular methods has supplied new tools for examining the microbial diversity and activity of environmental samples, yielding new insights into the microbial community and its diversity within these engineered ecosystems. In this review, the currently available methods and emerging approaches for characterising microbial communities, including both planktonic and biofilm ways of life, are critically evaluated. The study of biofilms is considered particularly important as it plays a critical role in the processes and interactions occurring at the pipe wall and bulk water interface. The advantages, limitations and usefulness of methods that can be used to detect and assess microbial abundance, community composition and function are discussed in a DWDS context. This review will assist hydraulic engineers and microbial ecologists in choosing the most appropriate tools to assess drinking water microbiology and related aspects. (C) 2014 The Authors. Published by Elsevier Ltd.

9. Empirical Analysis of Chirp and Multitones Performances with a UWB Software Defined Radar: Range, Distance and Doppler

Author: Le Kernec, J; Gray, D; Romain, O

Source: PROCEEDINGS OF 2014 3RD ASIA-PACIFIC CONFERENCE ON ANTENNAS AND PROPAGATION (APCAP 2014), 2014

Abstract: In this study, a protocol for an unbiased analysis of radar signal performance is presented. Using a novel UWB software-defined radar, range profile, Doppler profile and detection range are evaluated for both linear frequency modulated pulses and multitones. The radar was prototyped and is comparable in overall performance to software-defined radar test-beds found in the literature. The measured performance was in agreement with the simulations.

10. Population diversity of particle swarm optimizer solving single- and multi-objective problems

Author: Cheng, Shi; Shi, Yuhui; Qin, Quande

Source: Emerging Research on Swarm Intelligence and Algorithm Optimization, 2014

Abstract: Premature convergence occurs in swarm intelligence algorithms searching for optima. A swarm intelligence algorithm has two kinds of abilities: exploration of new possibilities and exploitation of old certainties. The exploration ability means that an algorithm can explore more search places to increase the possibility of finding good enough solutions. In contrast, the exploitation ability means that an algorithm focuses on the refinement of found promising areas. An algorithm should keep a balance between exploration and exploitation; that is, the allocation of computational resources should be optimized to ensure that an algorithm can find good enough solutions effectively. Diversity measures the distribution of individuals' information. From observation of the distribution and diversity change, the degree of exploration and exploitation can be obtained. Another issue in multiobjective optimization is the solution metric. Pareto domination is utilized to compare two solutions; however, solutions are almost all Pareto non-dominated for multiobjective problems with more than ten objectives. In this chapter, the authors analyze the population diversity of a particle swarm optimizer for solving both single objective and multiobjective problems. The population diversity of solutions is used to measure the goodness of a set of solutions. This metric may guide the search in problems with numerous objectives. Adaptive optimization algorithms can be designed through controlling the balance between exploration and exploitation. © 2015 by IGI Global. All rights reserved.
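
A minimal sketch of a population-diversity measure of the kind this chapter discusses, assuming the common definition of mean distance to the swarm centroid (the chapter may use a different formulation):

```python
import numpy as np

# Illustrative diversity measure: mean Euclidean distance of particles to
# the swarm centroid. Shrinking diversity over iterations signals a shift
# from exploration toward exploitation (and, near zero, possible
# premature convergence).
def population_diversity(positions):
    """positions: (n_particles, n_dims) array of particle positions."""
    center = positions.mean(axis=0)
    return float(np.mean(np.linalg.norm(positions - center, axis=1)))
```

An adaptive PSO variant could monitor this value each iteration and, for example, re-randomize part of the swarm when it falls below a threshold.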

11. A cognitonics approach to computer supported learning in the Mexican state of Oaxaca

Author: Craig, Paul; Roa-Seïler, Néna; Díaz, Marcela Martínez; Rosano, Felipe Lara

Source: Informatica (Slovenia), 2014, Vol. 38

Abstract: Cognitonics is a new science which looks at ways to reconcile human socio-spiritual development with increasingly rapid human intellectual development in the new context of technological advances and increased cultural homogeneity. This is particularly relevant in areas such as education and informatics, where children are found to be increasingly capable of controlling and adapting to new technological advances yet often suffer from a lack of social development or are unable to engage with aspects of their own cultural heritage. In this study we consider the application of a cognitonics-based approach to the problems of the Oaxacan education system, particularly for indigenous children, who suffer from a loss of culture and diminished provision of education due to a lack of resources and regular teacher strikes. Specifically, we look at how the introduction of face-to-face collaborative video games can help develop academic, information-technology and social skills together while promoting spiritual well-being and cultural identity.

12. Using learning analytics to analyze writing skills of students: A case study in a technological common core curriculum course

Author: Lei, Chi-Un; Man, Ka Lok; Ting, T.O.

Source: IAENG International Journal of Computer Science, 2014, Vol. 41

Abstract: Pedagogy with learning analytics has been shown to facilitate the teaching-learning process by analyzing students' behaviours. In this paper, we explored the possibility of using the learning analytics tools Coh-Metrix and Lightside for analyzing and improving the writing skills of students in a technological common core curriculum course. In this study, we i) investigated linguistic characteristics of students' essays, and ii) applied a machine learning algorithm to give instant sketch feedback to students. Results illustrated the necessity of improving students' writing skills in their university learning through e-learning technologies, so that students can effectively circulate their ideas to the public in the future.

13. Promotion-based input partitioning of neural network

Author: Guo, Shujuan; Guan, Sheng-Uei; Li, Weifan; Zhao, Linfan; Song, Jinghao; Cao, Mengying

Source: Lecture Notes in Electrical Engineering, 2014, Vol. 272 LNEE

Abstract: To improve the learning performance and precision of neural networks, this paper introduces an input-attribute partitioning algorithm that aims to increase the promotion among attributes. If better performance can be obtained by training some attributes together, there is considered to be a positive effect among these attributes. It is assumed that by grouping attributes among which there is a positive effect, a lower error can be obtained. After partitioning, multiple learners are employed to tackle each group, and the final result is obtained by integrating the results of the learners. The results show that this algorithm can reduce the classification error in supervised learning of neural networks. © Springer-Verlag Berlin Heidelberg 2014.

14. DERRT: Disastrous emergency response robot team for cooperative rescue

Author: Law, Nim Ying; Kwong, Yiu Choi; Lee, Jeffrey Jun Qi; Kwok, Kwun Hang; Lam, Zhao Lang; Yue, Yong; Man, Ka Lok; Lei, Chi-Un

Source: Lecture Notes in Engineering and Computer Science, 2014, Vol. 2210

Abstract: Even though different technologies have been invented to predict natural disasters, we cannot avoid people getting injured or dying. After every natural disaster, although many rescue teams from different parties are involved in relieving the victims, there are still many casualties. Therefore, we have proposed a cooperative rescue robot system, called Disastrous Emergency Response Robot Team (DERRT). DERRT is a team of cooperative robots integrated with a great variety of equipment for the sake of saving lives. Our team consists of four types of robots, named "Coordinator", "Crusher", "Saver" and "Lifter". They work collaboratively, depending on the situation they face. In this positioning paper, applied technologies, limitations and possible extensions of the system are fully discussed.

15. Business Buddy: Connecting you and your business partners

Author: Kwok, Kenneth Kin Pong; Tsui, Ho Wang; Yang, Yichen; Chan, Wai Lun; Man, Ka Lok; Lei, Chi-Un

Source: Lecture Notes in Engineering and Computer Science, 2014, Vol. 2210

Abstract: Keeping a record of schedule and administrative information and analyzing it can be demanding in both time and mental effort. With a view to overcoming these obstacles and even providing a better service, we propose a schedule and administration management tool, called Business Buddy. With the help of voice recognition, dictation, cloud systems and language analysis, the tool can keep a record of the information of business partners met in meetings. Features, technical implementation and future development are described in this positioning paper.

16. Uniform point sets and the collision test

Author: Goncu, A; Okten, G

Source: JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2014, Vol. 259

Abstract: Monte Carlo and quasi-Monte Carlo methods are popular numerical tools used in many applications. The quality of the pseudorandom sequence used in a Monte Carlo simulation is essential to the accuracy of its estimates. Likewise, the quality of the low-discrepancy sequence determines the accuracy of a quasi-Monte Carlo simulation. There is a vast literature on statistical tests that help us assess the quality of a pseudorandom sequence. However, for low-discrepancy sequences, assessing quality by estimating discrepancy is a very challenging problem, leaving us with no practical options in very high dimensions. In this paper, we will discuss how a certain interpretation of the well-known collision test for pseudorandom sequences can be used to obtain useful information about the quality of low-discrepancy sequences. Numerical examples will be used to illustrate the applications of the collision test. (C) 2013 Elsevier B.V. All rights reserved.
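
The collision test mentioned in the abstract is simple to state in code. The sketch below is an illustration of the basic idea rather than the paper's procedure: hash points into k equal cells, count collisions (points landing in an already occupied cell), and compare with the classical expectation n - k(1 - (1 - 1/k)^n) for truly uniform points.

```python
# Illustrative one-dimensional collision test; real tests typically bin
# multi-dimensional points, but the counting logic is the same.
def collision_count(points, k):
    """points: iterable of floats in [0, 1); k: number of equal cells."""
    occupied = set()
    collisions = 0
    for x in points:
        cell = min(int(x * k), k - 1)   # guard against x == 1.0
        if cell in occupied:
            collisions += 1
        else:
            occupied.add(cell)
    return collisions

def expected_collisions(n, k):
    """Expected collisions for n independent uniform points in k cells."""
    return n - k * (1 - (1 - 1 / k) ** n)
```

A sequence whose collision count is far below this expectation is "too uniform" (as well-constructed low-discrepancy sequences are), which is the kind of signal the paper exploits.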

17. Graphical lasso quadratic discriminant function and its application to character recognition

Author: Xu, B; Huang, KZ; King, I; Liu, CL; Sun, J; Satoshi, N

Source: NEUROCOMPUTING, 2014, Vol. 129

Abstract: The multivariate Gaussian distribution is a popular assumption in many pattern recognition tasks. The quadratic discriminant function (QDF) is an effective classification approach based on this assumption. An improved algorithm, called the modified QDF (MQDF), has achieved great success and is widely recognized as the state-of-the-art method in character recognition. However, because both approaches estimate the mean and covariance by maximum-likelihood estimation (MLE), they often suffer a loss of classification accuracy when the number of training samples is small. To attack this problem, in this paper we engage the graphical lasso method to estimate the covariance and propose a new classification method called the graphical lasso quadratic discriminant function (GLQDF). By exploiting a coordinate descent procedure for the lasso, GLQDF can estimate the covariance matrix (and its inverse) more precisely. Experimental results demonstrate that the proposed method can perform better than competitive methods on two artificial and nine real datasets (including both benchmark digit and Chinese character data). (C) 2013 Elsevier B.V. All rights reserved.
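
To make the discriminant rule concrete, here is a hedged sketch of a QDF with a regularized covariance estimate. Note the substitution: the paper's GLQDF uses the graphical lasso, while this illustration uses a simple shrinkage estimator in its place, and the toy data merely stand in for character features.

```python
import numpy as np

# Illustrative sketch only: QDF with a regularized covariance estimate.
def regularized_cov(X, alpha=0.1):
    """Shrink the MLE covariance toward a scaled identity (a stand-in for
    the sparse estimate the graphical lasso would produce)."""
    S = np.cov(X, rowvar=False)
    d = S.shape[0]
    return (1 - alpha) * S + alpha * (np.trace(S) / d) * np.eye(d)

def qdf_score(x, mean, cov):
    """Log-density-style discriminant: assign x to the class maximizing it."""
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (diff @ np.linalg.solve(cov, diff) + logdet)

# Toy two-class data standing in for character features
rng = np.random.default_rng(1)
X0 = rng.normal(0, 1, (60, 2))   # class 0 samples
X1 = rng.normal(4, 1, (60, 2))   # class 1 samples
```

Swapping `regularized_cov` for a graphical-lasso estimate (which also yields a sparse inverse covariance) is exactly the substitution the paper studies.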

18. Water quality trends in New Zealand rivers: 1989-2009

Author: Ballantine, DJ; Davies-Colley, RJ

Source: ENVIRONMENTAL MONITORING AND ASSESSMENT, 2014, Vol. 186

Abstract: Recent assessments of water quality in New Zealand have indicated declining trends, particularly in the 40% of the country's area under pasture. The most comprehensive long-term and consistent water quality dataset is the National Rivers Water Quality Network (NRWQN). Since 1989, monthly samples have been collected at 77 NRWQN sites on 35 major river systems that, together, drain about 50% of New Zealand's land area. Trend analysis of the NRWQN data shows increasing nutrient concentrations, particularly nitrogen (total nitrogen and nitrate), over 21 years (1989-2009). Total nitrogen and nitrate concentrations were increasing significantly over the first 11 years (1989-2000), but for the more recent 10-year period, only nitrate concentrations continued to increase sharply. Also, the increasing phosphorus trends over the first 11 years (1989-2000) levelled off over the later 10-year period (2000-2009). Conductivity has also increased over the 21 years (1989-2009). Visual clarity has increased over the full time period which may be the positive result of soil conservation measures and riparian fencing. NRWQN data shows that concentrations of nutrients increase, and visual clarity decreases (i.e. water quality declines), with increasing proportions of pastoral land in catchments. As such, the increasing nutrient trends may reflect increasing intensification of pastoral agriculture.

19. Large-scale global optimization via swarm intelligence

Author: Cheng, Shi; Ting, T.O.; Yang, Xin-She

Source: Springer Proceedings in Mathematics and Statistics, 2014, Vol. 97

Abstract: Large-scale global optimization (LSGO) is a challenging task with many scientific and engineering applications. Complexity, nonlinearity and size of the problems are the key factors that pose significant challenges in solving such problems. Though the main aim of optimization is to obtain the global optimal solutions with the least computational costs, it is impractical in most applications. Thus, a practical approach is to search for suboptimal solutions and good solutions, which may not be easily achievable for large-scale problems. In this chapter, the challenges posed by LSGO are addressed, followed by some potential strategies to overcome these difficulties. We also discuss some challenging topics for further research. © Springer International Publishing Switzerland 2014.

20. Learning Locality Preserving Graph from Data

Author: Zhang, YM; Huang, KZ; Hou, XW; Liu, CL

Source: IEEE TRANSACTIONS ON CYBERNETICS, 2014, Vol. 44

Abstract: Machine learning based on graph representation, or manifold learning, has attracted great interest in recent years. As the discrete approximation of a data manifold, the graph plays a crucial role in these kinds of learning approaches. In this paper, we propose a novel learning method for graph construction, which is distinct from previous methods in that it solves an optimization problem with the aim of directly preserving the local information of the original data set. We show that the proposed objective has close connections with the popular Laplacian eigenmap problem, and is hence well justified. The optimization turns out to be a quadratic programming problem with n(n-1)/2 variables (n is the number of data points). Exploiting the sparsity of the graph, we further propose a more efficient cutting plane algorithm to solve the problem, making the method more scalable in practice. In the context of clustering and semi-supervised learning, we demonstrate the advantages of our proposed method by experiments.
Total 98 results found
Copyright 2006-2020 © Xi'an Jiaotong-Liverpool University