Projects
Research & Publications
Authors:
Dong, N., Qin, M., Chang, J., Wu, C. H., Ip, W. H., & Yung, K. L.
Abstract:
Smart living is an emerging technology that has attracted worldwide attention. As a key technology of smart space, the principal component of smart living, SLAM systems have greatly extended the ability of intelligent robots to explore unknown environments. Loop closure detection is an essential part of a SLAM system and plays a critical role in eliminating accumulated error; without it, a SLAM system degrades to an odometer, and state estimation relying solely on odometry deviates seriously in long-term, large-scale navigation and positioning. This paper proposes a metric learning method for loop closure detection that trains deep neural networks with a triplet loss. The map points obtained by metric learning are fused with all map points in the current keyframe, and map points that do not satisfy the filtering conditions are eliminated. Building on the batch-hard triplet loss, the weighted triplet loss function avoids suboptimal convergence during learning by applying weighted value constraints. In addition, because fixed margin parameters cannot adapt well to the diversity of scales across samples, we redefine the margin using the semantic similarity between anchor and negative samples. Finally, a SLAM system based on metric learning is constructed, and the TUM and KITTI SLAM datasets are used to evaluate the proposed model's accuracy and recall. Scene features are extracted automatically by neural networks rather than handcrafted. The resulting high-precision loop closure detection method based on the weight-adaptive triplet loss achieves a minimum relative pose error of 0.00048 m, 15.8% lower than that of a bag-of-words-based loop closure detection algorithm.
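The core of the method is the margin-adaptive triplet loss. A minimal PyTorch sketch, assuming a cosine-similarity-based margin rule (the paper's exact weighting scheme is not reproduced here):

```python
import torch.nn.functional as F

def weight_adaptive_triplet_loss(anchor, positive, negative, base_margin=0.3):
    # anchor/positive/negative: (N, D) batches of embeddings.
    d_ap = F.pairwise_distance(anchor, positive)   # anchor-positive distance
    d_an = F.pairwise_distance(anchor, negative)   # anchor-negative distance
    # Semantically similar negatives are harder, so they receive a larger
    # margin; cosine similarity stands in for the paper's semantic measure.
    sim = F.cosine_similarity(anchor, negative).clamp(min=0.0)
    margin = base_margin * (1.0 + sim)
    return F.relu(d_ap - d_an + margin).mean()
```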
Authors:
Mo, D. Y., Tang, Y. M., Wu, E. Y., & Tang, V.
Abstract:
Electronic assessment (e-assessment) is an essential part of higher education, used not only to manage the learning performance of large classes but, in particular, to assess students' learning outcomes. The e-assessment data generated can be used not only to identify students' study weaknesses and develop strategies for teaching and learning, but also to develop essential pedagogies for online teaching and learning. Despite the wider adoption of Information and Communication Technology (ICT) due to the COVID-19 pandemic, universities still encountered numerous problems during the transition to electronic teaching, as most educators struggled with the effective implementation of an Electronic Assessment System (EAS). The successful launch of an EAS relies heavily on students' intention to use the new and unfamiliar electronic system, which is largely unknown to the EAS project managers. It is therefore important to understand students' views and concerns about an EAS and the proactive measures universities can take to enhance students' acceptance and intention of use. Although most studies investigate students' acceptance of online learning, there is still little research on the adoption of e-assessment. In this regard, we propose a theoretical model based on students' perceptions of an EAS. Building on the Technology Acceptance Model (TAM) and a major successor of TAM, an electronic assessment system acceptance model (EASA model) is developed with key measures such as system adoption anxiety, e-assessment facilitation and risk reduction. Data are obtained through a survey of current students at a local university, and structural equation modeling (SEM) is applied to analyze the quantitative data. This study has a significant impact on improving educators' use of e-assessment in order to develop essential online teaching and learning pedagogy in the future.
Authors:
Mo, D. Y., Wang, Y., Ho, D. C. K., & Leung, K. H.
Abstract:
Service parts management can generate high profits for companies that deliver superior service in the after-sales market. However, a major challenge in managing service parts operations is meeting high service-level expectations while reducing the excess inventories caused by fluctuating demand and a complex service parts logistics network. Expanding on conventional inventory management, which passively focuses on the forward and lateral flows of service parts deployment, we propose a crucial but overlooked practice, inventory redeployment, as an integral part of operations that allows proactive management of the lateral and reverse flows of service parts. We formulate the service parts inventory problem with an excess inventory redeployment strategy in a multi-echelon service network as a multi-period integer programming model. The optimisation model is evaluated in a case study of an international company's service parts operations and demonstrates higher cost-saving potential. Our novel, integrated approach confers the advantage of redeploying excess inventories in a closed-loop service parts logistics network, achieving cost savings that could not be realised under the conventional approach.
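A toy version of such a multi-period integer programme can be written with PuLP; this two-location, two-period sketch with illustrative data and costs is an assumption-laden stand-in for the paper's full multi-echelon model:

```python
import pulp

locations, periods = ["A", "B"], [1, 2]
holding_cost, transfer_cost = 2.0, 5.0
demand = {("A", 1): 3, ("A", 2): 1, ("B", 1): 0, ("B", 2): 4}
start_stock = {"A": 2, "B": 6}

m = pulp.LpProblem("redeployment", pulp.LpMinimize)
# move[i][j][t]: units redeployed from i to j at the start of period t
move = pulp.LpVariable.dicts("move", (locations, locations, periods),
                             lowBound=0, cat="Integer")
inv = pulp.LpVariable.dicts("inv", (locations, periods), lowBound=0, cat="Integer")

for i in locations:
    for t in periods:
        prev = start_stock[i] if t == 1 else inv[i][t - 1]
        inflow = pulp.lpSum(move[j][i][t] for j in locations if j != i)
        outflow = pulp.lpSum(move[i][j][t] for j in locations if j != i)
        # Inventory balance; inv >= 0 forces every period's demand to be met.
        m += inv[i][t] == prev + inflow - outflow - demand[(i, t)]

# Minimise holding cost plus the cost of lateral redeployment moves.
m += (pulp.lpSum(holding_cost * inv[i][t] for i in locations for t in periods)
      + pulp.lpSum(transfer_cost * move[i][j][t] for i in locations
                   for j in locations if j != i for t in periods))
m.solve(pulp.PULP_CBC_CMD(msg=False))
```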
Authors:
Tang, V., Lam, H. Y., Wu, C. H., & Ho, G. T. S.
Abstract:
With an increasingly ageing population, how caregivers can effectively provide long-term care services that meet older adults' needs with finite resources is an emerging question. To address this issue, nursing homes are striving to adopt smart health, with the Internet of Things and artificial intelligence, to improve the efficiency and sustainability of healthcare. This study proposes a two-echelon responsive health analytic model (EHAM) to deliver appropriate healthcare services in nursing homes under an Internet of Medical Things environment. A novel care plan revision index is developed using a dual fuzzy logic approach for multidimensional health assessments, followed by care plan modification using case-based reasoning. The findings reveal that EHAM can generate high-quality, patient-centred long-term care solutions that maximise the satisfaction of nursing home residents and their families. Ultimately, sustainable healthcare services can be delivered within the communities.
Authors:
Tang, Y. M., Ho, G. T. S., Lau, Y. Y., & Tsui, S. Y.
Abstract:
In the context of the global economic slowdown, demand forecasting and inventory and production management have long been important topics for industry. With the support of smart warehouses, big data analytics, and optimisation algorithms, enterprises can achieve economies of scale and balance supply and demand. Smart warehouse and manufacturing management is considered the culmination of recently advanced technologies, and it is important for enhancing the scalability and extendibility of the industry. Although many researchers have developed frameworks for smart warehouse and manufacturing management in various fields, most of these models focus mainly on product logistics and are not generalised to tackle the specific manufacturing problems faced by the cyclical industry. Indeed, the cyclical industry has a key problem: its high sensitivity to the business cycle and economic recessions poses a major risk that is difficult to foresee. Although many inventory optimisation approaches have been proposed to optimise warehouse inventory levels and facilitate production management, demand forecasting techniques seldom focus on the cyclical industry. Moreover, management approaches are usually based on complex logistics processes rather than integrating stock inventory levels, which is crucial to composing smart warehouses and manufacturing. This study proposes a digital twin framework that integrates the smart warehouse and manufacturing with a roulette genetic algorithm for demand forecasting in the cyclical industry. We also demonstrate how the algorithm is implemented in practice for demand forecasting, sustaining manufacturing optimisation, and achieving inventory optimisation. A case study of a small-scale textile company is adopted to demonstrate the proposed digital framework in the warehouse and to present the results of demand forecasting and inventory optimisation. Various scenarios were simulated for the digital twin. The proposed framework and results help manufacturers and logistics companies improve inventory management. This study has important theoretical and practical significance for the management of the cyclical industry.
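The roulette element refers to fitness-proportionate selection inside the genetic algorithm. A minimal sketch, assuming positive fitness values (the forecasting model itself is not shown):

```python
import random

def roulette_select(population, fitnesses, k):
    # Fitness-proportionate (roulette-wheel) selection: each individual is
    # sampled with probability fitness / total fitness. Assumes fitness > 0.
    total = sum(fitnesses)
    chosen = []
    for _ in range(k):
        spin, acc = random.uniform(0, total), 0.0
        for individual, fit in zip(population, fitnesses):
            acc += fit
            if acc >= spin:
                chosen.append(individual)
                break
    return chosen
```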
Authors:
Tsang, Y. P., Wu, C. H., Lin, K. Y., Tse, Y. K., Ho, G. T. S., & Lee, C. K. M.
Abstract:
New product development to enhance companies' competitiveness and reputation is one of the leading activities in manufacturing. At present, achieving successful product design has become more difficult, even for companies with extensive capabilities in the market, because of disorganisation in the fuzzy front end (FFE) of the innovation process. Tremendous amounts of information, such as data on customers, manufacturing capability, and market trends, are considered in the FFE phase to avoid common flaws in product design. Because of the high degree of uncertainty in the FFE, multidimensional and high-volume data are added from time to time at the beginning of the formal product development process. To address these concerns, deploying big data analytics to establish industrial intelligence is an active but still under-researched area. In this paper, an intelligent product design framework is proposed that incorporates fuzzy association rule mining (FARM) and a genetic algorithm (GA) into a recursive association-rule-based fuzzy inference system to bridge the gap between customer attributes and design parameters. Considering the current incidence of epidemics, such as the COVID-19 pandemic, communication of information in the FFE stage may be hindered. Through this study, a recursive learning scheme is therefore established to strengthen market performance, design performance, and sustainability in product design. It is found that industrial big data analytics in the FFE process achieves greater flexibility and a self-improvement mechanism in the evolution of product design.
Authors:
Dong, N., Zhai, M. D., Chang, J. F., & Wu, C. H.
Abstract:
As important immune cells in the human body, white blood cells play a significant role in the auxiliary diagnosis of many major diseases. Clinically, changes in the number and morphology of white blood cells and their subtypes are predictive indices for serious diseases such as anaemia, malaria, infections, and tumours. Applying image recognition technology and cloud computing to assist medical diagnosis is a hot topic in current research, which we believe has great potential to further improve real-time detection and medical diagnosis. This paper proposes a novel automatic classification framework for recognising five subtypes of white blood cells, in the hope of contributing to disease prediction. First, we present an adaptive threshold segmentation method, designed on the basis of colour space information and threshold segmentation, to deal with blood smear images with nonuniform colour and uneven illumination. After the white blood cell is separated from the blood smear image, a large number of geometrical, colour, and texture features are extracted. Because redundant features can degrade classification speed and efficiency, a feature selection algorithm based on classification and regression trees (CART) is designed to remove irrelevant and redundant features from the initial set. The selected prominent features are fed into a particle swarm optimisation support vector machine (PSO-SVM) classifier to recognise the white blood cell types. Finally, to evaluate the proposed methodology, we build a white blood cell dataset containing 500 blood smear images. The proposed methodology achieves 99.76% classification accuracy, which demonstrates its effectiveness.
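The CART-based feature selection and SVM classification stages can be sketched with scikit-learn; here a small grid search stands in for the paper's PSO hyperparameter tuning, and the number of kept features is an arbitrary assumption:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

def cart_select_then_svm(X, y, keep=20):
    # X: (samples x features) NumPy array of extracted cell features.
    # Rank features by CART (Gini) importance and keep the top `keep`.
    cart = DecisionTreeClassifier(random_state=0).fit(X, y)
    top = np.argsort(cart.feature_importances_)[::-1][:keep]
    # The paper tunes (C, gamma) with particle swarm optimisation;
    # a small grid search is used here instead for brevity.
    search = GridSearchCV(SVC(kernel="rbf"),
                          {"C": [1, 10, 100], "gamma": ["scale", 0.01, 0.1]},
                          cv=3)
    search.fit(X[:, top], y)
    return top, search.best_estimator_
```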
Authors:
Lam, H. Y., Tsang, Y. P., Wu, C. H., & Tang, V.
Abstract:
For companies to gain competitive advantage, an effective customer relationship management (CRM) approach is necessary. Based on purchase behaviour and ordering patterns, customers can be classified into different categories so that customised sales and promotions can be offered to each. However, companies that lack an effective CRM strategy can only offer the same sales and marketing strategies to all customers. Furthermore, the traditional approach to managing customers is control via a centralised method, in which information about customer segmentation is not shared across the customer network. Consequently, valuable customers may be neglected, resulting in the loss of customer loyalty and sales orders, and weakening trust in the customer–company relationship. This paper designs an integrated data analytic model (IDAM) in a peer-to-peer cloud, integrating an RFM-based k-means clustering algorithm, analytical hierarchy processing and fuzzy logic to divide customers into segments and hence formulate customised sales strategies. A pilot study of IDAM is conducted in a trading company specialising in advanced manufacturing technology to demonstrate how IDAM can be applied to formulate an effective sales strategy to attract customers. Overall, this study explores the effective deployment of CRM in the peer-to-peer cloud so as to facilitate sales strategy formulation and trust between customers and companies in the network.
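The RFM-based k-means stage of IDAM might look as follows; the column names and number of segments are assumptions for illustration:

```python
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def rfm_segments(orders, n_clusters=4, snapshot=None):
    # orders: DataFrame with assumed columns customer_id, order_date, amount.
    if snapshot is None:
        snapshot = orders["order_date"].max()
    rfm = orders.groupby("customer_id").agg(
        recency=("order_date", lambda d: (snapshot - d.max()).days),
        frequency=("order_date", "count"),
        monetary=("amount", "sum"),
    )
    # Standardise R, F and M before clustering so no feature dominates.
    X = StandardScaler().fit_transform(rfm)
    rfm["segment"] = KMeans(n_clusters=n_clusters, n_init=10,
                            random_state=0).fit_predict(X)
    return rfm
```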
Authors:
Long, W., Wu, C. H., Tsang, Y. P., & Chen, Q.
Abstract:
Pallet pooling is regarded as a sustainable and cost-effective measure for the industry, but it is challenging to advocate owing to weak data and pallet authentication. To establish trust between end-users and pallet pooling services, we propose an end-to-end, bidirectional authentication system for transmitted data and pallets based on blockchain and Internet-of-Things (IoT) technologies. In addition, secure data authentication fosters pallet authenticity across the whole supply chain network, achieved by considering tag, location, and object-specific features. To evaluate the object-specific features, the scale-invariant feature transform (SIFT) approach is adopted to match keypoints and descriptors between two pallet images. The case study shows that the proposed system provides a low bandwidth blocking rate and a high probability of restoring complete data payloads. Consequently, deployment of the proposed system has positive influences on end-user satisfaction, quality of service, operational errors, and pallet traceability.
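The object-feature check can be reproduced with OpenCV's SIFT implementation; this sketch matches keypoints between two grayscale pallet images using Lowe's ratio test (the ratio threshold is a conventional assumption, not a value from the paper):

```python
import cv2

def pallet_keypoint_match(img_a, img_b, ratio=0.75):
    # Detect SIFT keypoints and descriptors in both pallet images.
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(img_a, None)
    kp_b, des_b = sift.detectAndCompute(img_b, None)
    # Brute-force matching with Lowe's ratio test keeps reliable matches.
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_a, des_b, k=2)
    good = [m for m, n in matches if m.distance < ratio * n.distance]
    return len(good), min(len(kp_a), len(kp_b))  # matches vs. keypoints found
```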
Authors:
Tsang, Y. P., Wu, C. H., Ip, W. H., & Shiau, W. L.
Abstract:
Purpose
Due to the rapid growth of blockchain technology in recent years, the fusion of blockchain and the Internet of Things (BIoT) has drawn considerable attention from researchers and industrial practitioners and is regarded as a future trend in technological development. Although several authors have conducted literature reviews on the topic, none have examined the development of the knowledge structure of BIoT, resulting in scattered research and development (R&D) efforts.
Design/methodology/approach
This study investigates the intellectual core of BIoT through a co-citation proximity analysis–based systematic review (CPASR) of the correlations between 44 highly influential articles out of 473 relevant research studies. Subsequently, we apply a series of statistical analyses, including exploratory factor analysis (EFA), hierarchical cluster analysis (HCA), k-means clustering (KMC) and multidimensional scaling (MDS) to establish the intellectual core.
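A minimal sketch of this statistical chain (HCA, KMC and MDS applied to a symmetric co-citation proximity matrix; the EFA step is omitted), with the similarity-to-distance conversion an assumption:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform
from sklearn.cluster import KMeans
from sklearn.manifold import MDS

def intellectual_core(cocitation, n_groups=9):
    # Convert the symmetric co-citation proximity matrix to dissimilarities.
    dist = 1.0 - cocitation / cocitation.max()
    np.fill_diagonal(dist, 0.0)
    # Hierarchical cluster analysis on the condensed distance vector.
    hca = fcluster(linkage(squareform(dist, checks=False), method="average"),
                   n_groups, criterion="maxclust")
    # MDS embeds the articles in 2-D; k-means then groups them there.
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dist)
    kmc = KMeans(n_clusters=n_groups, n_init=10,
                 random_state=0).fit_predict(coords)
    return hca, kmc, coords
```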
Findings
Our findings indicate that there are nine categories in the intellectual core of BIoT: (1) data privacy and security for BIoT systems, (2) models and applications of BIoT, (3) system security theories for BIoT, (4) frameworks for BIoT deployment, (5) the fusion of BIoT with emerging methods and technologies, (6) applied security strategies for using blockchain with the IoT, (7) the design and development of industrial BIoT, (8) establishing trust through BIoT and (9) the BIoT ecosystem.
Originality/value
We use the CPASR method to examine the intellectual core of BIoT, which is an under-researched and topical area. The paper also provides a structural framework for investigating BIoT research that may be applicable to other knowledge domains.
Authors:
Wang, T., Zuo, H., Wu, C. H., & Hu, B.
Abstract:
Estimating the gap in new export competitive advantages between China and the world's trading powers has been a key measurement problem in China-related studies. In this work, a comprehensive evaluation index system for new export competitive advantages is developed, a soft-sensing model for China's new export competitive advantages based on the fuzzy entropy weight analytic hierarchy process is established, and the soft-sensing values of key indexes are derived. The evaluation values of the main measurement indexes are then used as input variables of a fuzzy least squares support vector machine, yielding a combined soft-sensing model of the key index parameters of China's new export competitive advantages. The soft-sensing results show that the developed model is of high precision compared with other models; the technical and brand competitiveness indicators of export products contribute most significantly to China's new export competitive advantages, while the service competitiveness indicator of export products contributes least.
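The entropy-weight component of the index system can be illustrated briefly; a sketch assuming a positive samples-by-indicators matrix (the fuzzy AHP and LS-SVM stages are not shown):

```python
import numpy as np

def entropy_weights(X):
    # X: positive (samples x indicators) matrix of indicator values.
    P = X / X.sum(axis=0)                     # column-wise proportions
    P = np.where(P == 0, 1e-12, P)            # guard against log(0)
    e = -(P * np.log(P)).sum(axis=0) / np.log(len(X))  # entropy per indicator
    d = 1.0 - e                               # degree of diversification
    return d / d.sum()                        # normalised indicator weights
```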
Authors:
Tsang, Y. P., Choy, K. L., Wu, C. H., & Ho, G. T. S.
Abstract:
Effective deployment of emerging environmental sensor networks for environmental mapping has become essential in numerous industrial applications. The essential deployment factors include cost, coverage, connectivity, heating, ventilation, and air conditioning airflow, system lifetime, and fault tolerance. In this letter, a three-stage deployment scheme is proposed to formulate the above considerations, and a fuzzy temperature window is established to adjust sensor activation times over various ambient temperatures. To optimize the deployment effectively, a multi-response Taguchi-guided k-means clustering method is embedded in the genetic algorithm, formulating an improved initial population and optimizing the system parameters. As a result, the computational time for repeated deployment is shortened, while solution convergence is improved.
Authors:
Ho, G. T. S., Tsang, Y. P., Wu, C. H., Wong, W. H., & Choy, K. L.
Abstract:
In digital and green city initiatives, smart mobility is a key aspect of developing smart cities and is important for built-up areas worldwide. Double-parking and busy roadside activities, such as frequent loading and unloading of trucks, have a negative impact on traffic, especially in cities with high transportation density. Hence, a real-time Internet of Things (IoT)-based system for surveillance of roadside loading and unloading bays is needed. In this paper, a fully integrated solution is developed by equipping high-definition smart cameras with wireless communication for traffic surveillance; the system is referred to as a computer vision-based roadside occupation surveillance system (CVROSS). Through a vision-based network, real-time roadside traffic images, such as images of loading or unloading activities, are captured automatically. Using the collected data, decision support on roadside occupancy and vacancy can be derived by means of fuzzy logic and visualised for users, enhancing the transparency of roadside activities. The CVROSS was designed and tested in Hong Kong to validate its parking-gap estimation accuracy and system performance, with the aim of facilitating traffic and fleet management for smart mobility.
Authors:
Mo, D. Y., Ng, S. C. H., & Tai, D.
Abstract:
This study demonstrates how NetApp, a data storage system provider, used Six Sigma to solve the service parts inventory problem in its multiechelon logistics network, which its inventory management system was unable to fix. The nonstationary demand for service parts created a blind spot for the system, thus hampering NetApp’s contractual commitment to customers of an almost 100% fill rate (FR) for replacing service parts. Constant customer complaints because of FRs that were less than 100% caused NetApp to improve the performance of its service parts replenishment and order fulfillment processes. By following the Six Sigma approach and using the associated qualitative and quantitative tools, the company worked systemically to identify the major causes of insufficient stock and systematically corrected the problem. NetApp formulated a cost-effective inventory solution for its inventory planning system, which resulted in a 10% decrease in the ratio of inventory to revenue and an FR increase from 99.1% to 99.6%. The standard deviation of the replenishment lead time also declined from 4.97 to 1.87 days, implying that the variation of the replenishment lead time was greatly reduced. The Six Sigma process, therefore, provided new insights and a new approach to enable NetApp to manage its inventory planning process.
Authors:
Lo, W. H., Lam, B. S. Y., & Cheung, M. F.
Abstract:
This article examines the news framing of the 2017 Hong Kong Chief Executive election using a big data analysis approach. Intermedia framing analyses of over 370,000 articles and comments are conducted, covering news published in over 30 Chinese press media and four prominent Chinese online press media, and posts published on the three candidates' Facebook pages within the election period. The study contributes to the literature by examining the rarely discussed role of intermedia news framing, especially the relationships between legacy print media, online alternative news media, and audience comments on candidates' social network sites. The data analysis provides evidence that audiences' comments on candidates' Facebook pages influenced legacy news coverage and online alternative news coverage. However, this study suggests that legacy news media and comments on Facebook do not necessarily have a reciprocal relationship. The implications and limitations of the findings are discussed.
Authors:
Lam, B. S. Y., & Choy, S. K.
Abstract:
Different versions of principal component analysis (PCA) have been widely used to extract important information for image recognition and image clustering problems. However, owing to the presence of outliers, this remains challenging. This paper proposes a new PCA methodology based on a novel discovery that the widely used l1-PCA is equivalent to a two-group k-means clustering model. The projection vector of the l1-PCA is the vector difference between the two cluster centres estimated by the clustering model. In theory, this vector difference provides inter-cluster information, which is beneficial for distinguishing data objects from different classes. However, the performance of l1-PCA is not comparable with state-of-the-art methods, because l1-PCA is sensitive to outliers: the equivalent clustering model is not robust to them. To overcome this limitation, we introduce a trimming function into the clustering model and propose a trimmed-clustering based l1-PCA (TC-PCA). With this trimming set formulation, TC-PCA is not sensitive to outliers. Moreover, we mathematically prove the convergence of the proposed algorithm. Experimental results on image classification and clustering indicate that the proposed method outperforms current state-of-the-art methods.
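The reported equivalence suggests a simple way to approximate the first l1-PCA projection vector; a toy sketch, without the trimming step that makes TC-PCA robust:

```python
import numpy as np
from sklearn.cluster import KMeans

def l1_pca_direction(X):
    # Two-group k-means; the difference of the two cluster centres gives the
    # l1-PCA projection vector per the paper's equivalence (up to scaling).
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    v = km.cluster_centers_[0] - km.cluster_centers_[1]
    return v / np.linalg.norm(v)
```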
Authors:
Xu, L., Tang, M. L., & Chen, Z.
Abstract:
In longitudinal data analysis, it is crucial to understand the dynamics of the covariance matrix of repeated measurements and to model it correctly in order to achieve efficient estimators of the mean regression parameters. It is well known that an incorrectly specified covariance matrix can result in inefficient estimators of the mean regression parameters. In this article, we propose an empirical likelihood based method that combines the advantages of different dynamic covariance modeling approaches. The effectiveness of the proposed approach is demonstrated on an anesthesiology dataset and in simulation studies.
Ongoing Projects
Principal Investigator:
Dr. HO To Sum, George
Abstract:
The next era of information and communication technology, namely digital transformation, has attracted considerable attention from industrial practitioners seeking to make unprecedented changes to their businesses. In Hong Kong, over 340,000 small and medium enterprises (SMEs) contribute more than 98% of the total business units and 45% of total employment. However, they have always faced challenges such as resource shortages, lack of talent and poor performance measurement. In business process management, most tasks in SMEs are still labor-intensive, without effective technological support or automation.
To improve this situation, the project proposes the Smart Robotic Workforce System, a digital twin-based solution for business process re-engineering. Business processes in the physical world are mapped into the digital world using Internet of Things and physical internet technologies. To facilitate the deployment of robotic process automation (RPA) in SMEs, artificial intelligence techniques are utilized as an intelligent agent that learns from case scenarios and recommends appropriate RPA formulations. In addition, data mining and multi-criteria decision-making approaches are integrated to measure the performance and resource allocation of process robots in SMEs. By automating business processes, a driving force for re-industrialization and a growth engine for a future economy can be achieved.
Principal Investigator:
Dr. WU Chun-ho
Abstract:
Students can improve their English proficiency through different learning approaches, such as the online English materials and consultations provided at the University. However, most of these approaches lack a systematic way to monitor and assess students' training and improvement paths. Teachers may rely only on assignments and examination results to evaluate student performance in a particular subject. In such situations, it is difficult for teachers to identify an individual student's problems in speaking English and hence to give suggestions that prevent repeated mistakes. Students may then continue to use "Engrish" when learning other subject domain knowledge, resulting in poor subject performance. To address these problems, this project aims to develop an interactive, artificial intelligence-assisted chatbot, in the form of a mobile app, for our students to learn and self-improve their English proficiency effectively. Through assessment reports generated by the mobile app, teachers can effectively monitor students' training and improvement progress in learning English.
Principal Investigator:
Dr. HO To Sum, George
Abstract:
Due to the outbreak of COVID-19, customer behaviour around the world looks completely different today than it did even one year ago. For example, retail sales via e-commerce channels in both the United States and the European Union recorded rapid growth (15% and 30% respectively) in 2020, while the gross value of retail sales declined (OECD, 2020). The same trend could be observed in Hong Kong: total retail sales recorded 11 consecutive months of decline in 2020 (Census and Statistics Department, HKSAR, 2020), while the value of individual customer purchases via e-commerce platforms doubled (Hong Kong Television Network Limited, 2020b). These changes in customer behaviour and the burgeoning of e-commerce purchasing indicate shrinking sales at physical stores and the emergence of the 'next normal': B2C e-commerce business. Amidst these changes, the value chain of the retail industry may be reconfigured, and Logistics Service Providers (LSPs) are urged to transform their routine operations (i.e., orders placed by wholesalers or retailers) into a sound e-fulfilment process (i.e., orders placed by individual customers via e-commerce) with effective strategies.
Recent research on the e-commerce industry has focused on improving operational effectiveness and efficiency, warehouse layout optimization, and last-mile delivery (Ranieri et al., 2018; Farooq et al., 2019). Given the need for next-day or even same-day delivery in the e-fulfilment process, ensuring fast and efficient retrieval of Stock Keeping Units (SKUs) from shelves has become crucial for today's LSPs. To meet the trends of the 'next' e-fulfilment 'normal', LSPs need additional capabilities for handling discrete and fluctuating e-order demands. However, most LSPs in Hong Kong, especially small and medium-sized (SME) LSPs, use rented warehouses to provide their services. They cannot afford the large investments entailed in adopting an automated storage and retrieval system and a sophisticated order picking system in rented warehouses, which limits their competencies and capabilities in handling e-orders. Therefore, research and development on effective e-fulfilment decision strategies regarding inventory replenishment and operational optimization is needed to enhance and streamline e-fulfilment operations.
This project aims to design and develop a Federated Learning-based e-fulfilment decision model to overcome the new challenges presented to the logistics industry by today's B2C e-commerce business in the wake of the COVID-19 pandemic. The system integrates collaborative machine learning and operational decision modelling to facilitate the transformation from traditional warehouses to e-fulfilment centres. From the perspective of LSPs, the proposed model allows them to generate the optimal pick face replenishment strategy and fully utilize resources to handle fluctuating e-order demands without needing to re-construct the whole premises and infrastructure. Considering the limited datasets held by SME LSPs, this project also contributes to establishing an industry-wide solution for estimating the quantity per SKU to be held in the pick face area. Through streamlined put-away and order picking in e-fulfilment operations, customer e-orders can be effectively fulfilled by logistics warehouses, enhancing the online shopping experience.
With the aid of the proposed decision model, the capabilities of the e-fulfilment process are enabled for LSPs, resulting in better competitiveness and service coverage when the ‘next normal’ emerges in the B2C e-commerce market.
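At the heart of the proposal is collaborative learning across LSPs that never share raw data. A minimal federated-averaging sketch, assuming each LSP returns its locally trained layer weights (as arrays) and local dataset size; the proposal's actual model and aggregation rule are not specified here:

```python
def fedavg(client_weights, client_sizes):
    # client_weights: one list of layer weight arrays per LSP, trained locally.
    # client_sizes: number of local training samples per LSP.
    total = sum(client_sizes)
    n_layers = len(client_weights[0])
    # Weighted average of each layer, proportional to local dataset size;
    # only model parameters travel, raw order data stays with each LSP.
    return [sum(w[k] * (n / total) for w, n in zip(client_weights, client_sizes))
            for k in range(n_layers)]
```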
Principal Investigator:
Dr. Mo Yiu-Wing
Abstract:
Managing the dual channel of logistics resources has become more critical than ever, not only to achieve cost savings via enhanced process efficiency in operations, but also to utilise idle resources within and outside operations for social sustainability. With the success of crowdsourcing logistics platforms in recent years, many companies have sought to outsource some of their logistics orders to crowd networks. However, when compared with internal logistics resources, resources in the crowd network involve higher uncertainty, which creates many challenges for companies in determining the allocation of logistics orders to the crowdsourced platform for the fulfilment of ad-hoc demand. There is a lack of a holistic approach for integrating internal logistics resources among various storage facilities with crowdsourced vehicles via decision intelligence systems.
In this research project, we aim to design an integrated decision framework for managing the dual channel of logistics resources through the adoption of decision intelligence systems. With the support of decision intelligence systems, including systems simulation, data-driven models, and geospatial data analytics, the integration of internal and crowd logistics resources is expected to lead logistics operations to the next stage of operations management. The main contributions of this study therefore centre on the management theory of dual-channel logistics resource management. Beyond the theory, we will collaborate with a company in this project, and the resulting case study will also serve as a guideline for practitioners.
Principal Investigator:
Dr. Ng Chi-Hung Stephen
Abstract:
E-commerce sales volumes grew rapidly after the outbreak of COVID-19. Because demand is unpredictable and most orders are small, this increases the challenges for small and medium-sized enterprises (SMEs) in handling e-orders in terms of order management, data analysis, demand forecasting, and inventory optimization. Digital transformation could be the strategy companies adopt to transform their traditional warehouses under the e-commerce new normal. However, according to a survey, about half of SMEs in Hong Kong did not understand how to adopt digital technology and hesitated to pursue digital transformation, believing it could be complex and expensive. This research therefore proposes a new AI-based intelligent model that integrates digital technologies with artificial intelligence-based predictive analytics to help companies achieve performance enhancement. With the proposed model, many routine processes could be performed automatically, freeing staff from repetitive tasks to focus on more innovative, value-added, and service-related jobs. Error-free operation and efficiency enhancement could also be achieved in traditional warehouses, ultimately facilitating digital transformation.
Principal Investigator:
Dr. Wu Chun-Ho Jack
Abstract:
Digital technologies not only automate financial processes but also restructure communication channels, develop new business models, and identify new markets. This research aims to investigate the affordances and actualisations of recommendation systems based on blockchain technology for insurance and financial products. To analyse blockchain-based recommendation systems, the research objectives are: to identify the critical moment at which health-conscious people purchase an insurance and/or financial product; to characterise the insurance and/or financial products to be recommended; and to extract association rules to enhance the recommendation process.
Principal Investigator:
Dr. Choy Siu-Kai
Abstract:
Image segmentation is a critical problem in computer vision for a wide variety of applications. Among the existing approaches, partial differential equations and variational methods have been extensively studied in the literature. Although most variational approaches use boundary and region information to segment natural and textural images with remarkable success, we note that most of the existing methods only consider simple information/features extracted from a particular image domain (e.g., grey level features in the spatial domain) to characterise image regions. However, such information/features are not informative enough to segment complex images. In the proposed project, we will investigate a robust and effective variational segmentation algorithm to remedy the aforementioned difficulties for a wide range of applications. In particular, we will study a mathematical optimisation framework that integrates the bit-plane-dependence probability models, which are used to characterise local region information extracted from various image domains, with the fuzzy region competition for image segmentation. We will also study the mathematical theory for the segmentation algorithm. The proposed segmentation method will be assessed by extensive and comparative experiments using complex natural and textural images.
Principal Investigator:
Dr. Choy Siu-Kai
Abstract:
Image segmentation is a challenging problem in computer vision and has a wide variety of applications in fields such as pattern recognition and medical imaging. One of the main approaches to this problem is to perform superpixel segmentation followed by a graph-based methodology to achieve image segmentation. Crucial to successful image segmentation with this method are the superpixel generation algorithm and the superpixel partitioning algorithm. Existing superpixel generation algorithms have various priorities, placing emphasis on boundary adherence, superpixel regularity, computational complexity, etc., but normally do not perform well on all of these simultaneously. Superpixel partitioning algorithms are typically graph-based and can have high computational costs, which makes them inefficient in practical contexts. In the proposed project, we will investigate a fast and effective unsupervised fuzzy superpixel-based image segmentation algorithm to remedy the aforementioned difficulties for a wide range of applications. In particular, we will study the combined use of a novel fuzzy clustering-based superpixel generation technique and a fuzzy graph-theoretic superpixel partitioning approach for image segmentation applications. The proposed segmentation method will be assessed by extensive comparative experiments using complex natural and textural images.
Principal Investigator:
Dr. Lam Shu-Yan
Abstract:
Supervised learning problems infer functions from labelled training data. Learning lower-dimensional subspaces in supervised learning problems is important in applications such as human action recognition, face recognition and object recognition. Dimensionality reduction removes noise from the data and simplifies data analysis. Linear Discriminant Analysis (LDA) and its variants have been shown to be suitable for handling data structures in linear, quadratic and highly non-linear forms. However, conventional LDA formulations suffer from two major limitations. First, they use arithmetic means to represent the class centroids of the input data, yet the arithmetic mean has been shown not to represent such data effectively, especially data containing heavy noise and outliers. Second, it is difficult to show statistically that the learnt projection vectors are effective in the presence of heavy noise and outliers. Hence, conventional LDA fails to determine the most representative features of the input data.
In the proposed project, we aim to develop a new class of dimensionality reduction techniques for labelled data that can overcome the major limitations of conventional LDA techniques. The core idea is to formulate the dimensionality reduction problem as a set of clustering problems. The novelty of the proposed approach is that unsupervised clustering problems can effectively learn the subspace of the supervised learning problem. Locating effective centroids has been well-studied in clustering research. Furthermore, well-developed theories can be used to analyse the sensitivities of these methods in the presence of heavy noise and outliers. If successful, the proposed study will significantly increase the performance of dimensionality reduction for labelled data using clustering, which will fundamentally improve the way in which useful information can be extracted in many real-world applications.
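One way to picture the proposed reformulation: replace the arithmetic class means with robust centroids and take their difference as the discriminant direction. A two-class toy sketch using medians, an illustration of the idea rather than the project's actual method:

```python
import numpy as np

def robust_two_class_direction(X, y):
    # Per-class medians act as robust centroids, unlike the arithmetic
    # means of conventional LDA, which outliers can drag away.
    c0 = np.median(X[y == 0], axis=0)
    c1 = np.median(X[y == 1], axis=0)
    v = c1 - c0                       # between-centroid direction
    return v / np.linalg.norm(v)      # unit projection vector
```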
Principal Investigator:
Prof. Tang Man-Lai
Abstract:
One of the most important challenges in modern survey measurement is the elicitation of truthful answers to sensitive questions about behavior and attitudes (e.g., abortion, illegal drug use and racial prejudice). It has long been known that collecting information on a sensitive characteristic in a population usually induces two notorious issues, namely non-response bias (i.e., respondents refuse to collaborate for fear of losing confidentiality) and response bias (i.e., respondents answer the sensitive questions but give false answers), which typically cause loss of estimation efficiency, inflated sampling variance, and biased estimates. Techniques that guarantee anonymity, minimize respondents' feelings of jeopardy, and encourage honest answers are therefore in great demand. In this project, we propose several practical generalizations of the famous item count techniques for sensitive survey questions.
The Poisson ICT has recently been developed to overcome the shortcomings of the conventional item count techniques (ICTs) by replacing the list of independent innocuous (binary) statements with a single innocuous (Poisson) statement. Despite its various attractive advantages, the Poisson ICT still has some limitations. First, it assumes that respondents comply with the survey design. Second, it assumes that the outcome of the innocuous statement follows the less practical Poisson distribution. Third, no regression model has been developed for binary sensitive outcomes.
In this proposal, we plan to
(i) (New Poisson ICT with Non-Compliance) Develop a new Poisson ICT that takes the non-compliance from the respondents into consideration;
(ii) (New Zero-Inflated Poisson ICT) Develop a new Poisson-type ICT that allows the outcome of the innocuous statement to follow the more realistic zero-inflated Poisson distribution (see the probability mass function after this list); and
(iii) (Regression Modeling with Sensitive Outcome) Develop a regression model for binary sensitive outcomes.
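For reference, the zero-inflated Poisson distribution assumed in objective (ii) has the following probability mass function (a standard definition; the symbols denote the zero-inflation probability and the Poisson rate):

```latex
% Zero-inflated Poisson pmf, with inflation probability \pi and rate \lambda:
P(Y = y) =
  \begin{cases}
    \pi + (1 - \pi)\, e^{-\lambda}, & y = 0, \\[4pt]
    (1 - \pi)\, \dfrac{\lambda^{y} e^{-\lambda}}{y!}, & y = 1, 2, \dots
  \end{cases}
```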
Principal Investigator:
Prof. Tang Man-Lai
Abstract:
High dimensional data analysis has become increasingly frequent and important in diverse fields, for example genomics, health sciences, economics and machine learning, and model selection plays a pivotal role in contemporary scientific discoveries. There is a large body of work on model selection for complete data. However, complete data are often not available for every subject for many reasons, including unavailable covariate measurements and loss of data, and the literature on model selection for high dimensional data in the presence of missing or incomplete values is relatively sparse. Efficient methods and algorithms for model selection with incomplete data are therefore of great research interest and practical demand.
For model selection, information criteria (e.g., the Akaike information criterion and the Bayesian information criterion) are commonly applied and can easily be incorporated into the famous EM algorithm in the presence of missing values. A generalized EM algorithm has also been developed that updates the model and its parameters in each iteration: it alternates between an expectation step and a model selection step, converges globally, and yields a consistent model. However, the model selection step may not always be numerically feasible, especially for high dimensional data. A new method for model selection with high dimensional incomplete data is therefore greatly desirable. The algorithm proposed in this project will hopefully yield a consistent model under general missing data patterns and converge numerically. Moreover, the proposed method is expected to perform variable selection efficiently in linear regression and generalized linear models, as well as model selection for graphical models.
Due to the convenience of its implementation using standard software modules, multiple imputation is arguably the most widely used approach for handling missing data. It is straightforward to apply an existing model selection method to each imputed dataset; however, it is challenging to combine model selection results across imputed datasets in a principled framework. To overcome this challenge, many advanced techniques have been developed for the variable selection problem, such as applying the group lasso penalty to the merged dataset of all imputations, stability selection within bootstrap imputation, and random lasso combined with multiple imputation. These techniques are feasible for high-dimensional data with complex missing patterns and have achieved good performance in simulation studies and real data analyses. However, as far as we know, there is surprisingly no imputation method for graphical models, and an imputation-based method for graphical model selection is greatly desirable. In this project, we investigate bootstrap multiple imputation with stability selection and expect the proposed method to handle general missing data patterns.
Principal Investigator:
Dr. Liu Hai
Abstract:
Robotics technologies are advancing rapidly. Groups of robots have been developed that can communicate with each other using wireless transmissions and form robot swarms. Applications of these swarms include surveillance, search and rescue, mining, agricultural foraging, autonomous military units and distributed sensing in micromachinery or human bodies. For example, swarm robots can be sent into places that are too dangerous for human workers and detect life-signs via infrared sensors. In all of these applications, a group of self-propelled robots moves cohesively (i.e., connectivity is preserved during these movements); such behavior is usually referred to as collective motion. This research aims to design self-adaptive collective motion algorithms for swarm robots in 3D space. The algorithms are self-adaptive in the sense that robots dynamically determine proper moving parameters based on their environments and statuses. Using the proposed collective motion algorithms, robots will be able to move along a pre-planned path from a source to a destination while satisfying the following requirements: 1) the robots use only one-hop neighbor information; 2) the robots maintain connectivity of the network topology for information exchange; 3) the robots maintain a desired neighboring distance; and 4) the robots are capable of bypassing obstacles without partitioning the robot swarm (i.e., without member loss). We will develop collective motion algorithms for three cases: 1) no obstacles and no leader; 2) no obstacles with a leader; and 3) obstacles present (with and without a leader). We will conduct extensive experiments on testbed robots to examine the performance of the algorithms in practical applications.
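Requirements 1) to 3) can be illustrated with a single synchronous update rule in which each robot adjusts its spacing using only neighbors within sensing range. A toy 3D sketch, with the spacing, gain and sensing radius as arbitrary assumptions:

```python
import numpy as np

def cohesion_step(pos, desired=1.0, gain=0.1, sense=2.0):
    # pos: (n, 3) array of robot positions. Each robot reads only the
    # positions of one-hop neighbors (closer than `sense`) and moves to
    # restore the desired spacing: attract if too far, repel if too close.
    new = pos.copy()
    for i, p in enumerate(pos):
        for j, q in enumerate(pos):
            d = np.linalg.norm(q - p)
            if i != j and 0 < d < sense:
                new[i] += gain * (d - desired) * (q - p) / d
    return new
```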
Principal Investigator:
Dr. Ho To-Sum
Abstract:
The boom in e-commerce over the past decade has brought not only significant economic growth to e-retailers but also new opportunities and challenges to the logistics industry. To seize the opportunities arising from emerging e-commerce logistics in Hong Kong, logistics service providers (LSPs) are forced to take on new roles and adjust their operations to fulfill dynamic customer demand. This research aims to develop a Blockchain-based E-Commerce Analytics Model, integrating blockchain technology and machine learning algorithms for managing data across supply chains and predicting dynamic e-commerce order demand.
This research enables industry practitioners, especially LSPs and e-retailers, to plan ahead for subsequent e-commerce operations. From the perspective of LSPs, the prediction model allows the firm to realize e-commerce order arrival patterns, enabling flexible re-allocation of the right amount of resources in real time to deal with the hour-to-hour fluctuating arrival of orders in distribution centers. From the perspective of a retailer, the generic prediction model allows the firm to predict, for example, the sales volume among various e-commerce sales channels, the sales volume from different customer segments, and the e-commerce sales performance of different product categories. By tackling the unpredictability of demand in the e-commerce business environment, this research contributes to an effective decision support strategy for logistics operations planning, hence enhancing e-commerce logistics competence in Hong Kong.
Principal Investigator:
Dr. Mo Yiu-Wing
Abstract:
Given the current ageing population and limited social welfare expenditure, scholars are renewing their interest in how community organisations can operate to sustainably serve the various needs of people with travel inconvenience in society. This research aims to design flexible vehicle management systems that enhance the management of various paratransit services through better system design and optimisation of vehicle resources.
The study scope of paratransit services includes scheduled-route, dial-a-ride, feeder and pooled dial-a-ride services. Users who require those services have different expectations for travelling times, prices, service frequencies, and pick-up and drop-off locations. This variety of service requirements poses numerous new challenges for community organisations in sustaining paratransit services. Hence, it is essential to innovate options for a holistic approach to coordinating the various types of service on a common sharing platform that meets people's diverse needs more efficiently. We expect the outcomes of this research to support policy review and operational improvements for community organisations.
Principal Investigator:
Dr. Mo Yiu-Wing
Abstract:
With the advanced logistics developments of recent decades, manufacturers are able to profit from spare parts services for systems maintenance and to enhance product sustainability by managing express delivery and reverse logistics. These developments have driven the evolution of traditional spare parts management into a new service model. Apart from on-site spare parts management, manufacturers and authorised service providers must offer more customised services and collect repairable items from users in the reverse logistics process. However, these evolving service requirements introduce procedural complexities and extend the service scope.
In this research, we aim to optimise the process of service parts management through a holistic and adaptive approach. The process scope includes logistics network design, inventory and warehouse management, and reverse logistics operations. To identify the numerous factors and parameters involved in process optimisation, we will start by standardising a generic process flow of service parts operations that aligns with companies' strategic objectives. We will then collect data to investigate the effects of these factors and their correlations. After identifying the critical factors, we will formulate them into a generic decision model for deriving optimal adaptive policies with a data-driven process control mechanism. A simulation platform will be developed to verify and monitor the proposed solutions. Finally, the performance of the optimal adaptive policies will be benchmarked against the optimal static policy commonly applied in various industries. These results will provide effective guidelines for implementing adaptive process optimisation of service parts operations.
Principal Investigator:
Dr. Wong Siu-kuen Ricky
Abstract:
Past studies on negotiation strategy have emphasised the benefits of compliance techniques such as door-in-the-face, foot-in-the-door, low-balling, and anchoring. A growing body of research has shown how negotiators using compliance tactics may obtain better negotiated outcomes. Undoubtedly, these tactics are beneficial when only a one-off negotiation is involved. However, many opportunities for negotiation training are now available at universities and in corporate training courses, and in real-life settings negotiators often engage in repeated negotiations. Coupled with people's growing knowledge of negotiation tactics, it is contentious whether the use of compliance tactics remains beneficial in the longer run. The adverse effects of compliance tactics have been neglected in negotiation research. A more thorough understanding of the potential costs of using compliance tactics is important for negotiators and practitioners to make informed decisions.