Please use this identifier to cite or link to this item:
https://dspace.univ-ouargla.dz/jspui/handle/123456789/28795
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | BENSID, KHALED | - |
dc.contributor.author | Azzaoui, Mohammed | - |
dc.contributor.author | Defairat, Walid | - |
dc.date.accessioned | 2022-04-29T16:59:50Z | - |
dc.date.available | 2022-04-29T16:59:50Z | - |
dc.date.issued | 2020-10-22 | - |
dc.identifier.uri | http://dspace.univ-ouargla.dz/jspui/handle/123456789/28795 | - |
dc.description | Telecommunication system | en_US |
dc.description.abstract | In this work, we extract features from the salient facial regions of parents and children and compare them between father and son, father and daughter, mother and son, and mother and daughter in order to verify the kinship between them. This is done by applying algorithms of different types (texture, appearance, and geometry), since they differ in principle, to the KinFaceW-II database, which contains 1000 pairs of images of parents and children. A systematic comparison of these methodologies determines which one yields the most accurate results for our hypothesis; we found that the best-performing algorithm achieved an accuracy of 90.25%, using Matlab. This result is encouraging and promising in light of recent studies in this area. | en_US |
dc.language.iso | en | en_US |
dc.publisher | University of Kasdi Merbah Ouargla | en_US |
dc.subject | face | en_US |
dc.subject | kinship | en_US |
dc.subject | algorithms | en_US |
dc.subject | الوجه | en_US |
dc.subject | القرابة | en_US |
dc.title | Deep Learning Features for Automatic Kinship Verification | en_US |
dc.type | Thesis | en_US |
Appears in Collections: | Département d'Electronique et des Télécommunications - Master |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Azzaoui-Defairat.pdf | | 2,42 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.