WO2012043981A2 - Method and apparatus for generating meta information of content data - Google Patents

Method and apparatus for generating meta information of content data

Info

Publication number
WO2012043981A2
WO2012043981A2 · PCT/KR2011/005996 · KR2011005996W
Authority
WO
WIPO (PCT)
Prior art keywords
information
content data
meta information
section
generated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2011/005996
Other languages
English (en)
Korean (ko)
Other versions
WO2012043981A3 (fr)
Inventor
이재형
리셔먼링펑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Enswers Co Ltd
Original Assignee
Enswers Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Enswers Co Ltd filed Critical Enswers Co Ltd
Publication of WO2012043981A2
Publication of WO2012043981A3
Anticipated expiration
Current legal status: Ceased

Classifications

    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/19 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B 27/28 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier, by using information signals recorded by the same method as the main recording

Definitions

  • The present invention relates to a method and apparatus for generating meta information of content data, and more particularly, to a method and apparatus for generating meta information of content data, such as audio or video data, based on partial or entire DNA information of that content data.
  • the present invention has been made in view of the above, and an object thereof is to provide a method and apparatus capable of generating meta information based on partial or entire DNA information of content data.
  • It is a further object of the present invention to provide a method and apparatus that allow meta information to be generated and used independently, by referring to the exact location in the content data based on partial or entire DNA information of the content data.
  • The present invention provides a method for generating meta information of content data, comprising: a first step of determining a time point or a section of content data for which meta information is to be generated; a second step of extracting partial DNA information associated with the determined time point or section; and a third step of generating meta information including the extracted partial DNA information.
  • In the first step, the time point or section may be determined by the user's selection.
  • In the first step, the time point or section may be determined based on related data associated with the content data.
  • In the second step, partial DNA information may be extracted for a section of a predetermined time range that includes the determined time point.
  • In the second step, partial DNA information may be extracted for at least one partial section of the determined section.
  • In the second step, the partial DNA information may be extracted based on at least one of an audio signal and a video signal of the content data.
  • the meta information may be configured to be generated by data input by a user.
  • the meta information may be configured to be generated based on the related data associated with the content data.
  • The meta information generated in the third step may be referenced together with the content data when the content data is reproduced.
  • The present invention also provides an apparatus for generating meta information of content data, comprising: a time point and section manager configured to manage the time point or section of the content data for which meta information is to be generated; a partial DNA extractor configured to extract partial DNA information related to the determined time point or section; and a meta information manager configured to generate meta information including the extracted partial DNA information.
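The three claimed steps can be sketched in a few lines of Python. This is only an illustration under assumed names: `extract_partial_dna` stands in for whatever real fingerprint extractor is used (the patent does not prescribe one), and a SHA-1 hash over raw samples serves purely as a deterministic toy fingerprint.

```python
import hashlib

def extract_partial_dna(samples, start, end, rate=10):
    # Toy "DNA": a hash over the samples covering [start, end) seconds.
    # A real extractor would use spectral or frame features instead.
    window = samples[int(start * rate):int(end * rate)]
    return hashlib.sha1(bytes(window)).hexdigest()[:16]

def generate_meta(samples, start, end, payload):
    # Step 1 (the time point/section) is assumed already determined; step 2
    # extracts partial DNA for it; step 3 bundles the DNA with the user data.
    return {
        "partial_dna": extract_partial_dna(samples, start, end),
        "type": payload.get("type", "caption"),
        "content": payload.get("content", ""),
    }

content = list(range(200))  # stand-in for decoded audio samples (20 "seconds")
meta = generate_meta(content, 3.0, 5.0, {"type": "caption", "content": "hello"})
```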
  • the meta information manager may include a user interface manager that allows a user to input data to generate meta information.
  • The present invention further provides a method of generating meta information of content data, comprising: a first step of extracting entire DNA information for the whole of the content data for which meta information is to be generated; a second step of determining a time point or section of the content data for which meta information is to be generated; a third step of including identification information related to the determined time point or section in the entire DNA information so that it can be identified; and a fourth step of generating meta information related to the determined time point or section, including the entire DNA information.
  • the extracted total DNA information may be extracted to correspond to the time information of the content data.
  • the identification information associated with the determined time point or section may be included in the entire DNA information with reference to the time information.
  • the second step may be determined by a user's selection.
  • the second step may be determined based on the related data associated with the content data.
  • the meta information may be generated by data input by a user.
  • the meta information may be generated based on the related data associated with the content data.
  • That is, entire DNA information is extracted for the whole of the content data for which meta information is to be generated, and the time point or section of the content data for which meta information is to be generated is then identified within that entire DNA information.
  • the meta information manager may include a user interface manager that allows a user to input data to generate meta information.
  • According to the present invention, methods and apparatus are provided that make meta information independently usable, by referring to the exact location in the content data based on partial or entire DNA information of the content data.
  • The meta information generated by the present invention may be used as-is for partially identical content data, such as an edited version or a copy, and may be used as long as the corresponding partial DNA information is the same, even when the encoding scheme differs.
  • Since the meta information generated as described above may be shared or distributed through a channel separate from the content data, the meta information can easily be reused and reproduced.
  • Because the meta information according to the present invention is based on partial or entire DNA information, the exact position in the content data to which the meta information relates can easily be identified; for example, only the important (or non-important) parts of the content data can be marked, so that meta information can be generated efficiently.
  • FIG. 1 is a flowchart illustrating an embodiment of a method for generating meta information of content data according to the present invention.
  • FIG. 2 is a view for explaining a case where the meta information generated by the present invention can be used.
  • FIGS. 3 to 5 are diagrams showing examples of the configuration of meta information generated by the present invention.
  • FIG. 6 is a block diagram showing an example of an apparatus for generating meta information of content data according to the present invention.
  • FIG. 7 is a flowchart illustrating a method of generating meta information of content data according to another embodiment of the present invention.
  • FIG. 1 is a flowchart illustrating an embodiment of a method for generating meta information of content data according to the present invention.
  • First, a time point or a section of the content data for which meta information is to be generated is determined (S100). When the content data is video data, for example, this determination can be made by a selection operation of the user. To this end, the application program used to reproduce the content data must of course provide such a function. Such a function may be added as a plug-in to a basic playback program, such as the media player bundled with Windows, or it may be performed by a separate application program that includes the function. In the present invention, it is preferable to use a separate playback and meta information editing program that, in addition to this time point or section selection function, provides an interface allowing the user to generate meta information directly, as described below.
  • In step S100, either a time point or a section may be determined, where a time point means a single point in the content data, and a section means the range between one point and another point of the content data.
  • Here, DNA information refers to feature data representing characteristics of the content data, and is also referred to as fingerprint data. Various extraction methods have been proposed in the prior art; because such DNA information makes it easy to determine whether two pieces of data are identical, it has been widely used in the field of digital rights management (DRM).
  • When the content data is audio data, the DNA information is generated using various characteristic data representing the characteristics of the audio signal (e.g., frequency, amplitude); when the content data is video data, it is generated using various characteristic data of the video signal (e.g., frame features).
  • The present applicant has previously filed applications on methods for generating fingerprint (DNA) data of audio or video data and on clustering methods using such fingerprint data, and those fingerprint (DNA) generation and extraction methods can also be used in the present invention.
  • In other words, the present invention can use such conventional technology as-is; whichever method is used to extract DNA information from the content data, what matters is that partial DNA information related to the arbitrary time point or section selected by the user is extracted.
  • the partial DNA information may be extracted in relation to the time point or the section determined in step S100.
  • When a time point is determined in step S100, partial DNA information may be extracted for a section of a certain time range that includes that time point. For example, partial DNA may be extracted for the interval from the determined time point, taken as a starting point, until a predetermined time has elapsed. This is because extracting DNA information for a single instant may not provide sufficient discriminating power, whereas partial DNA information extracted for, say, 10 or 20 seconds from the time point can have sufficient discriminating power. Conversely, the partial DNA may of course also be extracted for a section whose end point is the determined time point, starting a certain time before it.
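The point-to-window rule described above can be made concrete. A minimal sketch, with the function name and default window length as assumptions rather than values from the patent:

```python
def window_for_point(t, lead=0.0, length=10.0, duration=None):
    # Convert a single time point into an extraction window: DNA over an
    # interval discriminates better than DNA "at" an instant. With lead > 0
    # the window starts before the point (the end-anchored variant above).
    start = max(0.0, t - lead)
    end = start + length
    if duration is not None:
        end = min(end, duration)  # never read past the end of the content
    return start, end
```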
  • partial DNA information on at least one partial section of the determined section may be extracted.
  • For example, when the determined section runs from 30 seconds to 60 seconds, partial DNA information may be extracted for the interval between 30 and 40 seconds and for the interval between 50 and 60 seconds. This is because extracting partial DNA information for the entire section may produce too much data when the section is long.
  • partial DNA information for the entire section between 30 seconds and 60 seconds may be extracted.
  • the 30 second to 60 second sections may be divided into, for example, four relatively short non-overlapping sections, and partial DNA information of each corresponding section may be extracted.
  • Alternatively, partial DNA information may be extracted for a roughly 10-second section around the start of the section, and for a roughly 10-second section around the 60-second point.
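The 30-to-60-second example of replacing one long extraction window with several short ones can be sketched as follows; `split_section` is a hypothetical helper, not a name from the patent.

```python
def split_section(start, end, n=4):
    # Divide a section into n equal, non-overlapping sub-sections so that
    # several short partial-DNA values replace one long, data-heavy one.
    step = (end - start) / n
    return [(start + i * step, start + (i + 1) * step) for i in range(n)]
```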
  • the partial DNA information may be extracted based on either the audio signal or the video signal of the content data, or the partial DNA information may be extracted by a combination thereof.
  • meta information including the extracted partial DNA information is generated (S120).
  • the generation of the meta information may be made by a user's input.
  • As described above, the application program used in the present invention preferably provides, in addition to the time point or section determination function of step S100, a user interface through which the user can input the desired meta information with an input device such as a keyboard or mouse.
  • the meta information may be configured in various forms such as, for example, a time stamp, caption information, lyrics information, content start / end point information, advertisement start point / end point information, and the like.
  • The type of information and its content may be selected and entered directly by the user, or the available types may be predefined and an interface provided so that the user selects a type and then inputs its content.
  • For example, when inputting caption information for a specific section, once caption information is selected from the types offered in the user interface, the application provides an input interface through which the user can enter the desired caption text.
  • the application program may generate the input meta information in the form of a file, for example.
  • the meta information generated in the form of a file includes the extracted partial DNA information.
  • Meta information generated in the form of a file may, for example, have the same name as the file of the corresponding content, differing only in its extension. The meta information generated in this way is referenced when the corresponding content data is executed (reproduced). It is of course desirable for the application used in the present invention to provide this function.
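This same-name, different-extension convention resembles how `.srt` subtitle sidecars are found by players today. A sketch of the lookup, where the `.meta` extension is an assumption (the patent names no specific extension):

```python
from pathlib import Path

def sidecar_path(content_path, ext=".meta"):
    # The player derives the meta file name from the content file name by
    # swapping the extension, then loads that file, if present, at playback.
    return Path(content_path).with_suffix(ext)
```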
  • For example, in the case of subtitle information, the partial DNA information of the corresponding time point or section included in the meta information is used so that the subtitles are shown on the movie screen at the right moment.
  • the meta information may be generated continuously.
  • When the meta information is generated as one file, a plurality of meta information entries may be included in that one file.
  • For example, when the content data is a foreign movie and Korean subtitle information is generated as the meta information, the subtitle information must be generated continuously over almost the entire movie, so it is desirable to include a plurality of meta information entries in one file.
  • When meta information is generated in this manner, since it is based on partial DNA information related to a specific time point or section of the content data, the meta information can be used as-is even for content data that is only partially identical rather than identical from beginning to end, such as an edited version or a copy of the content data.
  • FIG. 2 is a view for explaining a case where the meta information generated by the present invention can be used.
  • In (A), meta information is generated for the partial section shown in white within the entire content data shown in black.
  • (B) shows a case in which the content data of (A) is extended with other portions, indicated by oblique lines; it is not identical to (A) but partially identical.
  • Even in this case, the meta information generated in (A) can be used as-is: since that meta information was generated based on partial DNA information extracted for the white section of (A), and the same partial DNA information is also present in (B), the meta information of (A) can be used as-is in (B).
  • (C) is also a partially identical case, in which other portions, indicated by diagonal lines, are added before and after the content data of (A). Here, too, the meta information generated in (A) can be used as-is.
  • In other words, because the meta information generated by the present invention is based not on time information, as in the related art, but on partial DNA information of an arbitrary time point or section, the exact location of the corresponding time point or section can be found in any copy or edited version of the content data that contains it.
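This matching-not-timing property can be demonstrated with a toy search: the stored partial DNA is located in an edited copy by sliding a window over the content and comparing fingerprints. The hash-based fingerprint and the function names are illustrative assumptions, not the patent's own extractor.

```python
import hashlib

def dna(samples):
    # Toy fingerprint: hash of a run of samples.
    return hashlib.sha1(bytes(samples)).hexdigest()[:16]

def locate(partial_dna, samples, win):
    # Slide a window over the (possibly edited) content and return the first
    # offset whose fingerprint matches the stored partial DNA, else None.
    # Position is recovered by matching content, not by a stored time.
    for off in range(len(samples) - win + 1):
        if dna(samples[off:off + win]) == partial_dna:
            return off
    return None

original = list(range(50))
stored = dna(original[20:30])        # DNA for the annotated white section
edited = [255] * 7 + original       # like case (C): material prepended
```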
  • FIG. 3 is a diagram illustrating an example of a configuration of meta information generated by the present invention.
  • As shown in FIG. 3, the meta information of the present invention may be composed of partial DNA information, an information type (such as a time stamp, caption, lyrics, content start point, content end point, advertisement start point, advertisement end point, or user-defined information), and the information content.
  • In (A), the meta information is subtitle information, which includes partial DNA information expressed in binary together with the subtitle text "I really loved you. If yes."
  • Such meta information is used to display the corresponding subtitle information together with the corresponding content data at a time point or section corresponding to the corresponding partial DNA information.
  • Here, the partial DNA information is shown as a binary number; in practice it may be a much longer binary string than shown.
  • the time stamp information indicates a time from the start time of the corresponding content data of the section including the partial DNA information to the end time of the section.
  • The time stamp information may also be used in forms different from that of FIG. 3. For example, it may record elapsed-time information (e.g., 03:28), or the caption information may be configured to be displayed at a particular time (e.g., 1:00).
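The record layout of FIG. 3 can be sketched as a small serializable structure. The field names, the type vocabulary, and the JSON encoding are assumptions for illustration only; the patent does not fix a file format.

```python
import json

# Illustrative type vocabulary drawn from the kinds listed in the text.
TYPES = {"time_stamp", "caption", "lyrics", "content_start", "content_end",
         "ad_start", "ad_end", "user_defined"}

def make_record(partial_dna, kind, content, time_stamp=None):
    # One meta entry: the partial DNA, the kind of information, an optional
    # time stamp, and the information body itself.
    if kind not in TYPES:
        raise ValueError("unknown meta type: " + kind)
    return {"partial_dna": partial_dna, "type": kind,
            "time_stamp": time_stamp, "content": content}

record = make_record("01101001", "caption", "I really loved you.")
line = json.dumps(record)            # one entry of a multi-record meta file
```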
  • FIG. 4 shows a case in which, when the content start point of the content data is input, the partial DNA information of that time point and the time information of that point are recorded.
  • For example, when an advertisement video precedes the content, the actual content start point excluding the advertisement is recorded.
  • When such meta information is used, the user can easily find the start point of the corresponding content: the player can jump to the point where the content starts and play from there.
  • A user-defined item may also be included, which allows the user to generate any meta information desired; for example, it can be used simply to write down a phrase or memo that comes to mind while watching the content data.
  • As described above, the meta information may include a plurality of entries in one file; in this case it can be configured as shown in FIG. 5.
  • FIG. 5 illustrates a case in which the meta information of FIG. 3(A) and that of FIG. 4 are included as one meta information file: the caption is output at the point corresponding to its partial DNA information, as described for FIG. 3, while the point corresponding to the other partial DNA information serves to indicate the start point of the content, as described for FIG. 4. Although FIG. 5 shows only two meta information entries for convenience of description, a plurality of entries, two or more, may be included and generated in one file.
  • FIG. 6 is a diagram illustrating an example of an apparatus for generating meta information for implementing the method for generating meta information as described above.
  • the meta information generating apparatus 10 includes a viewpoint and section manager 11, a partial DNA extractor 12, and a meta information manager 13.
  • The time point and section manager 11 manages the time point or section of the content data for which meta information is to be generated. As described with reference to FIG. 1, it generates and manages information on the time point or section, and transmits this information to the partial DNA extractor 12 and the meta information manager 13.
  • The partial DNA extractor 12 extracts the partial DNA information related to the determined time point or section of the content data, as described with reference to FIG. 1, and transmits it to the meta information manager 13.
  • the meta information management unit 13 performs a function of generating and managing meta information including the extracted partial DNA information.
  • That is, the meta information manager 13 serves to package the partial DNA information into meta information of the form described in FIGS. 3 and 4.
  • the meta information manager 13 also manages a user interface (not shown) for receiving meta information from the user.
  • The meta information generating device 10 of FIG. 6 may be implemented as an application program on an electronic device such as a computer or smartphone and, although not shown, may include a video playback function, a function and interface allowing the user to select a time point or section, and an interface allowing the user to input meta information.
  • Meanwhile, the time point or section may also be determined, and the meta information generated, based on related data associated with the content data. For example, when text-based subtitle information generated from time information exists as a separate file (the associated data), the time point or section of the content data is matched according to the time information in the subtitle file, the partial DNA information associated with that time point or section is extracted, and the caption text corresponding to the time information is then generated as meta information.
  • In other words, when existing meta information can be accurately matched to a time point or section based on time information, the partial DNA information related to that time point or section can be extracted and the meta information generated automatically, without user input.
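Automatic generation from time-coded associated data can be sketched as follows. Here `fingerprint` is an assumed stand-in for the extractor, and the `(start, end, text)` tuples stand in for a parsed subtitle file; none of these names come from the patent.

```python
def meta_from_subtitles(subtitles, fingerprint):
    # For each time-coded caption, match its section of the content, extract
    # partial DNA for that section, and emit a meta record with no user input.
    return [{"partial_dna": fingerprint(start, end),
             "type": "caption", "content": text}
            for (start, end, text) in subtitles]

subs = [(0.0, 2.0, "Hello."), (2.0, 5.0, "Goodbye.")]
records = meta_from_subtitles(subs, lambda s, e: "dna[%s-%s]" % (s, e))
```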
  • FIG. 7 is a flowchart illustrating a method of generating meta information of content data according to another embodiment of the present invention.
  • Referring to FIG. 7, first, the entire DNA information of the whole content data for which meta information is to be generated is extracted (S200).
  • Here, the entire DNA information is extracted so as to correspond to the time information of the content data, and is generated to match each time index. For example, dividing the content into one-second units, DNA information is extracted and generated for every time point over the whole time interval: DNA information at 1 second, DNA information at 2 seconds, and so on.
  • The DNA information at each time point is preferably extracted for a time interval of a predetermined range that includes that time point; for example, it is desirable for the DNA information at 1 second to consist of DNA information extracted for the section between 1 second and 10 seconds, and the DNA information at 2 seconds of DNA information extracted for the section between 2 seconds and 11 seconds.
  • Next, identification information related to the determined time point or section is included in the entire DNA information so that it can be identified (S220).
  • That is, the identification information related to the determined time point or section is included in the entire DNA information generated in correspondence with the time information in step S200: the determined time point or section is marked with reference to the time information of the entire DNA information. In this way, it is possible to determine where the determined time point or section is located within the entire DNA information.
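The per-second whole-DNA table and the marking step can be sketched together. The hash fingerprint, the window length, and the dictionary layout are illustrative assumptions; the overlapping windows echo the 1s-10s / 2s-11s example in the text.

```python
import hashlib

def whole_dna(samples, duration, rate, win=10):
    # Extract DNA for every 1-second index, each entry covering a win-second
    # window starting at that index, so neighbouring entries overlap.
    return {t: hashlib.sha1(
                bytes(samples[t * rate:(t + win) * rate])).hexdigest()[:12]
            for t in range(duration - win + 1)}

def mark(table, start, end):
    # Step S220: include identification info by flagging the entries whose
    # time index lies inside the chosen section.
    return {t: (fp, start <= t <= end) for t, fp in table.items()}

samples = list(range(250))           # 25 "seconds" at 10 samples per second
table = whole_dna(samples, 25, 10)
marked = mark(table, 3, 5)
```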
  • Next, meta information related to the determined time point or section is generated, including the entire DNA information generated as described above (S230). Since the meta information generation here is the same as described with reference to FIGS. 1 to 6, a detailed description is omitted. However, because the meta information in this case includes the entire DNA information, the partial DNA information described with reference to FIGS. 3 and 4 is unnecessary and is therefore not generated.
  • In this embodiment, the DNA information of the entire content data is extracted in advance in correspondence with the time information, and the time point or section for which meta information is to be generated is marked with reference to the time information of the extracted DNA information; this is effective in that the corresponding time point or section can easily be located.
  • In this case as well, the determination of the time point or section and the generation of the meta information may be performed by the user's selection/input, or generated automatically based on related data, as described with reference to FIGS. 1 to 6.
  • meta information generated by the present invention can be used as it is for partially identical content data such as edited copies and copies, and can be used as long as the corresponding partial DNA information is the same even when encoding methods are different.
  • the meta information generated as described above may be shared or distributed through a channel separate from the content data.
  • the content data producer may also generate and distribute meta information along with the content data at the production stage.
  • Moreover, since the meta information generated in this way uses DNA information, accurate positioning is possible. For example, only the important parts of the content data can be marked so that the non-important parts are skipped and the content is reproduced in digest form; this makes it possible to watch hours of drama footage in tens of minutes.
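The digest-format playback idea reduces to assembling a timeline from only the marked sections. A minimal sketch with assumed names:

```python
def digest_timeline(marked_sections, duration):
    # Keep only the sections marked important, clipped to the content length;
    # playback then skips everything else, so hours compress to minutes.
    timeline = [(max(0, s), min(duration, e))
                for s, e in sorted(marked_sections) if s < duration]
    total = sum(e - s for s, e in timeline)
    return timeline, total
```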
  • In addition, the meta information according to the present invention can be shared and distributed separately from the content data, and since individual users can easily re-edit it, the meta information can easily be reused and reproduced.


Abstract

This invention relates to a method and apparatus for generating meta information of content data, the method comprising: a first step of determining a time point or section of content data for which meta information is to be generated; a second step of extracting partial DNA information associated with the determined time point or section; and a third step of generating meta information containing the extracted partial DNA information. According to the present invention, meta information can be generated based on partial or entire DNA information of the content data, which enables accurate positioning and allows the meta information to be used with an edited version or a copy in another format.
PCT/KR2011/005996 2010-09-30 2011-08-16 Method and apparatus for generating meta information of content data Ceased WO2012043981A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020100095336A KR101300220B1 (ko) 2010-09-30 2010-09-30 Method and apparatus for generating meta information of content data
KR10-2010-0095336 2010-09-30

Publications (2)

Publication Number Publication Date
WO2012043981A2 true WO2012043981A2 (fr) 2012-04-05
WO2012043981A3 WO2012043981A3 (fr) 2012-05-31

Family

ID=43615587

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/005996 Ceased WO2012043981A2 (fr) 2010-09-30 2011-08-16 Method and apparatus for generating meta information of content data

Country Status (2)

Country Link
KR (1) KR101300220B1 (fr)
WO (1) WO2012043981A2 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102424839B1 (ko) * 2015-10-14 2022-07-25 삼성전자주식회사 Display apparatus and control method thereof
KR102701785B1 (ko) * 2022-08-25 2024-08-30 한리경 User terminal device having a media player capable of moving by semantic unit, and operating method thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100456408B1 (ko) * 2004-02-06 2004-11-10 (주)뮤레카 Audio gene generation method and audio data search method
KR20090079563A (ko) * 2008-01-18 2009-07-22 주식회사 한국유비쿼터스기술센터 Method and system for managing video meta information, and recording medium therefor
US20090248300A1 (en) * 2008-03-31 2009-10-01 Sony Ericsson Mobile Communications Ab Methods and Apparatus for Viewing Previously-Recorded Multimedia Content from Original Perspective
KR101060636B1 (ko) * 2008-09-26 2011-08-31 (주)뮤레카 Multimedia content file management system using gene information

Also Published As

Publication number Publication date
KR20110010082A (ko) 2011-01-31
WO2012043981A3 (fr) 2012-05-31
KR101300220B1 (ko) 2013-08-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11829487

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC, EPO-FORM 1205A DATED 12.07.13

122 Ep: pct application non-entry in european phase

Ref document number: 11829487

Country of ref document: EP

Kind code of ref document: A2