alfresco-community-repo/source/java/org/alfresco/repo/content/metadata/TikaAutoMetadataExtracterTest.java
Kevin Roast f651abe34a Merged BRANCHES/DEV/V4.1-BUG-FIX to HEAD
43598: Merged HEAD to BRANCHES/DEV/V4.1-BUG-FIX *RECORD ONLY*
             41906: ALF-11378: The REST API has been modified to return extra information about whether a user belongs to a group or not.
   44003: Merged BRANCHES/DEV/BELARUS/V4.1-BUG-FIX-2012_11_22 to BRANCHES/DEV/V4.1-BUG-FIX:
            ALF-15210: Inconsistency in the '?' icon over the dashlets. The DashletTitleBarActions_onReady() function in projects\slingshot\source\web\js\share.js was updated. The fix initialises the style for actionsNode elements in the DOM for IE.
   44004: Merged BRANCHES/DEV/BELARUS/V4.1-BUG-FIX-2012_11_22 to V4.1-BUG-FIX:
            ALF-15793: Edit offline hides version history. The onActionUploadNewVersion function in projects\slingshot\source\web\components\document-details\document-actions.js was updated. The fix now sets the version variable correctly using the asset.workingCopy property instead of the asset.custom property, which is undefined.
   44018: ALF-16540 : CMIS: createDocument with VersioningState.CHECKEDOUT causes NodeLockedException for types with mandatory versionable aspect
            The check for a lock is disabled while the beforeCreateVersion policy is handled,
            so a version can now be created for a locked node.
   44054: Fix for ALF-16337. Datalist assignee not searchable by full name.
   44056: Trivial change. Fixing some compiler warnings under org.alfresco.repo.content.metadata including a noisy Tika one.
   44143: Merged BRANCHES/DEV/BELARUS/V4.1-BUG-FIX-2012_10_19 to BRANCHES/DEV/V4.1-BUG-FIX:
            42989: ALF-16331: Wrong user for "completed by" information provided in Group Review And Approve workflow
   44147: Merged BRANCHES/DEV/V3.4-BUG-FIX to BRANCHES/DEV/V4.1-BUG-FIX:
   44146: Merged BRANCHES/DEV/BELARUS/V3.4-BUG-FIX-2012_05_22 to BRANCHES/DEV/V3.4-BUG-FIX:
            37733: ALF-12051: Webdav - Cannot open files containing "?" character in the filename in WinXP
   44152: ALF-17009 : Merged V3.4-BUG-FIX (3.4.12) to V4.1-BUG-FIX (4.1.3)
            44151: ALF-14035 Tiny HTML file that causes Jodconverter to launch a 100% CPU soffice instance
               - HTML to PDF is now done via ODT as the direct transform hangs if there are <sub> tags in the HTML.
               - Added in 'unsupportedTransformations' to stop a bare transformer.JodConverter from doing HTML to PDF 
               - TransformerDebug test file debugTransformers.txt no longer needs to be 18 bytes, as that made it too fiddly.
               - Modified the debug output from RuntimeExec so that less editing is required to run it from the command line
            - Removed tabs that had been added to enterprise/content-services-context.xml in 4.1-BUG-FIX
   44192: ALF-16560 - CIFS: Word document version history lost after saving content in Word:mac 2011 on Mac Mountain Lion
   44224: ALF-16896 Exception with TIKA meta data extractor.
            - Patch POI to handle parsing of Unicode properties that starts on a 4 byte boundary
              rather than the specified offset. Example file was created using http://www.aspose.com/
   44241: Merged DEV to V4.1-BUG-FIX
            44208: ALF-14591 : Ordering not supported for IMAP properties defining IMAP sort fields in Share
                   Make properties from imap:imapContent aspect indexable for SOLR. 
   44253: Merged BRANCHES/DEV/AMILLER/CLOUD1 to BRANCHES/DEV/V4.1-BUG-FIX:
            38927: CLOUD-128 - Update rules works incorrectly
          This is a partial fix for ALF-14568. The rest is coming in a separate check-in.
          I made some minor adjustments to this change - trivial spelling fix and whitespace changes.
   44257: ALF-16563 - CIFS: Image document version history lost after saving content in Preview on Mac Mountain Lion
   44260: Fix for ALF-16430 - List of values shown in alphabetical order in Share Forms. Values now only sorted if the Forms config 'field' element has the sorted='true' attribute.
   44269: Completion of fix for ALF-14568 - Update rule works incorrectly.
   44318: Fix for ALF-17055 - remoteadm webscript set a Last-Modified HTTP header whose date format does not conform to RFC 2616 hence breaking proxy caching
   44320: Fix for ALF-16463 - documentLibrary RSS feed does not pass the w3c validator, in particular pubDate breaks RFC-822, date not displayed when using non English locale
   44352: Merged BRANCHES/DEV/BELARUS/V4.1-BUG-FIX-2012_11_12 to BRANCHES/DEV/V4.1-BUG-FIX
            43860: ALF-16263: Search using a "Stop Word" not displaying any result

git-svn-id: https://svn.alfresco.com/repos/alfresco-enterprise/alfresco/HEAD/root@44459 c4b6b30b-aa2e-2d43-bbcb-ca4b014f7261
2012-12-07 14:04:23 +00:00

/*
* Copyright (C) 2005-2010 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
*/
package org.alfresco.repo.content.metadata;

import java.io.File;
import java.io.Serializable;
import java.net.URL;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

import org.alfresco.model.ContentModel;
import org.alfresco.repo.content.filestore.FileContentReader;
import org.alfresco.repo.content.transform.AbstractContentTransformerTest;
import org.alfresco.service.cmr.repository.ContentReader;
import org.alfresco.service.cmr.repository.datatype.DefaultTypeConverter;
import org.alfresco.service.namespace.NamespaceService;
import org.alfresco.service.namespace.QName;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.tika.config.TikaConfig;
import org.apache.tika.io.TikaInputStream;
import org.apache.tika.metadata.Metadata;
import org.apache.tika.mime.MediaType;
import org.apache.tika.parser.AutoDetectParser;
import org.apache.tika.parser.ParseContext;
import org.apache.tika.parser.Parser;
import org.apache.tika.parser.microsoft.OfficeParser;
import org.apache.tika.parser.microsoft.ooxml.OOXMLParser;
import org.apache.tika.parser.mp3.Mp3Parser;
import org.apache.tika.parser.odf.OpenDocumentParser;
/**
 * @see TikaAutoMetadataExtracter
 *
 * @author Nick Burch
 */
public class TikaAutoMetadataExtracterTest extends AbstractMetadataExtracterTest
{
    private static Log logger = LogFactory.getLog(TikaAutoMetadataExtracterTest.class);

    private TikaAutoMetadataExtracter extracter;
    private static final QName TIKA_MIMETYPE_TEST_PROPERTY =
            QName.createQName("TikaMimeTypeTestProp");

    @Override
    public void setUp() throws Exception
    {
        super.setUp();
        TikaConfig config = (TikaConfig) ctx.getBean("tikaConfig");
        extracter = new TikaAutoMetadataExtracter(config);
        extracter.setDictionaryService(dictionaryService);
        extracter.register();

        // Attach some extra mappings, using the Tika
        // metadata keys namespace.
        // These will be tested later.
        HashMap<String, Set<QName>> newMap = new HashMap<String, Set<QName>>(
                extracter.getMapping());
        Set<QName> tlaSet = new HashSet<QName>();
        tlaSet.add(TIKA_MIMETYPE_TEST_PROPERTY);
        newMap.put(Metadata.CONTENT_TYPE, tlaSet);
        extracter.setMapping(newMap);
    }
    /**
     * @return Returns the same extracter regardless of the mimetype - that is allowed
     */
    protected MetadataExtracter getExtracter()
    {
        return extracter;
    }

    public void testSupports() throws Exception
    {
        // Collect all the mimetypes handled by the key Tika parsers
        ArrayList<String> mimeTypes = new ArrayList<String>();
        for (Parser p : new Parser[] {
                new OfficeParser(), new OpenDocumentParser(),
                new Mp3Parser(), new OOXMLParser()
        })
        {
            Set<MediaType> mts = p.getSupportedTypes(new ParseContext());
            for (MediaType mt : mts)
            {
                mimeTypes.add(mt.toString());
            }
        }

        // The auto-detecting extracter should support all of them
        for (String mimetype : mimeTypes)
        {
            boolean supports = extracter.isSupported(mimetype);
            assertTrue("Mimetype should be supported: " + mimetype, supports);
        }
    }
    /**
     * Test several different files.
     * Note - doesn't use extractFromMimetype
     */
    public void testSupportedMimetypes() throws Exception
    {
        String[] testFiles = new String[] {
                ".doc", ".docx", ".xls", ".xlsx",
                ".ppt", ".pptx",
                //".vsd", // Our sample file lacks suitable metadata
                "2010.dwg",
                "2003.mpp", "2007.mpp",
                ".pdf",
                ".odt",
        };

        AutoDetectParser ap = new AutoDetectParser();
        for (String fileBase : testFiles)
        {
            String filename = "quick" + fileBase;
            URL url = AbstractContentTransformerTest.class.getClassLoader().getResource("quick/" + filename);
            File file = new File(url.getFile());

            // Cheat and ask Tika for the mime type!
            Metadata metadata = new Metadata();
            metadata.set(Metadata.RESOURCE_NAME_KEY, filename);
            MediaType mt = ap.getDetector().detect(TikaInputStream.get(file), metadata);
            String mimetype = mt.toString();
            if (logger.isDebugEnabled())
            {
                logger.debug("Detected mimetype " + mimetype + " for quick test file " + filename);
            }

            // Have it processed
            Map<QName, Serializable> properties = extractFromFile(file, mimetype);

            // Check we got something
            assertFalse("extractFromMimetype should return at least some properties, " +
                    "none found for " + mimetype + " - " + filename,
                    properties.isEmpty());

            // Check common metadata
            testCommonMetadata(mimetype, properties);
            // Check file-type specific metadata
            testFileSpecificMetadata(mimetype, properties);
        }
    }
    @Override
    protected boolean skipAuthorCheck(String mimetype) { return true; }

    @Override
    protected boolean skipDescriptionCheck(String mimetype)
    {
        if (mimetype.endsWith("/ogg"))
        {
            return true;
        }
        return false;
    }

    /**
     * We also provide the creation date - check that
     */
    protected void testFileSpecificMetadata(String mimetype,
            Map<QName, Serializable> properties)
    {
        // Check for extra fields
        // Author isn't there for the OpenDocument ones
        if (mimetype.indexOf(".oasis.") == -1 && !mimetype.endsWith("/ogg") && !mimetype.endsWith("dwg"))
        {
            assertEquals(
                    "Property " + ContentModel.PROP_AUTHOR + " not found for mimetype " + mimetype,
                    "Nevin Nollop",
                    DefaultTypeConverter.INSTANCE.convert(String.class, properties.get(ContentModel.PROP_AUTHOR)));
        }

        // Ensure that we can also get things which are standard
        // Tika metadata properties, if we so choose to
        assertTrue(
                "Test Property " + TIKA_MIMETYPE_TEST_PROPERTY + " not found for mimetype " + mimetype,
                properties.containsKey(TIKA_MIMETYPE_TEST_PROPERTY));
        assertEquals(
                "Test Property " + TIKA_MIMETYPE_TEST_PROPERTY + " incorrect for mimetype " + mimetype,
                mimetype,
                DefaultTypeConverter.INSTANCE.convert(String.class, properties.get(TIKA_MIMETYPE_TEST_PROPERTY)));

        // Extra media checks for music formats
        if (mimetype.startsWith("audio"))
        {
            assertEquals(
                    "Property " + ContentModel.PROP_AUTHOR + " not found for mimetype " + mimetype,
                    "Hauskaz",
                    DefaultTypeConverter.INSTANCE.convert(String.class, properties.get(ContentModel.PROP_AUTHOR)));

            QName artistQ = QName.createQName(NamespaceService.AUDIO_MODEL_1_0_URI, "artist");
            assertEquals(
                    "Property " + artistQ + " not found for mimetype " + mimetype,
                    "Hauskaz",
                    DefaultTypeConverter.INSTANCE.convert(String.class, properties.get(artistQ)));
        }
    }
    /**
     * We don't have explicit extractors for most image and video formats.
     * Instead, these will be handled by the Auto Tika Parser, and
     * this test ensures that they are.
     */
    @SuppressWarnings("deprecation")
    public void testImageVideo() throws Throwable
    {
        Map<String, Serializable> p;

        // Image
        p = openAndCheck(".jpg", "image/jpeg");
        assertEquals("409 pixels", p.get("Image Width"));
        assertEquals("92 pixels", p.get("Image Height"));
        assertEquals("8 bits", p.get("Data Precision"));

        p = openAndCheck(".gif", "image/gif");
        assertEquals("409", p.get("width"));
        assertEquals("92", p.get("height"));

        p = openAndCheck(".png", "image/png");
        assertEquals("409", p.get("width"));
        assertEquals("92", p.get("height"));
        assertEquals("8 8 8", p.get("Data BitsPerSample"));
        assertEquals("none", p.get("Transparency Alpha"));

        p = openAndCheck(".bmp", "image/bmp");
        assertEquals("409", p.get("width"));
        assertEquals("92", p.get("height"));
        assertEquals("8 8 8", p.get("Data BitsPerSample"));

        // Geo tagged image
        p = openAndCheck("GEO.jpg", "image/jpeg");
        // Check raw EXIF properties
        assertEquals("100 pixels", p.get("Image Width"));
        assertEquals("68 pixels", p.get("Image Height"));
        assertEquals("8 bits", p.get("Data Precision"));
        // Check regular Tika properties
        assertEquals(QUICK_TITLE, p.get(Metadata.COMMENT));
        assertEquals("canon-55-250", p.get(Metadata.SUBJECT));
        // Check namespace'd Tika properties
        assertEquals("12.54321", p.get("geo:lat"));
        assertEquals("-54.1234", p.get("geo:long"));
        assertEquals("100", p.get("tiff:ImageWidth"));
        assertEquals("68", p.get("tiff:ImageLength"));
        assertEquals("Canon", p.get("tiff:Make"));
        assertEquals("5.6", p.get("exif:FNumber"));

        // Map and check
        Map<QName, Serializable> properties = new HashMap<QName, Serializable>();
        ContentReader reader = new FileContentReader(open("GEO.jpg"));
        reader.setMimetype("image/jpeg");
        extracter.extract(reader, properties);

        // Check the geo bits
        assertEquals(12.54321, properties.get(ContentModel.PROP_LATITUDE));
        assertEquals(-54.1234, properties.get(ContentModel.PROP_LONGITUDE));
        // Check the exif bits
        assertEquals(100, properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "pixelXDimension")));
        assertEquals(68, properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "pixelYDimension")));
        assertEquals(0.000625, properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "exposureTime")));
        assertEquals(5.6, properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "fNumber")));
        assertEquals(false, properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "flash")));
        assertEquals(194.0, properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "focalLength")));
        assertEquals("400", properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "isoSpeedRatings")));
        assertEquals("Canon", properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "manufacturer")));
        assertEquals("Canon EOS 40D", properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "model")));
        assertEquals("Adobe Photoshop CS3 Macintosh", properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "software")));
        assertEquals(null, properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "orientation")));
        assertEquals(240.0, properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "xResolution")));
        assertEquals(240.0, properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "yResolution")));
        assertEquals("Inch", properties.get(QName.createQName(NamespaceService.EXIF_MODEL_1_0_URI, "resolutionUnit")));
    }
    private File open(String fileBase) throws Throwable
    {
        String filename = "quick" + fileBase;
        URL url = AbstractContentTransformerTest.class.getClassLoader().getResource("quick/" + filename);
        File file = new File(url.getFile());
        assertTrue(file.exists());
        return file;
    }

    private Map<String, Serializable> openAndCheck(String fileBase, String expMimeType) throws Throwable
    {
        // Get the mimetype via the MimetypeMap
        // (uses Tika internally for the detection)
        File file = open(fileBase);
        ContentReader detectReader = new FileContentReader(file);
        String mimetype = mimetypeMap.guessMimetype(fileBase, detectReader);
        // Note: expected value comes first in JUnit's assertEquals
        assertEquals("Wrong mimetype for " + fileBase, expMimeType, mimetype);

        // Ensure the Tika Auto parser actually handles this
        assertTrue("Mimetype should be supported but isn't: " + mimetype, extracter.isSupported(mimetype));

        // Now create our proper reader
        ContentReader sourceReader = new FileContentReader(file);
        sourceReader.setMimetype(mimetype);

        // And finally do the properties extraction
        return extracter.extractRaw(sourceReader);
    }
}
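The mapping merge performed in setUp() - copy the extracter's registered mapping, then add an extra target property under a Tika metadata key - can be sketched with plain JDK collections. This is a hypothetical standalone illustration, not Alfresco code: `String` stands in for the repository's `QName` type, and `addMapping` is a made-up helper name.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Sketch of the mapping-augmentation pattern from setUp():
// copy first, so the extracter's live mapping is never mutated in place.
public class MappingMergeSketch {
    static Map<String, Set<String>> addMapping(
            Map<String, Set<String>> existing, String tikaKey, String property) {
        // Shallow-copy the outer map, then copy the target set before adding
        Map<String, Set<String>> merged = new HashMap<>(existing);
        Set<String> targets = new HashSet<>(
                merged.getOrDefault(tikaKey, new HashSet<>()));
        targets.add(property);
        merged.put(tikaKey, targets);
        return merged;
    }

    public static void main(String[] args) {
        Map<String, Set<String>> mapping = new HashMap<>();
        mapping.put("Content-Type", new HashSet<>());
        Map<String, Set<String>> merged =
                addMapping(mapping, "Content-Type", "TikaMimeTypeTestProp");
        System.out.println(merged.get("Content-Type").contains("TikaMimeTypeTestProp")); // true
        System.out.println(mapping.get("Content-Type").isEmpty()); // true - original untouched
    }
}
```

Copying before mutation matters here because setMapping() replaces the whole map; mutating the set returned by getMapping() directly could change the extracter's behaviour before the new mapping is registered.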
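openAndCheck() resolves the expected mimetype through Alfresco's MimetypeMap, which delegates to Tika's content-sniffing detector. As a rough stand-in outside the repository, the JDK ships its own extension-to-mimetype table; this sketch uses `java.net.URLConnection.guessContentTypeFromName`, a real JDK method, but it guesses from the filename only and is not what Alfresco or Tika actually do.

```java
import java.net.URLConnection;

// Filename-based mimetype guess using the JDK's built-in table.
// Unlike Tika's AutoDetectParser, no file content is examined.
public class MimetypeGuessSketch {
    static String guess(String filename) {
        return URLConnection.guessContentTypeFromName(filename);
    }

    public static void main(String[] args) {
        System.out.println(guess("quick.png")); // image/png
        System.out.println(guess("quick.pdf"));
    }
}
```

A filename-only guess is why the test opens a second ContentReader for extraction: detection and extraction read the stream independently, and content sniffing (as Tika does) also catches files with wrong or missing extensions.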