From 90a78359bd8de7e9da6ab97ab0dec38cd3a9577c Mon Sep 17 00:00:00 2001
From: Dave Ward
Date: Fri, 2 Dec 2011 14:25:48 +0000
Subject: [PATCH] Merged V3.4-BUG-FIX to HEAD

   31682: Fix for ALF-9504 - Upload non-flash fallback fails
      Merged HEAD to V3.4-BUG-FIX
         31065: Fixed ALF-10407 "Share HTML uploader broken in Swift"
   31738: Merged DEV to V3.4_BUG_FIX
      31681: ALF-7859 - Deployment fails for *.xml content with wcm-xml-metadata-extracter-context.xml enabled
   31755: Fix for ALF-9257: merged in and optimised Belarus fix.
   31775: Fixed ALF-10667: WCM - Validation issue with xf:switch web forms
   31817: Spanish: Updates translations (based on: r31738) & adds new WCM translations.
   31840: Fix for ALF-10282 - Web browser freezes with large XML files in web form transformation
   31843: ALF-9208 Performance issue: during load tests, /share/page/user/user-sites is showing to be the most expensive.
      Modified AuthorityDAOImpl.findAuthorities(...) to use childAuthorityCache when possible.
      Big improvement to 'My Sites'
   31850: Italian: Translation updates, inc. fix for ALF-11293.
   31867: Merged DEV/TEMPORARY to V3.4-BUG-FIX
      31400: ALF-10764: PDF v1.5 causes JVM crash - the PDFRenderer library has been updated from the 2009-09-27 build to version 0.9.1 to support PDF 1.5 documents
   31906: ALF-9545: Adjust date picker for IE
   31911: Merge PATCHES/V3.3.3 to V3.4-BUG-FIX (3.4.7)
      31905: ALF-10309 CLONE - WebDAV - Cancelling "save as" upload will create 0 byte content
         - Run Timer as the original user
         - Run the Timer only if there is a LOCK timeout (otherwise do not run it, so as not to break the standard; MS Office uses a 3 minute timeout)
         - The PUT method clears the aspect BEFORE it starts processing the content, so that the Timer does not remove the node if the content is very large
         - Delete the node faster (than the Timer) if the client issues an UNLOCK, having locked the node but not issued a PUT.
         - Lots of debug
      31708: ALF-10309 CLONE - WebDAV - Cancelling "save as" upload will create 0 byte content
         'runAsSystem' the 5 minute timer to remove LOCKED but not PUT WebDAV files
      31698: ALF-10309 CLONE - WebDAV - Cancelling "save as" upload will create 0 byte content
         Added 5 minute timer to remove LOCKED but not PUT WebDAV files
      31687: ALF-10309 CLONE - WebDAV - Cancelling "save as" upload will create 0 byte content
         Added missing sys:webdavNoContent aspect to system model.
   31913: Merge PATCHES/V3.4.1 (3.4.1.22) to V3.4-BUG-FIX (3.4.7)
      31876: ALF-11381 Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
         Use a retrying, non-propagating (new) transaction to get the licence key in order to avoid intermittent issues.
   31929: Merged BRANCHES/DEV/BELARUS/V3.4-BUG-FIX-2011_11_09 to BRANCHES/DEV/V3.4-BUG-FIX
      31903: ALF-9971: RM search doesn't work with NAME:any_text
         A check for a duplicate template of properties was added. Throws an exception if there is already an entry in the map.
   31931: ALF-9678 Fixed null pointer issues in WorkflowManager methods
   31938: Merged DEV to V3.4-BUG-FIX
      31932: ALF-9566: hasMoreItems always false in CMIS query (Web Service binding)
         1. PropertyFilter was fixed to be able to handle OpenCMIS Workbench. Unit test was updated.
         2. hasMoreItems was adapted to indicate whether more items are available in the repository, based on maxItems and skipCount. Unit test was added.
         3. CMISFolderTypeDefinition was fixed to be fileable according to 2.1.5.1 File-able Objects.
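r31932 above derives hasMoreItems from the CMIS maxItems and skipCount paging parameters. As a rough illustration of that calculation (a hypothetical sketch, not the actual Alfresco/CMIS code; the class and method names here are invented):

```java
// Hypothetical sketch of the hasMoreItems calculation described in r31932.
// Given the total number of results plus the skipCount and maxItems paging
// parameters from the CMIS query, report whether items remain beyond this page.
public class PagingSketch
{
    public static boolean hasMoreItems(int totalItems, int skipCount, int maxItems)
    {
        // More items are available when the page window ends before the last result.
        return skipCount + maxItems < totalItems;
    }

    public static void main(String[] args)
    {
        System.out.println(hasMoreItems(100, 0, 10));  // true: 90 results remain
        System.out.println(hasMoreItems(100, 90, 10)); // false: the window reaches the end
    }
}
```

Returning false whenever `skipCount + maxItems` covers the total is what makes hasMoreItems come out true only while further pages exist, rather than always false as in the reported bug.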
   31965: German: Translation updates and additions based on EN r31738
   31967: French: Translation updates and additions based on EN r31738
   31969: Spanish: Translation updates and additions based on EN r31738
   31971: Italian: Translation updates and additions based on EN r31738
   31972: Fix for patch 'alternatives': an alternative patch must actually have executed, not just been marked as successful
   31973: Fixed ALF-11489: 'patch.sitesSpacePermissions' failed on upgrade 2.2.8 -> 3.4.6
      - 'patch.sitesFolder' is an alternative to 'patch.sitesSpacePermissions'
      - Note: Requires rev 31972 for fix to PatchService use of alternatives
   31994: ALF-11495 CLONE - Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
      - Replaced DescriptorDAORetryingInterceptor (added in the hotfix) with a RetryingTransactionInterceptor
   31999: Change the low level CIFS packet reading code to read/process up to 4 requests per thread run. ALF-9540
      Reduces thread blocking when the CIFS client uses overlapped I/O, and also keeps writes in their original sequence.
   32037: Japanese: Translation update, new and modified strings based on EN r31738
   32061: ALF-11376 Requesting PDFBox 1.6 be included in future service pack release.
      Upgrading pdfbox, fontbox, jempbox from 1.5.0 to 1.6.0
   32074: ALF-11522 IMAP: Generic AlfrescoImapFolderException error is a bit misleading
      "Can't create folder - Permission denied" --> "Cannot perform action - permission denied"
   32086: ALF-9971 RM search doesn't work with NAME:any_text
      - fix for a test failure to do with an upper case defaultFieldName finding nothing
   32093: Merged BELARUS/V3.4-BUG-FIX-2011_10_13 to V3.4-BUG-FIX (3.4.7)
      Plus a little bit of refactoring to remove duplicate code
      31490: ALF-9817: IE strips exe extension on download file when using download url with ticket parameter in code
         The "filename" part of the "Content-Disposition" header in the "attachment" case for IE
   32115: ALF-11569: Merged V3.3 to V3.4-BUG-FIX
      32108: ALF-11571: Fix new deadlock in NIO CIFSRequestHandler
         - Needed to be able to get a thread safe estimate of the number of registered sessions without synchronizing on m_selector.keys(), because a lock is held by the selector whilst waiting
         - Now the session count is maintained by the main thread, which is woken by anything wanting a session count.
   32136: ALF-10412 Non-reducing 100% CPU uploading large files to a Share site Document Library
      Reduced the priority of the async thread pool that is used to perform the transformations, so that normal activity (and even garbage collection) is not interrupted by transformations.
   32143: Merged 2011_11_09 to V3.4-BUG-FIX
      32133: ALF-11193 Consumer role cannot unsubscribe/subscribe the IMAP folders.
      32137
   32152: Merged BRANCHES/DEV/BELARUS/V3.4-BUG-FIX-2011_10_13 to BRANCHES/DEV/V3.4-BUG-FIX:
      31731: ALF-6275: Discrepancy detected on archived pivot language with EditionService
   32171: ALF-9638: Version2ServiceImpl now freezes aspect specific associations, in line with VersionServiceImpl
   32191: Merged DEV to V3.4-BUG-FIX
      32187: ALF-10884: A file renamed using the web UI still appears in an NFS mount but with NULL stats
         - Timestamp propagation in case of move
         - getPaths() call removed from the NodeMonitor
   32192: Reversed out rev 32143: ALF-11193: Consumer role cannot unsubscribe/subscribe the IMAP folders.
      - Patch is using SearchService
      - Patch doesn't transfer IMAP 'unsubscriptions'
      - Patch will not scale
   32211: Merged V3.4 to V3.4-BUG-FIX
      31914: ALF-10619: Not all container deletions were being honoured during indexing due to deletionsSinceFlushed processing
         - If container B is under container A with a secondary association, and A then B were deleted, then not all of B's containers were getting masked out - only those in a subtree of A!
         - Now that delete events are fired on every affected node in a cascading delete, we can handle the nodes and containers on an individual basis
      31915: ALF-10619: Prevent possible InvalidNodeRefException during reindexing
         - Handle in childRelationshipEvent() when comparing with 'path generation factor'
   32322: Possible fix for ALF-11344: SORT clause in CMIS query (ORDER BY) drastically affects performance of search.
      - English based locales will sort as Java String comparison.
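r32322 above takes the cheap path for English-based locales: plain Java String comparison instead of a locale-sensitive Collator. An illustrative sketch of that trade-off (not the Alfresco SortedResultSet code; the class and method names are invented):

```java
import java.text.Collator;
import java.util.Comparator;
import java.util.Locale;

// Illustrative sketch of the r32322 optimisation: English-based locales can be
// ordered with plain Java String comparison, avoiding the cost of building and
// consulting a full locale-sensitive Collator for every comparison.
public class LocaleSortSketch
{
    public static Comparator<String> comparatorFor(Locale locale)
    {
        if ("en".equals(locale.getLanguage()))
        {
            // Cheap path: binary String ordering is adequate for English.
            return Comparator.naturalOrder();
        }
        // Slow path: full collation rules for other locales.
        Collator collator = Collator.getInstance(locale);
        return collator::compare;
    }
}
```

Both comparators agree on plain ASCII input; the Collator path only earns its cost for locales whose sort order differs from raw character order.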
   32327: ALF-11495: Merge V3.4.1 (3.4.1.23) to V3.4-BUG-FIX (3.4.8)
      32326: ALF-11381 Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
         MaxUserLicenseException class was not added to SVN in the previous commit
      32325: ALF-11381 Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
         Removed RetryingTransactionInterceptor from around RepositoryDescriptorDAOImpl - not needed any more, and it caused an extra exception if the repo was read-only on boot
      32324: ALF-11381 Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
         Added a message to say the temporary problem has gone away
      32323: ALF-11381 Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
         Think I have found the reason for the vanishing licenses: the License Descriptor can be accessed, but the file in the content store cannot be read.
         Tidied up code to remove TODO messages
         Added MaxUserLicenseException
         Added code to handle possible temporary license outages and recovery
         Added a more specific exception for the invalid content store issue
      32326: ALF-11381 Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
         MaxUserLicenseException class was not added to SVN in the previous commit
      32288: ALF-11381 Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
         Fixed tests by still failing if we have not loaded a license yet
      32259: ALF-11381 Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
         Ignore exceptions from debug after RepositoryDescriptorDAOImpl Throwable
      32252: ALF-11381 Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
         Added logging, and keep the license live after an error
   32343: Fixed ALF-11617: wma file type is mapped to the 'video/x-ms-wma' mimetype instead of 'audio/x-ms-wma'
      Changed the type to audio/x-ms-wma
   32346: Fixed query
      use-case lookup of assoc namespace entity (i.e. should not lazily create)
      - Does not need merging to 4.0, which contains the fixes already
   32349: Merged V3.3 to V3.4-BUG-FIX
      32347: Prevent possible deadlock during subsystem syncing in a cluster
   32352: ALF-11495: Merge V3.4.1 (3.4.1.23) to V3.4-BUG-FIX (3.4.8)
      PLEASE NOTE that ALF-11381 was also merged into V3.4-BUG-FIX in r32327
      32350: ALF-11381 Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
         Patched version of TrueLicense to log RuntimeExceptions and Errors from both ftp and non-ftp LicenseManager verify methods
      32332: ALF-11381 Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
         Cannot throw a specific Exception for the file reader issue, as returning null is required for FTL
   32356: ALF-11495: Merge V3.4.1 (3.4.1.23) to V3.4-BUG-FIX (3.4.8)
      32355: ALF-11381 Enterprise unlimited licenses still get invalidated, turning the system into read-only mode
         TrueLicense missing from classpath.unit.test.extension
   32387: ALF-11617 Correct mimetype for WMA audio (with patch)
   32395: ALF-11004 Workflow Interpreter can now handle transitions with spaces in the name when 'signal' is called.
   32398: ALF-11078: Reinstate maxPermissionChecks logging from ALF-7237
   32411: Fix for ALF-11344 SORT clause in CMIS query (ORDER BY) drastically affects performance of search.
      - Use an in-memory sort rather than relying on the Lucene field cache for result sets up to 1000 entries by default
      - This is configurable across all query languages (and per query via SearchParameters and QueryOptions):
           lucene.indexer.useInMemorySort=true
           lucene.indexer.maxRawResultSetSizeForInMemorySort=1000
   32425: Fix for ALF-11344 SORT clause in CMIS query (ORDER BY) drastically affects performance of search.
      - fix for score sorting
   32433: Merged V3.4 to V3.4-BUG-FIX
      32432: ALF-11743: When RM is installed, admin does not have the appropriate permissions to perform any operations in Alfresco Explorer

git-svn-id: https://svn.alfresco.com/repos/alfresco-enterprise/alfresco/HEAD/root@32477 c4b6b30b-aa2e-2d43-bbcb-ca4b014f7261
---
 config/alfresco/action-services-context.xml        |  10 +-
 config/alfresco/model/systemModel.xml              |   6 +
 .../alfresco/patch/patch-services-context.xml      |   5 +
 config/alfresco/repository.properties              |   9 +
 .../Search/lucene/lucene-search-context.xml        |  20 +-
 .../Search/lucene/lucene-search.properties         |   3 +
 .../org/alfresco/cmis/CMISQueryOptions.java        |   2 +
 .../org/alfresco/cmis/PropertyFilter.java          |  97 ++-
 .../org/alfresco/cmis/PropertyFilterTest.java      |  27 +-
 .../dictionary/CMISFolderTypeDefinition.java       |  13 +
 .../alfresco/filesys/repo/NodeMonitor.java         |   1 -
 .../RepositoryDescriptorDAOImpl.java               | 219 ++++--
 .../repo/domain/node/ChildAssocEntity.java         |   2 +-
 .../AlfrescoImapFolderException.java               |   2 +-
 .../subsystems/CompositeDataBean.java              |  26 +-
 .../repo/model/ml/EditionServiceImpl.java          |   2 +-
 .../ml/tools/EditionServiceImplTest.java           | 376 +++++++++
 .../repo/node/db/DbNodeServiceImpl.java            |   5 +
 .../impl/lucene/ADMLuceneSearcherImpl.java         |  20 +-
 .../search/impl/lucene/ADMLuceneTest.java          |  30 +-
 ...stractLuceneIndexerAndSearcherFactory.java      |  36 +
 .../LuceneAlfrescoFtsQueryLanguage.java            |   2 +
 .../LuceneAlfrescoLuceneQueryLanguage.java         |  74 +-
 .../repo/search/impl/lucene/LuceneConfig.java      |  10 +
 .../search/impl/lucene/LuceneResultSet.java        |  16 +
 .../impl/lucene/LuceneResultSetRow.java            |   4 +
 .../impl/lucene/LuceneQueryEngine.java             |  78 +-
 .../repo/search/results/SortedResultSet.java       | 723 ++++++++++++++++--
 .../security/authority/AuthorityDAOImpl.java       |  12 +-
 .../repo/version/Version2ServiceImpl.java          |   4 +
 .../repo/workflow/WorkflowInterpreter.java         |  22 +-
 .../workflow/jscript/WorkflowManager.java          | 327 ++++----
 32 files changed, 1765 insertions(+), 418 deletions(-)
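The r32411 fix for ALF-11344 is governed by the two repository.properties keys added in this patch, lucene.indexer.useInMemorySort and lucene.indexer.maxRawResultSetSizeForInMemorySort. A hedged sketch of the decision they control (small raw result sets are sorted in memory, larger ones fall back to the Lucene field cache); the class and method names below are illustrative, not the Alfresco API:

```java
// Illustrative sketch of the threshold check behind the r32411 configuration.
// Field names mirror the two properties; the class itself is hypothetical.
public class SortStrategySketch
{
    private final boolean useInMemorySort;   // lucene.indexer.useInMemorySort
    private final int maxRawResultSetSize;   // lucene.indexer.maxRawResultSetSizeForInMemorySort

    public SortStrategySketch(boolean useInMemorySort, int maxRawResultSetSize)
    {
        this.useInMemorySort = useInMemorySort;
        this.maxRawResultSetSize = maxRawResultSetSize;
    }

    /** True when a raw result set of the given size should be sorted in memory. */
    public boolean sortInMemory(int rawResultSetSize)
    {
        return useInMemorySort && rawResultSetSize <= maxRawResultSetSize;
    }
}
```

With the shipped defaults (true, 1000), a query yielding 500 raw hits is sorted in memory, while one yielding 5000 reverts to the field-cache path.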
diff --git a/config/alfresco/action-services-context.xml b/config/alfresco/action-services-context.xml index 21d0f5eede..8e72b96293 100644 --- a/config/alfresco/action-services-context.xml +++ b/config/alfresco/action-services-context.xml @@ -13,10 +13,13 @@ defaultAsyncAction - 8 + ${default.async.action.corePoolSize} - 20 + ${default.async.action.maximumPoolSize} + + + ${default.async.action.threadPriority} @@ -32,6 +35,9 @@ ${deployment.service.maximumPoolSize} + + ${deployment.service.threadPriority} + + + NoContent + false + + Archived diff --git a/config/alfresco/patch/patch-services-context.xml b/config/alfresco/patch/patch-services-context.xml index 68eea657de..269004b00d 100644 --- a/config/alfresco/patch/patch-services-context.xml +++ b/config/alfresco/patch/patch-services-context.xml @@ -2873,6 +2873,11 @@ 0 5017 5018 + + + + + true true diff --git a/config/alfresco/repository.properties b/config/alfresco/repository.properties index 9f2feb93ad..8379f9fced 100644 --- a/config/alfresco/repository.properties +++ b/config/alfresco/repository.properties @@ -302,6 +302,8 @@ lucene.write.lock.timeout=10000 lucene.commit.lock.timeout=100000 lucene.lock.poll.interval=100 +lucene.indexer.useInMemorySort=true +lucene.indexer.maxRawResultSetSizeForInMemorySort=1000 lucene.indexer.contentIndexingEnabled=true index.backup.cronExpression=0 0 3 * * ? @@ -309,6 +311,7 @@ index.backup.cronExpression=0 0 3 * * ? lucene.defaultAnalyserResourceBundleName=alfresco/model/dataTypeAnalyzers + # When transforming archive files (.zip etc) into text representations (such as # for full text indexing), should the files within the archive be processed too? # If enabled, transformation takes longer, but searches of the files find more. 
@@ -629,10 +632,16 @@ subsystems.test.beanProp.value.inst3.anotherStringProperty=Global Instance Defau subsystems.test.simpleProp2=true subsystems.test.simpleProp3=Global Default3 +# Default Async Action Thread Pool +default.async.action.threadPriority=1 +default.async.action.corePoolSize=8 +default.async.action.maximumPoolSize=20 + # Deployment Service deployment.service.numberOfSendingThreads=5 deployment.service.corePoolSize=2 deployment.service.maximumPoolSize=3 +deployment.service.threadPriority=5 # How long to wait in mS before refreshing a target lock - detects shutdown servers deployment.service.targetLockRefreshTime=60000 # How long to wait in mS from the last communication before deciding that deployment has failed, possibly diff --git a/config/alfresco/subsystems/Search/lucene/lucene-search-context.xml b/config/alfresco/subsystems/Search/lucene/lucene-search-context.xml index 6ad7c9d2da..cb0e0ecd17 100644 --- a/config/alfresco/subsystems/Search/lucene/lucene-search-context.xml +++ b/config/alfresco/subsystems/Search/lucene/lucene-search-context.xml @@ -262,8 +262,12 @@ ${lucene.indexer.contentIndexingEnabled} - - + + ${lucene.indexer.useInMemorySort} + + + ${lucene.indexer.maxRawResultSetSizeForInMemorySort} + @@ -315,6 +319,12 @@ + + ${lucene.indexer.useInMemorySort} + + + ${lucene.indexer.maxRawResultSetSizeForInMemorySort} + @@ -405,6 +415,12 @@ + + ${lucene.indexer.useInMemorySort} + + + ${lucene.indexer.maxRawResultSetSizeForInMemorySort} + diff --git a/config/alfresco/subsystems/Search/lucene/lucene-search.properties b/config/alfresco/subsystems/Search/lucene/lucene-search.properties index b3b2040185..923e078547 100644 --- a/config/alfresco/subsystems/Search/lucene/lucene-search.properties +++ b/config/alfresco/subsystems/Search/lucene/lucene-search.properties @@ -47,6 +47,9 @@ lucene.write.lock.timeout=10000 lucene.commit.lock.timeout=100000 lucene.lock.poll.interval=100 +lucene.indexer.useInMemorySort=true 
+lucene.indexer.maxRawResultSetSizeForInMemorySort=1000 + lucene.indexer.contentIndexingEnabled=true fts.indexer.batchSize=1000 diff --git a/source/java/org/alfresco/cmis/CMISQueryOptions.java b/source/java/org/alfresco/cmis/CMISQueryOptions.java index 885114731d..ce9593b550 100644 --- a/source/java/org/alfresco/cmis/CMISQueryOptions.java +++ b/source/java/org/alfresco/cmis/CMISQueryOptions.java @@ -67,6 +67,8 @@ public class CMISQueryOptions extends QueryOptions options.setMlAnalaysisMode(searchParameters.getMlAnalaysisMode()); options.setLocales(searchParameters.getLocales()); options.setStores(searchParameters.getStores()); + options.setUseInMemorySort(searchParameters.getUseInMemorySort()); + options.setMaxRawResultSetSizeForInMemorySort(searchParameters.getMaxRawResultSetSizeForInMemorySort()); //options.setQuery(); Done on conbstruction //options.setQueryMode(); Should set afterwards options.setQueryParameterDefinitions(searchParameters.getQueryParameterDefinitions()); diff --git a/source/java/org/alfresco/cmis/PropertyFilter.java b/source/java/org/alfresco/cmis/PropertyFilter.java index bd6fb7b5fa..5015b0104e 100644 --- a/source/java/org/alfresco/cmis/PropertyFilter.java +++ b/source/java/org/alfresco/cmis/PropertyFilter.java @@ -18,48 +18,109 @@ */ package org.alfresco.cmis; -import java.util.HashSet; -import java.util.Set; -import java.util.regex.Pattern; - +import java.util.Arrays; +import java.util.List; /** - * Property filter supporting CMIS filter expression + * http://docs.oasis-open.org/cmis/CMIS/v1.0/os/cmis-spec-v1.0.htm + * 2.1.2.1 Property + * All properties MUST supply a String queryName attribute which is used for query and filter operations on object-types. + * This is an opaque String with limitations. This string SHOULD NOT contain any characters that negatively interact with the BNF grammar. 
* - * @author Dmitry Lazurkin + * The string MUST NOT contain: + * whitespace “ “, + * comma “,” + * double quotes ‘”’ + * single quotes “’” + * backslash “\” + * the period “.” character or, + * the open “(“ or close “)” parenthesis characters. + * + * + * 2.2.1.2.1 Properties + * Description: All of the methods that allow for the retrieval of properties for CMIS Objects have a “Property Filter” + * as an optional parameter, which allows the caller to specify a subset of properties for Objects that MUST be returned by the repository in the output of the method. + * Optional Input Parameter: + * String filter: Value indicating which properties for Objects MUST be returned. Values are: + * - Not set: The set of properties to be returned MUST be determined by the repository. + * - A comma-delimited list of property definition Query Names: The properties listed MUST be returned. + * - “*” : All properties MUST be returned for all objects. + * Repositories SHOULD return only the properties specified in the property filter if they exist on the object’s type definition. + * + * If a property filter specifies a property that is ‘not set’, it MUST be represented as a property element without a value element. 
+ * @author Dmitry Velichkevich + * @author Arseny Kovalchuk */ public class PropertyFilter { public static final String MATCH_ALL_FILTER = "*"; public static final String PROPERTY_NAME_TOKENS_DELIMITER = ","; - private static final Pattern PROPERTY_FILTER_REGEX = Pattern.compile("^([^\\s,\"'\\\\\\.\\(\\)]+)(,[^\\s,\"'\\\\\\.\\(\\)]+)*$"); - private Set properties; + private static final char[] PROPERTY_INVALID_CHARS = { ' ', ',', '"', '\'', '\\', '.', ',', '(', ')' }; + private final List properties; /** * @param filter filter value (case insensitive) - * @throws FilterNotValidException if filter string isn't valid + * @throws CMISFilterNotValidException if filter string isn't valid */ public PropertyFilter(String filter) throws CMISFilterNotValidException + { + properties = validateFilter(filter); + } + + /** + * @param filter to be validated + * @return a list of tokenized and validated properties + * @throws CMISFilterNotValidException if one of the filter tokens is not valid + */ + private static List validateFilter(String filter) throws CMISFilterNotValidException { if (filter != null) { - if (!PROPERTY_FILTER_REGEX.matcher(filter).matches()) - { - throw new CMISFilterNotValidException("Property filter \"" + filter + "\" is invalid"); - } - if (!filter.equals(MATCH_ALL_FILTER)) { String[] tokens = filter.split(PROPERTY_NAME_TOKENS_DELIMITER); - properties = new HashSet(tokens.length * 2); - for (String token : tokens) - { - properties.add(token); + for (int i = 0; i < tokens.length; i++) + { + String token = tokens[i].trim(); + if (token.isEmpty() || token.indexOf('*') != -1 || !isValidToken(token)) + throw new CMISFilterNotValidException("Property filter \"" + filter + "\" is invalid"); + tokens[i] = token; // trimmed } + return Arrays.asList(tokens); + } + else + { + return null; } } + else + { + return null; + } + } + + /** + * Validates particular token within property filter + * + * @param token + * @return true if token is valid + */ + private static 
boolean isValidToken(String token) + { + if (token == null) + return false; + boolean result = true; + for (char invalidChar : PROPERTY_INVALID_CHARS) + { + if (token.indexOf(invalidChar) != -1) + { + result = false; + break; + } + } + return result; } /** diff --git a/source/java/org/alfresco/cmis/PropertyFilterTest.java b/source/java/org/alfresco/cmis/PropertyFilterTest.java index 27b7f6ef28..d18206d85d 100644 --- a/source/java/org/alfresco/cmis/PropertyFilterTest.java +++ b/source/java/org/alfresco/cmis/PropertyFilterTest.java @@ -22,7 +22,22 @@ import junit.framework.TestCase; /** + * http://docs.oasis-open.org/cmis/CMIS/v1.0/os/cmis-spec-v1.0.htm + * 2.1.2.1 Property + * All properties MUST supply a String queryName attribute which is used for query and filter operations on object-types. + * This is an opaque String with limitations. This string SHOULD NOT contain any characters that negatively interact with the BNF grammar. + * + * The string MUST NOT contain: + * whitespace “ “, + * comma “,” + * double quotes ‘”’ + * single quotes “’” + * backslash “\” + * the period “.” character or, + * the open “(“ or close “)” parenthesis characters. 
+ * * @author Dmitry Velichkevich + * @author Arseny Kovalchuk */ public class PropertyFilterTest extends TestCase { @@ -33,16 +48,15 @@ public class PropertyFilterTest extends TestCase private static final String VALID_MATCHE_ALL_FILTER = "*"; private static final String VALID_FILTER_WITH_NAME = NAME_TOKEN; - private static final String VALID_FILTER_WITH_SEVERAL_TOKENS = "name,objectId"; - private static final String LONG_VALID_FILTER_WITH_SEVERAL_TOKENS = "objectId,name,CreationDate*,Created;By"; + private static final String LONG_VALID_FILTER_WITH_SEVERAL_TOKENS = "objectId,name,CreationDate,Created;By"; + private static final String VALID_FILTER_CMIS_WORKBANCH_ALFRESCO_3_4 = "cmis:parentId, cmis:objectId, name, objectId"; + private static final String VALID_FILTER_WITH_SPACES = " name, objectId,CreationDate, CreatedBy , ModifiedBy , LastModifiedBy "; private static final String INVALID_MATCHE_ALL_FILTER = "*,"; private static final String INVALID_FILTER_WITH_NAME = "*name,"; private static final String INVALID_FILTER_WITH_SEVERAL_TOKENS = "name,,objectId"; private static final String LONG_INVALID_FILTER_WITH_SEVERAL_TOKENS = "objectId, name CreationDate, CreatedBy*"; private static final String INVALID_FILTER_WITH_SEVERAL_TOKENS_WITHOUT_BREAKS = ",name,objectId,CreationDate"; - private static final String INVALID_FILTER_WITH_SEVERAL_TOKENS_AND_WITH_BREAKS_IN_SOME_PLACES = " name, objectId,CreationDate CreatedBy ModifiedBy, LastModifiedBy"; - private static final String INVALID_FILTER_WITH_FIRST_BREAK_SYMBOL = " name, objectId,CreationDate, CreatedBy, ModifiedBy, LastModifiedBy"; private static final String INVALID_FILTER_WITH_DENIED_SYMBOL = "objectId\"name"; private static final String INVALID_FILTER_WITH_LAST_INVALID_SYMBOL = "objectId,name\\"; @@ -53,8 +67,9 @@ public class PropertyFilterTest extends TestCase onlyNameTokensAssertionValid(new PropertyFilter(VALID_FILTER_WITH_NAME)); - nameAndObjectIdTokensAssertionValid(new 
PropertyFilter(VALID_FILTER_WITH_SEVERAL_TOKENS)); nameAndObjectIdTokensAssertionValid(new PropertyFilter(LONG_VALID_FILTER_WITH_SEVERAL_TOKENS)); + nameAndObjectIdTokensAssertionValid(new PropertyFilter(VALID_FILTER_CMIS_WORKBANCH_ALFRESCO_3_4)); + nameAndObjectIdTokensAssertionValid(new PropertyFilter(VALID_FILTER_WITH_SPACES)); } public void testInvalidFilters() throws Exception @@ -64,8 +79,6 @@ public class PropertyFilterTest extends TestCase invalidFilterAssertion(INVALID_FILTER_WITH_SEVERAL_TOKENS); invalidFilterAssertion(LONG_INVALID_FILTER_WITH_SEVERAL_TOKENS); invalidFilterAssertion(INVALID_FILTER_WITH_SEVERAL_TOKENS_WITHOUT_BREAKS); - invalidFilterAssertion(INVALID_FILTER_WITH_SEVERAL_TOKENS_AND_WITH_BREAKS_IN_SOME_PLACES); - invalidFilterAssertion(INVALID_FILTER_WITH_FIRST_BREAK_SYMBOL); invalidFilterAssertion(INVALID_FILTER_WITH_DENIED_SYMBOL); invalidFilterAssertion(INVALID_FILTER_WITH_LAST_INVALID_SYMBOL); } diff --git a/source/java/org/alfresco/cmis/dictionary/CMISFolderTypeDefinition.java b/source/java/org/alfresco/cmis/dictionary/CMISFolderTypeDefinition.java index 0713768454..d858a068b9 100644 --- a/source/java/org/alfresco/cmis/dictionary/CMISFolderTypeDefinition.java +++ b/source/java/org/alfresco/cmis/dictionary/CMISFolderTypeDefinition.java @@ -37,6 +37,8 @@ public class CMISFolderTypeDefinition extends CMISAbstractTypeDefinition { private static final long serialVersionUID = 7526155195125799106L; + protected final boolean fileable = true; + /** * Construct * @param cmisMapping @@ -82,6 +84,16 @@ public class CMISFolderTypeDefinition extends CMISAbstractTypeDefinition includedInSuperTypeQuery = cmisClassDef.getIncludedInSuperTypeQuery(); } + /** + * Are objects of this type fileable? 
+ * + * @return + */ + public boolean isFileable() + { + return fileable; + } + /* * (non-Javadoc) * @see java.lang.Object#toString() @@ -104,6 +116,7 @@ public class CMISFolderTypeDefinition extends CMISAbstractTypeDefinition builder.append("IncludedInSuperTypeQuery=").append(isIncludedInSuperTypeQuery()).append(", "); builder.append("ControllablePolicy=").append(isControllablePolicy()).append(", "); builder.append("ControllableACL=").append(isControllableACL()).append(", "); + builder.append("Fileable=").append(isFileable()).append(", "); builder.append("SubTypes=").append(getSubTypes(false).size()).append(", "); builder.append("Properties=").append(getPropertyDefinitions().size()); builder.append("]"); diff --git a/source/java/org/alfresco/filesys/repo/NodeMonitor.java b/source/java/org/alfresco/filesys/repo/NodeMonitor.java index d1c1d0788c..9ab87ff6d3 100644 --- a/source/java/org/alfresco/filesys/repo/NodeMonitor.java +++ b/source/java/org/alfresco/filesys/repo/NodeMonitor.java @@ -325,7 +325,6 @@ public class NodeMonitor extends TransactionListenerAdapter // Get the full path to the file/folder node - Path nodePath = m_nodeService.getPath( oldNodeRef); String fName = (String) m_nodeService.getProperty( oldNodeRef, ContentModel.PROP_NAME); // Build the share relative path to the node diff --git a/source/java/org/alfresco/repo/descriptor/RepositoryDescriptorDAOImpl.java b/source/java/org/alfresco/repo/descriptor/RepositoryDescriptorDAOImpl.java index 2265362223..90bd291f85 100644 --- a/source/java/org/alfresco/repo/descriptor/RepositoryDescriptorDAOImpl.java +++ b/source/java/org/alfresco/repo/descriptor/RepositoryDescriptorDAOImpl.java @@ -1,5 +1,5 @@ /* - * Copyright (C) 2005-2010 Alfresco Software Limited. + * Copyright (C) 2005-2011 Alfresco Software Limited. 
* * This file is part of Alfresco * @@ -20,6 +20,7 @@ package org.alfresco.repo.descriptor; import java.io.ByteArrayInputStream; import java.io.ByteArrayOutputStream; +import java.io.IOException; import java.io.InputStream; import java.io.Serializable; import java.util.Collection; @@ -145,66 +146,107 @@ public class RepositoryDescriptorDAOImpl implements DescriptorDAO @Override public Descriptor getDescriptor() { - // retrieve system descriptor - final NodeRef descriptorNodeRef = getDescriptorNodeRef(false); - - // create appropriate descriptor - if (descriptorNodeRef != null) + Descriptor descriptor = null; + try { - final Map properties = this.nodeService.getProperties(descriptorNodeRef); - return new RepositoryDescriptor(properties); + // retrieve system descriptor + final NodeRef descriptorNodeRef = getDescriptorNodeRef(false); + + // create appropriate descriptor + if (descriptorNodeRef != null) + { + final Map properties = this.nodeService.getProperties(descriptorNodeRef); + descriptor = new RepositoryDescriptor(properties); + } } - return null; + catch (final RuntimeException e) + { + if (logger.isErrorEnabled()) + { + logger.error("getDescriptor: ", e); + } + throw e; + } + catch (final Error e) + { + if (logger.isErrorEnabled()) + { + logger.error("getDescriptor: ", e); + } + throw e; + } + return descriptor; } @Override public Descriptor updateDescriptor(final Descriptor serverDescriptor, LicenseMode licenseMode) { - final NodeRef currentDescriptorNodeRef = getDescriptorNodeRef(true); - // if the node is missing but it should have been created - if (currentDescriptorNodeRef == null) + Descriptor descriptor = null; + try { - return null; - } - // set the properties - if (!this.transactionService.isReadOnly()) - { - Map props = new HashMap(11); - props.put(ContentModel.PROP_SYS_NAME, serverDescriptor.getName()); - props.put(ContentModel.PROP_SYS_VERSION_MAJOR, serverDescriptor.getVersionMajor()); - props.put(ContentModel.PROP_SYS_VERSION_MINOR, 
serverDescriptor.getVersionMinor()); - props.put(ContentModel.PROP_SYS_VERSION_REVISION, serverDescriptor.getVersionRevision()); - props.put(ContentModel.PROP_SYS_VERSION_LABEL, serverDescriptor.getVersionLabel()); - props.put(ContentModel.PROP_SYS_VERSION_BUILD, serverDescriptor.getVersionBuild()); - props.put(ContentModel.PROP_SYS_VERSION_SCHEMA, serverDescriptor.getSchema()); + final NodeRef currentDescriptorNodeRef = getDescriptorNodeRef(true); + // if the node is missing but it should have been created + if (currentDescriptorNodeRef == null) + { + return null; + } + // set the properties + if (!this.transactionService.isReadOnly()) + { + Map props = new HashMap(11); + props.put(ContentModel.PROP_SYS_NAME, serverDescriptor.getName()); + props.put(ContentModel.PROP_SYS_VERSION_MAJOR, serverDescriptor.getVersionMajor()); + props.put(ContentModel.PROP_SYS_VERSION_MINOR, serverDescriptor.getVersionMinor()); + props.put(ContentModel.PROP_SYS_VERSION_REVISION, serverDescriptor.getVersionRevision()); + props.put(ContentModel.PROP_SYS_VERSION_LABEL, serverDescriptor.getVersionLabel()); + props.put(ContentModel.PROP_SYS_VERSION_BUILD, serverDescriptor.getVersionBuild()); + props.put(ContentModel.PROP_SYS_VERSION_SCHEMA, serverDescriptor.getSchema()); props.put(ContentModel.PROP_SYS_LICENSE_MODE, licenseMode); - this.nodeService.addProperties(currentDescriptorNodeRef, props); + this.nodeService.addProperties(currentDescriptorNodeRef, props); - // ALF-726: v3.1.x Content Cleaner Job needs to be ported to v3.2 - // In order to migrate properly, this property needs to be d:content. We will rewrite the property with the - // license update code. There is no point attempting to rewrite the property here. 
- final Serializable value = this.nodeService.getProperty( - currentDescriptorNodeRef, - ContentModel.PROP_SYS_VERSION_EDITION); - if (value == null) - { - this.nodeService.setProperty( + // ALF-726: v3.1.x Content Cleaner Job needs to be ported to v3.2 + // In order to migrate properly, this property needs to be d:content. We will rewrite the property with the + // license update code. There is no point attempting to rewrite the property here. + final Serializable value = this.nodeService.getProperty( currentDescriptorNodeRef, - ContentModel.PROP_SYS_VERSION_EDITION, - new ContentData(null, null, 0L, null)); + ContentModel.PROP_SYS_VERSION_EDITION); + if (value == null) + { + this.nodeService.setProperty( + currentDescriptorNodeRef, + ContentModel.PROP_SYS_VERSION_EDITION, + new ContentData(null, null, 0L, null)); + } + + // done + if (RepositoryDescriptorDAOImpl.logger.isDebugEnabled()) + { + RepositoryDescriptorDAOImpl.logger.debug("Updated current repository descriptor properties: \n" + + " node: " + currentDescriptorNodeRef + "\n" + " descriptor: " + serverDescriptor); + } } - // done - if (RepositoryDescriptorDAOImpl.logger.isDebugEnabled()) - { - RepositoryDescriptorDAOImpl.logger.debug("Updated current repository descriptor properties: \n" - + " node: " + currentDescriptorNodeRef + "\n" + " descriptor: " + serverDescriptor); - } + final Map properties = this.nodeService.getProperties(currentDescriptorNodeRef); + descriptor = new RepositoryDescriptor(properties); } - - final Map properties = this.nodeService.getProperties(currentDescriptorNodeRef); - return new RepositoryDescriptor(properties); + catch (final RuntimeException e) + { + if (logger.isErrorEnabled()) + { + logger.error("updateDescriptor: ", e); + } + throw e; + } + catch (final Error e) + { + if (logger.isErrorEnabled()) + { + logger.error("updateDescriptor: ", e); + } + throw e; + } + return descriptor; } @Override @@ -217,21 +259,77 @@ public class RepositoryDescriptorDAOImpl implements 
DescriptorDAO final NodeRef descriptorRef = getDescriptorNodeRef(true); if (descriptorRef == null) { + // Should not get this as 'true' was used. throw new LicenseException("Failed to find system descriptor"); } + if (logger.isDebugEnabled()) + { + logger.debug("getLicenseKey: descriptorRef=" + descriptorRef); + } + final ContentReader reader = this.contentService.getReader( descriptorRef, ContentModel.PROP_SYS_VERSION_EDITION); - if (reader != null && reader.exists()) + + boolean exists = reader != null && reader.exists(); + if (exists) { - final ByteArrayOutputStream os = new ByteArrayOutputStream(); - reader.getContent(os); - key = os.toByteArray(); + ByteArrayOutputStream os = null; + try + { + os = new ByteArrayOutputStream(); + reader.getContent(os); + key = os.toByteArray(); + } + finally + { + if (os != null) + { + try + { + os.close(); + } + catch (IOException ignore) + { + // We have more to worry about if we ever get here. + logger.debug("getLicenseKey: Error closing ByteArrayOutputStream", ignore); + } + } + } + } + else + { + if (logger.isDebugEnabled()) + { + // reader should never be null. An exception is thrown by getReader if it is. + logger.debug("getLicenseKey: reader=" + reader + (reader == null ? "" : " exists=" + exists)); + } } } - catch (final Exception e) + catch (final LicenseException e) { - throw new LicenseException("Failed to load license key: " + e.getMessage(), e); + throw e; + } + catch (final RuntimeException e) + { + if (logger.isDebugEnabled()) + { + logger.debug("getLicenseKey: ", e); + } + throw new LicenseException("Failed to load license", e); + } + catch (final Error e) + { + if (logger.isDebugEnabled()) + { + logger.debug("getLicenseKey: ", e); + } + throw e; + } + + if (logger.isDebugEnabled()) + { + logger.debug("getLicenseKey: key " + (key == null ? 
"is null" : "length=" + key.length)); } return key; } @@ -244,6 +342,7 @@ public class RepositoryDescriptorDAOImpl implements DescriptorDAO final NodeRef descriptorRef = getDescriptorNodeRef(true); if (descriptorRef == null) { + // Should not get this as 'true' was used. throw new LicenseException("Failed to find system descriptor"); } if (key == null) @@ -260,9 +359,21 @@ public class RepositoryDescriptorDAOImpl implements DescriptorDAO writer.putContent(is); } } - catch (final Exception e) + catch (final RuntimeException e) { - throw new LicenseException("Failed to save license: " + e.getMessage(), e); + if (logger.isDebugEnabled()) + { + logger.debug("getLicenseKey: ", e); + } + throw new LicenseException("Failed to save license", e); + } + catch (final Error e) + { + if (logger.isDebugEnabled()) + { + logger.debug("getLicenseKey: ", e); + } + throw e; } } diff --git a/source/java/org/alfresco/repo/domain/node/ChildAssocEntity.java b/source/java/org/alfresco/repo/domain/node/ChildAssocEntity.java index 3d56a600cc..9f9a059800 100644 --- a/source/java/org/alfresco/repo/domain/node/ChildAssocEntity.java +++ b/source/java/org/alfresco/repo/domain/node/ChildAssocEntity.java @@ -390,7 +390,7 @@ public class ChildAssocEntity } else { - Pair nsPair = qnameDAO.getOrCreateNamespace(assocQNameNamespace); + Pair nsPair = qnameDAO.getNamespace(assocQNameNamespace); if (nsPair == null) { // We can't set anything diff --git a/source/java/org/alfresco/repo/imap/exception/AlfrescoImapFolderException.java b/source/java/org/alfresco/repo/imap/exception/AlfrescoImapFolderException.java index 2ec9ea377c..7fa76d19da 100644 --- a/source/java/org/alfresco/repo/imap/exception/AlfrescoImapFolderException.java +++ b/source/java/org/alfresco/repo/imap/exception/AlfrescoImapFolderException.java @@ -30,7 +30,7 @@ public class AlfrescoImapFolderException extends FolderException private static final long serialVersionUID = -2721708848846740336L; - public final static String PERMISSION_DENIED = 
"Can't create folder - Permission denied"; + public final static String PERMISSION_DENIED = "Cannot perform action - Permission denied"; public AlfrescoImapFolderException(String message) { diff --git a/source/java/org/alfresco/repo/management/subsystems/CompositeDataBean.java b/source/java/org/alfresco/repo/management/subsystems/CompositeDataBean.java index 2c5442314c..99fb695752 100644 --- a/source/java/org/alfresco/repo/management/subsystems/CompositeDataBean.java +++ b/source/java/org/alfresco/repo/management/subsystems/CompositeDataBean.java @@ -216,7 +216,31 @@ public class CompositeDataBean extends AbstractPropertyBackedBean // Ensure any edits to child composites cause the parent to be shut down and subsequently re-initialized if (broadcast) { - this.owner.stop(); + // Avoid holding this object's lock to prevent potential deadlock with parent + boolean hadWriteLock = this.lock.isWriteLockedByCurrentThread(); + if (hadWriteLock) + { + this.lock.writeLock().unlock(); + } + else + { + this.lock.readLock().unlock(); + } + try + { + this.owner.stop(); + } + finally + { + if (hadWriteLock) + { + this.lock.writeLock().lock(); + } + else + { + this.lock.readLock().lock(); + } + } } } diff --git a/source/java/org/alfresco/repo/model/ml/EditionServiceImpl.java b/source/java/org/alfresco/repo/model/ml/EditionServiceImpl.java index 08edd36327..ff3eb02484 100644 --- a/source/java/org/alfresco/repo/model/ml/EditionServiceImpl.java +++ b/source/java/org/alfresco/repo/model/ml/EditionServiceImpl.java @@ -112,7 +112,7 @@ public class EditionServiceImpl implements EditionService } // get the properties to add to the edition history - addPropertiesToVersion(versionProperties, mlContainerToVersion); + addPropertiesToVersion(versionProperties, startingTranslationNodeRef); // Version the container and its translations versionService.createVersion(mlContainerToVersion, versionProperties, true); diff --git a/source/java/org/alfresco/repo/model/ml/tools/EditionServiceImplTest.java 
b/source/java/org/alfresco/repo/model/ml/tools/EditionServiceImplTest.java index db8a111cdf..f24e0cceb1 100644 --- a/source/java/org/alfresco/repo/model/ml/tools/EditionServiceImplTest.java +++ b/source/java/org/alfresco/repo/model/ml/tools/EditionServiceImplTest.java @@ -20,13 +20,18 @@ package org.alfresco.repo.model.ml.tools; import java.io.Serializable; import java.util.ArrayList; +import java.util.Collection; import java.util.HashMap; import java.util.List; import java.util.Locale; import java.util.Map; +import java.util.Set; import org.alfresco.model.ContentModel; import org.alfresco.repo.version.VersionModel; +import org.alfresco.service.cmr.repository.ContentReader; +import org.alfresco.service.cmr.repository.ContentService; +import org.alfresco.service.cmr.repository.ContentWriter; import org.alfresco.service.cmr.repository.NodeRef; import org.alfresco.service.cmr.version.Version; import org.alfresco.service.cmr.version.VersionHistory; @@ -44,6 +49,15 @@ public class EditionServiceImplTest extends AbstractMultilingualTestCases private static String CHINESE_CONTENT = "CHINESE_CONTENT"; private static String JAPANESE_CONTENT = "JAPANESE_CONTENT"; + private ContentService contentService; + + @Override + protected void setUp() throws Exception + { + super.setUp(); + contentService = serviceRegistry.getContentService(); + } + public void testAutoEdition() throws Exception { // create a mlContainer with some content @@ -161,6 +175,92 @@ public class EditionServiceImplTest extends AbstractMultilingualTestCases public void testReadVersionedProperties() throws Exception { + } + + //ALF-6275 + public void testEditionServiceWithContent() + { + // create a mlContainer with some content + NodeRef mlContainerNodeRef = createMLContainerWithContent("0.1"); + // get the french translation + NodeRef frenchContentNodeRef = multilingualContentService.getTranslationForLocale(mlContainerNodeRef, + Locale.FRENCH); + checkContentVersionValuesForEditions(mlContainerNodeRef); + // 
create a new edition starting from the french translation + NodeRef newStartingPoint = editionService.createEdition(frenchContentNodeRef, null); + frenchContentNodeRef = multilingualContentService.getTranslationForLocale(mlContainerNodeRef, Locale.FRENCH); + modifyContent(frenchContentNodeRef, FRENCH_CONTENT + "-" + Locale.FRENCH + "-" + "0.2" + "-" + "0.2"); + checkContentVersionValuesForEditions(mlContainerNodeRef); + modifyContent(frenchContentNodeRef, FRENCH_CONTENT + "-" + Locale.FRENCH + "-" + "0.2" + "-" + "0.3"); + checkContentVersionValuesForEditions(mlContainerNodeRef); + modifyContent(frenchContentNodeRef, FRENCH_CONTENT + "-" + Locale.FRENCH + "-" + "0.2" + "-" + "0.4"); + // checkContentVersionValuesForEditions(mlContainerNodeRef); + frenchContentNodeRef = multilingualContentService.getTranslationForLocale(mlContainerNodeRef, Locale.FRENCH); + + newStartingPoint = editionService.createEdition(frenchContentNodeRef, null); + frenchContentNodeRef = multilingualContentService.getTranslationForLocale(mlContainerNodeRef, Locale.FRENCH); + modifyContent(frenchContentNodeRef, FRENCH_CONTENT + "-" + Locale.FRENCH + "-" + "0.3" + "-" + "0.2"); + checkContentVersionValuesForEditions(mlContainerNodeRef); + modifyContent(frenchContentNodeRef, FRENCH_CONTENT + "-" + Locale.FRENCH + "-" + "0.3" + "-" + "0.3"); + checkContentVersionValuesForEditions(mlContainerNodeRef); + NodeRef chineseContentNodeRef = createContent(CHINESE_CONTENT + "-" + Locale.CHINESE + "-" + "0.3" + "-0.1"); + multilingualContentService.addTranslation(chineseContentNodeRef, mlContainerNodeRef, Locale.CHINESE); + checkContentVersionValuesForEditions(mlContainerNodeRef); + NodeRef japaneseContentNodeRef = createContent(JAPANESE_CONTENT + "-" + Locale.JAPANESE + "-" + "0.3" + "-0.1"); + multilingualContentService.addTranslation(japaneseContentNodeRef, mlContainerNodeRef, Locale.JAPANESE); + checkContentVersionValuesForEditions(mlContainerNodeRef); + + japaneseContentNodeRef = 
multilingualContentService + .getTranslationForLocale(mlContainerNodeRef, Locale.JAPANESE); + japaneseContentNodeRef = editionService.createEdition(japaneseContentNodeRef, null); + checkContentVersionValuesForEditions(mlContainerNodeRef); + japaneseContentNodeRef = multilingualContentService + .getTranslationForLocale(mlContainerNodeRef, Locale.JAPANESE); + modifyContent(japaneseContentNodeRef, JAPANESE_CONTENT + "-" + Locale.JAPANESE + "-" + "0.4" + "-0.2"); + chineseContentNodeRef = createContent(CHINESE_CONTENT + "-" + Locale.CHINESE + "-" + "0.4" + "-0.1"); + + multilingualContentService.addTranslation(chineseContentNodeRef, mlContainerNodeRef, Locale.CHINESE); + checkContentVersionValuesForEditions(mlContainerNodeRef); + frenchContentNodeRef = createContent(FRENCH_CONTENT + "-" + Locale.FRENCH + "-" + "0.4" + "-0.1"); + multilingualContentService.addTranslation(frenchContentNodeRef, mlContainerNodeRef, Locale.FRENCH); + checkContentVersionValuesForEditions(mlContainerNodeRef); + + frenchContentNodeRef = multilingualContentService.getTranslationForLocale(mlContainerNodeRef, Locale.FRENCH); + newStartingPoint = editionService.createEdition(frenchContentNodeRef, null); + frenchContentNodeRef = multilingualContentService.getTranslationForLocale(mlContainerNodeRef, Locale.FRENCH); + modifyContent(frenchContentNodeRef, FRENCH_CONTENT + "-" + Locale.FRENCH + "-" + "0.5" + "-" + "0.2"); + checkContentVersionValuesForEditions(mlContainerNodeRef); + + + japaneseContentNodeRef = multilingualContentService.getTranslationForLocale(mlContainerNodeRef, Locale.JAPANESE); + HashMap versionProperties = new HashMap(); + versionProperties.put(VersionModel.PROP_VERSION_TYPE, VersionType.MAJOR); + japaneseContentNodeRef = editionService.createEdition(japaneseContentNodeRef, versionProperties); + + Collection editions = editionService.getEditions(mlContainerNodeRef).getAllVersions(); + Version secondEdition = editions.iterator().next(); + // Ensure that the version label is 2.0 + 
assertTrue("The edition label should be 1.0 but was " + secondEdition.getVersionLabel(), secondEdition + .getVersionLabel().equals("1.0")); + frenchContentNodeRef = createContent(FRENCH_CONTENT + "-" + Locale.FRENCH + "-" + "1.0" + "-0.1"); + modifyContent(frenchContentNodeRef, FRENCH_CONTENT + "-" + Locale.FRENCH + "-" + "1.0" + "-" + "0.2"); + checkContentVersionValuesForEditions(mlContainerNodeRef); + + frenchContentNodeRef = multilingualContentService.getTranslationForLocale(mlContainerNodeRef, Locale.FRENCH); + + versionProperties = new HashMap(); + versionProperties.put(VersionModel.PROP_VERSION_TYPE, VersionType.MAJOR); + frenchContentNodeRef = editionService.createEdition(frenchContentNodeRef, versionProperties); + + editions = editionService.getEditions(mlContainerNodeRef).getAllVersions(); + secondEdition = editions.iterator().next(); + // Ensure that the version label is 2.0 + assertTrue("The edition label should be 2.0 but was " + secondEdition.getVersionLabel(), secondEdition + .getVersionLabel().equals("2.0")); + modifyContent(frenchContentNodeRef, FRENCH_CONTENT + "-" + Locale.FRENCH + "-" + "2.0" + "-" + "0.2"); + checkContentVersionValuesForEditions(mlContainerNodeRef); + + } private void checkFirstVersion(NodeRef mlContainerNodeRef) @@ -200,4 +300,280 @@ public class EditionServiceImplTest extends AbstractMultilingualTestCases return multilingualContentService.getTranslationContainer(chineseContentNodeRef); } + + private void checkContentVersionValuesForEditions(NodeRef mlContainerNodeRef) + { + // the convention applied for this test is that the content MUST end up with + // _LOCALE_EDITION-VERSION_INDIVIDUAL-VERSION + // get the list of editions + VersionHistory editionHistory = editionService.getEditions(mlContainerNodeRef); + + // Ensure that the first edition of the mlContainer is created + assertNotNull("The edition history can't be null", editionHistory); + // Ensure that it contains at least one edition + assertTrue("The edition history 
must contain only one edition", editionHistory.getAllVersions().size() >= 1); + + // iterate on editions + for (Version editionVersion : editionHistory.getAllVersions()) + { + // getting the edition label + String editionLabel = editionVersion.getVersionLabel(); + // check if it is the head version + if (editionHistory.getHeadVersion() == editionVersion) + { + // this is the living edition + System.out.println("Head version edition:" + editionLabel); + // dump content of head edition + Map translations = multilingualContentService.getTranslations(mlContainerNodeRef); + Locale pivotLocale = (Locale) nodeService.getProperty(mlContainerNodeRef, ContentModel.PROP_LOCALE); + Set keySet = translations.keySet(); + for (Locale locale : keySet) + { + NodeRef translatedNode = translations.get(locale); + // get content as string and compare + ContentReader reader = contentService.getReader(translatedNode, ContentModel.PROP_CONTENT); + String liveContent = reader.getContentString(); + Triplet parsedTriplet = new Triplet(liveContent); + System.out.println("Content:" + liveContent); + + // get all the version of current translation + VersionHistory versionHistory = versionService.getVersionHistory(translatedNode); + for (Version version : versionHistory.getAllVersions()) + { + NodeRef frozenNodeRef = version.getFrozenStateNodeRef(); + String versionLabel = version.getVersionLabel(); + Locale versionLocale = (Locale) nodeService + .getProperty(frozenNodeRef, ContentModel.PROP_LOCALE); + // get content as string and compare + reader = contentService.getReader(frozenNodeRef, ContentModel.PROP_CONTENT); + String versionnedContent = reader.getContentString(); + System.out.println("Individual version " + versionLabel + ":" + versionnedContent); + if (versionService.getCurrentVersion(translatedNode).getFrozenStateNodeRef().equals( + version.getFrozenStateNodeRef())) + { + // this is the head version of the translation therefore content should be equal + assertTrue( + "The content in 
head version should be equal to the content of the translation:", + versionnedContent.equals(liveContent)); + } + // checking if content respects conventions XXX*locale*edition_version_*document_version + // the exception should be the version used to start the new edition with + // exception exist for root version because root version can be the first created + // and if is the pivot language then its content is a copy of the previous edition. + // This breaks the conventions and other checks must be done + if ((versionHistory.getRootVersion().getFrozenStateNodeRef().equals(version + .getFrozenStateNodeRef())) + && (pivotLocale.equals(versionLocale))) + { + System.out.println("Some special on live version has to be done:" + versionnedContent); + // get previous edition + Version previousEditionVersion = editionHistory.getPredecessor(editionVersion); + + if (previousEditionVersion != null) + { + String previousEditionLabel = previousEditionVersion.getVersionLabel(); + System.out.println("Current edition Label:" + editionLabel + " Previous edition label:" + + previousEditionLabel); + List versionTranslations = editionService + .getVersionedTranslations(previousEditionVersion); + // for all languages iterate to find the corresponding language + for (VersionHistory versionTranslation : versionTranslations) + { + // most recent first + Version newestVersion = versionTranslation.getHeadVersion(); + NodeRef newestVersionFrozenNodeRef = newestVersion.getFrozenStateNodeRef(); + String newestVersionVersionLabel = newestVersion.getVersionLabel(); + ContentReader readerContentWeStartedWith = contentService.getReader( + newestVersionFrozenNodeRef, ContentModel.PROP_CONTENT); + Locale oldestVersionLocale = (Locale) nodeService.getProperty( + newestVersionFrozenNodeRef, ContentModel.PROP_LOCALE); + String contentWeStartedWith = readerContentWeStartedWith.getContentString(); + System.out.println("CONTENT:" + contentWeStartedWith); + if (versionLocale.equals(oldestVersionLocale)) 
+ { + // content should match + assertTrue( + "The content in head version should be equal to the content we started with:", + contentWeStartedWith.equals(versionnedContent)); + } + } + } + + } + else + { + // it is not a root version therefore it should respect the conventions + Triplet testTriplet = new Triplet(versionnedContent); + assertTrue(testTriplet.locale.equals(versionLocale.toString())); + assertTrue(testTriplet.edition.equals(editionLabel)); + assertTrue(testTriplet.version.equals(versionLabel)); + } + + } + } + } + else + { + // get pivot language of the current versionned edition + // This is not the current/head edition + Version nextEditionVersion = editionHistory.getSuccessors(editionVersion).iterator().next(); + + //get Next verion label + String nextEditionLabel = nextEditionVersion.getVersionLabel(); + System.out.println("Edition:" + editionLabel + " Next edition label:" + nextEditionLabel); + // get the translations of the version + List versionTranslations = editionService.getVersionedTranslations(editionVersion); + // iterate on versionTranslations (all languages) + + //strange that we have to go to the next edition to find the current pivot language. 
+ //maybe there is a reason for that but it is not logical + dumpFrozenMetaData(editionVersion); + Locale editionPivotlanguage = (Locale) (editionService.getVersionedMetadatas(editionVersion) + .get(ContentModel.PROP_LOCALE)); + System.out.println("Edition:" + editionLabel + " Previous pivot language:" + editionPivotlanguage + " Current pivot language:" + editionPivotlanguage); + for (VersionHistory versionTranslation : versionTranslations) + { + Collection versions = versionTranslation.getAllVersions(); + // for a language, iterate on all versions + for (Version version : versions) + { + NodeRef frozenNodeRef = version.getFrozenStateNodeRef(); + String versionLabel = version.getVersionLabel(); + // get content language + Locale currentVersionLocale = (Locale) nodeService.getProperty(frozenNodeRef, + ContentModel.PROP_LOCALE); + System.out.println("Current version locale:" + currentVersionLocale + + " Previous edition locale:" + editionPivotlanguage); + // get content as string and compare + ContentReader reader = contentService.getReader(frozenNodeRef, ContentModel.PROP_CONTENT); + String content = reader.getContentString(); + System.out.println("Content:" + content); + // checking content respects conventions XXX*locale*edition_version_*document_version + // the exception should be the version used to start the new edition with + Version initialVersion = versionTranslation.getRootVersion(); + if (initialVersion.equals(version) && currentVersionLocale.equals(editionPivotlanguage)) + { + System.out.println("Some special test has to be done:" + content); + Version previousEditionVersion = editionHistory.getPredecessor(editionVersion); + + if (previousEditionVersion != null) + { + String previousEditionLabel = previousEditionVersion.getVersionLabel(); + System.out.println("Current edition Label:" + editionLabel + " Previous edition label:" + + previousEditionLabel); + List versionTranslations2 = editionService + .getVersionedTranslations(previousEditionVersion); + 
// for all languages iterate to find the corresponding language + for (VersionHistory versionTranslation2 : versionTranslations2) + { + // most recent first + Version newestVersion = versionTranslation2.getHeadVersion(); + NodeRef newestVersionFrozenNodeRef = newestVersion.getFrozenStateNodeRef(); + String newestVersionVersionLabel = newestVersion.getVersionLabel(); + ContentReader readerContentWeStartedWith = contentService.getReader( + newestVersionFrozenNodeRef, ContentModel.PROP_CONTENT); + Locale oldestVersionLocale = (Locale) nodeService.getProperty( + newestVersionFrozenNodeRef, ContentModel.PROP_LOCALE); + String contentWeStartedWith = readerContentWeStartedWith.getContentString(); + System.out.println("CONTENT:" + contentWeStartedWith); + if (currentVersionLocale.equals(oldestVersionLocale)) + { + // content should match + assertTrue( + "The content in head version should be equal to the content we started with:", + contentWeStartedWith.equals(content)); + } + } + } + else + { + // normal invariant here because it is not the initial version + Triplet testTriplet = new Triplet(content); + assertTrue(testTriplet.locale.equals(currentVersionLocale.toString())); + assertTrue(testTriplet.edition.equals(editionLabel)); + assertTrue(testTriplet.version.equals(versionLabel)); + } + } + } + } + } + + } + } + private + NodeRef createMLContainerWithContent(String editionSuffix) + { + NodeRef chineseContentNodeRef = createContent(CHINESE_CONTENT + "-" + Locale.CHINESE + "-" + editionSuffix + + "-0.1"); + NodeRef frenchContentNodeRef = createContent(FRENCH_CONTENT + "-" + Locale.FRENCH + "-" + editionSuffix + + "-0.1"); + NodeRef japaneseContentNodeRef = createContent(JAPANESE_CONTENT + "-" + Locale.JAPANESE + "-" + editionSuffix + + "-0.1"); + + multilingualContentService.makeTranslation(chineseContentNodeRef, Locale.CHINESE); + multilingualContentService.addTranslation(frenchContentNodeRef, chineseContentNodeRef, Locale.FRENCH); + 
multilingualContentService.addTranslation(japaneseContentNodeRef, chineseContentNodeRef, Locale.JAPANESE); + + return multilingualContentService.getTranslationContainer(chineseContentNodeRef); + } + + private void modifyContent(NodeRef nodeRef, String value) + { + ContentWriter contentWriter = contentService.getWriter(nodeRef, ContentModel.PROP_CONTENT, true); + contentWriter.putContent(value); + } + + private void dumpFrozenMetaData(Version editionVersion) + { + System.out.println("---------------------------------------------------"); + //Get current version label + System.out.println("Version Label: " + editionVersion.getVersionLabel()); + System.out.println("---------------------------------------------------"); + //Map mapOfFrozenProps = editionService.getVersionedMetadatas(editionVersion); + Map mapOfFrozenProps = editionVersion.getVersionProperties(); + if(mapOfFrozenProps == null ) + { + System.out.println("Null... "); + return; + } + + for(String q: mapOfFrozenProps.keySet()) + { + String val = mapOfFrozenProps.get(q)==null?"null":mapOfFrozenProps.get(q).toString(); + System.out.println("QName:" + q + ":" + val); + } + } + + + /** + * Parse the content to extract the locale, edition label, version label + * + * @author Philippe Dubois + */ + private class Triplet + { + public String locale; + public String edition; + public String version; + + public Triplet(String content) + { + String[] tokens = content.split("-"); + locale = tokens[1]; + edition = tokens[2]; + version = tokens[3]; + } + + } + + protected NodeRef createContent(String name) + { + NodeRef contentNodeRef = fileFolderService.create(folderNodeRef, name+".txt", ContentModel.TYPE_CONTENT).getNodeRef(); + // add some content + ContentWriter contentWriter = fileFolderService.getWriter(contentNodeRef); + contentWriter.putContent(name); + // done + return contentNodeRef; + } + } diff --git a/source/java/org/alfresco/repo/node/db/DbNodeServiceImpl.java 
b/source/java/org/alfresco/repo/node/db/DbNodeServiceImpl.java index bf4e9cb2f7..a536ee885a 100644 --- a/source/java/org/alfresco/repo/node/db/DbNodeServiceImpl.java +++ b/source/java/org/alfresco/repo/node/db/DbNodeServiceImpl.java @@ -2437,6 +2437,11 @@ public class DbNodeServiceImpl extends AbstractNodeServiceImpl propagateTimeStamps(oldParentAssocRef); propagateTimeStamps(newParentAssocRef); } + else + { + // Propagate timestamps for rename case, see ALF-10884 + propagateTimeStamps(newParentAssocRef); + } invokeOnCreateChildAssociation(newParentAssocRef, false); invokeOnDeleteChildAssociation(oldParentAssocRef); diff --git a/source/java/org/alfresco/repo/search/impl/lucene/ADMLuceneSearcherImpl.java b/source/java/org/alfresco/repo/search/impl/lucene/ADMLuceneSearcherImpl.java index 64fc33ef8b..5eb003e7c0 100644 --- a/source/java/org/alfresco/repo/search/impl/lucene/ADMLuceneSearcherImpl.java +++ b/source/java/org/alfresco/repo/search/impl/lucene/ADMLuceneSearcherImpl.java @@ -284,25 +284,7 @@ public class ADMLuceneSearcherImpl extends AbstractLuceneBase implements LuceneS } } - - Locale getLocale(SearchParameters searchParameters) - { - List locales = searchParameters.getLocales(); - if (((locales == null) || (locales.size() == 0))) - { - locales = Collections.singletonList(I18NUtil.getLocale()); - } - - if (locales.size() > 1) - { - throw new SearcherException("Order on text/mltext properties with more than one locale is not curently supported"); - } - - Locale sortLocale = locales.get(0); - return sortLocale; - } - - String findSortField(SearchParameters searchParameters, ClosingIndexSearcher searcher, String field, Locale sortLocale) + protected String findSortField(SearchParameters searchParameters, ClosingIndexSearcher searcher, String field, Locale sortLocale) { // find best field match diff --git a/source/java/org/alfresco/repo/search/impl/lucene/ADMLuceneTest.java b/source/java/org/alfresco/repo/search/impl/lucene/ADMLuceneTest.java index 
c940b59ea8..8978b6e41a 100644 --- a/source/java/org/alfresco/repo/search/impl/lucene/ADMLuceneTest.java +++ b/source/java/org/alfresco/repo/search/impl/lucene/ADMLuceneTest.java @@ -25,6 +25,7 @@ import java.io.InputStream; import java.io.ObjectInputStream; import java.io.ObjectOutputStream; import java.io.Serializable; +import java.text.Collator; import java.text.SimpleDateFormat; import java.util.ArrayList; import java.util.Arrays; @@ -485,7 +486,8 @@ public class ADMLuceneTest extends TestCase implements DictionaryListener // note: cm:thumbnail - hence auditable aspect will be applied with mandatory properties (cm:created, cm:modified, cm:creator, cm:modifier) n15 = nodeService.createNode(n13, ASSOC_TYPE_QNAME, QName.createQName("{namespace}fifteen"), ContentModel.TYPE_THUMBNAIL, getOrderProperties()).getChildRef(); - + nodeService.setProperty(n15, ContentModel.PROP_CONTENT, new ContentData(null, "text/richtext", 0L, "UTF-8", Locale.FRENCH)); + ContentWriter writer = contentService.getWriter(n14, ContentModel.PROP_CONTENT, true); writer.setEncoding("UTF-8"); // InputStream is = @@ -495,6 +497,13 @@ public class ADMLuceneTest extends TestCase implements DictionaryListener + " as at be but by for if in into is it no not of on or such that the their then there these they this to was will with: " + " and random charcters \u00E0\u00EA\u00EE\u00F0\u00F1\u00F6\u00FB\u00FF"); // System.out.println("Size is " + writer.getSize()); + + writer = contentService.getWriter(n15, ContentModel.PROP_CONTENT, true); + writer.setEncoding("UTF-8"); + // InputStream is = + // this.getClass().getClassLoader().getResourceAsStream("test.doc"); + // writer.putContent(is); + writer.putContent(" "); nodeService.addChild(rootNodeRef, n8, ContentModel.ASSOC_CHILDREN, QName.createQName("{namespace}eight-0")); nodeService.addChild(n1, n8, ASSOC_TYPE_QNAME, QName.createQName("{namespace}eight-1")); @@ -1762,7 +1771,9 @@ public class ADMLuceneTest extends TestCase implements DictionaryListener 
ftsQueryWithCount(searcher, "PATH", "\"//.\"", 16); ftsQueryWithCount(searcher, "cm:content:brown", 1); ftsQueryWithCount(searcher, "ANDY:brown", 1); + ftsQueryWithCount(searcher, "andy:brown", 1); ftsQueryWithCount(searcher, "ANDY", "brown", 1); + ftsQueryWithCount(searcher, "andy", "brown", 1); // test date ranges - note: expected 2 results = n14 (cm:content) and n15 (cm:thumbnail) ftsQueryWithCount(searcher, "modified:*", 2, Arrays.asList(new NodeRef[]{n14,n15})); @@ -3079,6 +3090,8 @@ public class ADMLuceneTest extends TestCase implements DictionaryListener */ public void testSort() throws Exception { + Collator collator = Collator.getInstance(I18NUtil.getLocale()); + luceneFTS.pause(); buildBaseIndex(); runBaseTests(); @@ -3099,7 +3112,7 @@ public class ADMLuceneTest extends TestCase implements DictionaryListener if (current != null) { - if (current.compareTo(id) > 0) + if (collator.compare(current, id) > 0) { fail(); } @@ -3121,7 +3134,7 @@ public class ADMLuceneTest extends TestCase implements DictionaryListener String id = row.getNodeRef().getId(); if (current != null) { - if (current.compareTo(id) < 0) + if (collator.compare(current, id) < 0) { fail(); } @@ -3476,7 +3489,7 @@ public class ADMLuceneTest extends TestCase implements DictionaryListener // System.out.println( (currentBun == null ? 
"null" : NumericEncoder.encode(currentBun))+ " "+currentBun); if ((text != null) && (currentBun != null)) { - assertTrue(text.compareTo(currentBun) <= 0); + assertTrue(collator.compare(text, currentBun) <= 0); } text = currentBun; } @@ -3496,7 +3509,7 @@ public class ADMLuceneTest extends TestCase implements DictionaryListener // System.out.println(currentBun); if ((text != null) && (currentBun != null)) { - assertTrue(text.compareTo(currentBun) >= 0); + assertTrue(collator.compare(text, currentBun) >= 0); } text = currentBun; } @@ -3512,7 +3525,8 @@ public class ADMLuceneTest extends TestCase implements DictionaryListener Locale[] testLocales = new Locale[] { I18NUtil.getLocale(), Locale.ENGLISH, Locale.FRENCH }; for (Locale testLocale : testLocales) { - + Collator localisedCollator = Collator.getInstance(testLocale); + SearchParameters sp19 = new SearchParameters(); sp19.addStore(rootNodeRef.getStoreRef()); sp19.setLanguage(SearchService.LANGUAGE_LUCENE); @@ -3532,7 +3546,7 @@ public class ADMLuceneTest extends TestCase implements DictionaryListener // "+currentBun); if ((text != null) && (currentBun != null)) { - assertTrue(text.compareTo(currentBun) <= 0); + assertTrue(localisedCollator.compare(text, currentBun) <= 0); } text = currentBun; } @@ -3556,7 +3570,7 @@ public class ADMLuceneTest extends TestCase implements DictionaryListener String currentBun = mltext.getValue(testLocale); if ((text != null) && (currentBun != null)) { - assertTrue(text.compareTo(currentBun) >= 0); + assertTrue(localisedCollator.compare(text, currentBun) >= 0); } text = currentBun; } diff --git a/source/java/org/alfresco/repo/search/impl/lucene/AbstractLuceneIndexerAndSearcherFactory.java b/source/java/org/alfresco/repo/search/impl/lucene/AbstractLuceneIndexerAndSearcherFactory.java index f9fd74c941..32e1b7bc1e 100644 --- a/source/java/org/alfresco/repo/search/impl/lucene/AbstractLuceneIndexerAndSearcherFactory.java +++ 
b/source/java/org/alfresco/repo/search/impl/lucene/AbstractLuceneIndexerAndSearcherFactory.java @@ -201,6 +201,10 @@ public abstract class AbstractLuceneIndexerAndSearcherFactory extends AbstractIn private boolean contentIndexingEnabled = true; + private boolean useInMemorySort = true; + + private int maxRawResultSetSizeForInMemorySort = 1000; + /** * Private constructor for the singleton TODO: FIt in with IOC */ @@ -1041,6 +1045,38 @@ public abstract class AbstractLuceneIndexerAndSearcherFactory extends AbstractIn this.threadPoolExecutor = threadPoolExecutor; } + /** + * @return the useInMemorySort + */ + public boolean getUseInMemorySort() + { + return useInMemorySort; + } + + /** + * @param useInMemorySort the useInMemorySort to set + */ + public void setUseInMemorySort(boolean useInMemorySort) + { + this.useInMemorySort = useInMemorySort; + } + + /** + * @return the maxRawResultSetSizeForInMemorySort + */ + public int getMaxRawResultSetSizeForInMemorySort() + { + return maxRawResultSetSizeForInMemorySort; + } + + /** + * @param maxRawResultSetSizeForInMemorySort the maxRawResultSetSizeForInMemorySort to set + */ + public void setMaxRawResultSetSizeForInMemorySort(int maxRawResultSetSizeForInMemorySort) + { + this.maxRawResultSetSizeForInMemorySort = maxRawResultSetSizeForInMemorySort; + } + /** * This component is able to safely perform backups of the Lucene indexes while the server is running. *

diff --git a/source/java/org/alfresco/repo/search/impl/lucene/LuceneAlfrescoFtsQueryLanguage.java b/source/java/org/alfresco/repo/search/impl/lucene/LuceneAlfrescoFtsQueryLanguage.java index b8c2888db0..879d79cc5f 100644 --- a/source/java/org/alfresco/repo/search/impl/lucene/LuceneAlfrescoFtsQueryLanguage.java +++ b/source/java/org/alfresco/repo/search/impl/lucene/LuceneAlfrescoFtsQueryLanguage.java @@ -80,6 +80,8 @@ public class LuceneAlfrescoFtsQueryLanguage extends AbstractLuceneQueryLanguage searchParameters.getNamespace()); QueryOptions options = QueryOptions.create(searchParameters); + options.setUseInMemorySort(searchParameters.getUseInMemorySort()); + options.setMaxRawResultSetSizeForInMemorySort(searchParameters.getMaxRawResultSetSizeForInMemorySort()); FTSParser.Mode mode; diff --git a/source/java/org/alfresco/repo/search/impl/lucene/LuceneAlfrescoLuceneQueryLanguage.java b/source/java/org/alfresco/repo/search/impl/lucene/LuceneAlfrescoLuceneQueryLanguage.java index d0088b7504..2ab15de706 100644 --- a/source/java/org/alfresco/repo/search/impl/lucene/LuceneAlfrescoLuceneQueryLanguage.java +++ b/source/java/org/alfresco/repo/search/impl/lucene/LuceneAlfrescoLuceneQueryLanguage.java @@ -19,42 +19,18 @@ package org.alfresco.repo.search.impl.lucene; import java.io.IOException; -import java.util.ArrayList; -import java.util.LinkedHashMap; -import java.util.List; import java.util.Locale; -import java.util.Map; import org.alfresco.repo.dictionary.IndexTokenisationMode; import org.alfresco.repo.search.EmptyResultSet; import org.alfresco.repo.search.SearcherException; import org.alfresco.repo.search.impl.lucene.analysis.DateTimeAnalyser; -import org.alfresco.repo.search.impl.parsers.AlfrescoFunctionEvaluationContext; -import org.alfresco.repo.search.impl.parsers.FTSParser; -import org.alfresco.repo.search.impl.parsers.FTSQueryParser; -import org.alfresco.repo.search.impl.querymodel.Argument; -import org.alfresco.repo.search.impl.querymodel.Column; -import 
org.alfresco.repo.search.impl.querymodel.Constraint; -import org.alfresco.repo.search.impl.querymodel.Function; -import org.alfresco.repo.search.impl.querymodel.Order; -import org.alfresco.repo.search.impl.querymodel.Ordering; -import org.alfresco.repo.search.impl.querymodel.QueryEngine; -import org.alfresco.repo.search.impl.querymodel.QueryEngineResults; -import org.alfresco.repo.search.impl.querymodel.QueryModelFactory; -import org.alfresco.repo.search.impl.querymodel.QueryOptions; -import org.alfresco.repo.search.impl.querymodel.QueryOptions.Connective; -import org.alfresco.repo.search.impl.querymodel.impl.functions.PropertyAccessor; -import org.alfresco.repo.search.impl.querymodel.impl.functions.Score; -import org.alfresco.repo.search.impl.querymodel.impl.lucene.LuceneOrdering; import org.alfresco.repo.search.results.SortedResultSet; import org.alfresco.service.cmr.dictionary.DataTypeDefinition; import org.alfresco.service.cmr.dictionary.PropertyDefinition; -import org.alfresco.service.cmr.search.LimitBy; import org.alfresco.service.cmr.search.ResultSet; import org.alfresco.service.cmr.search.SearchParameters; import org.alfresco.service.cmr.search.SearchService; -import org.alfresco.service.cmr.search.SearchParameters.SortDefinition; -import org.alfresco.service.cmr.search.SearchParameters.SortDefinition.SortType; import org.alfresco.service.namespace.QName; import org.apache.commons.logging.Log; import org.apache.commons.logging.LogFactory; @@ -111,17 +87,18 @@ public class LuceneAlfrescoLuceneQueryLanguage extends AbstractLuceneQueryLangua Hits hits; - boolean requiresPostSort = false; + boolean requiresDateTimePostSort = false; + SortField[] fields = new SortField[searchParameters.getSortDefinitions().size()]; + if (searchParameters.getSortDefinitions().size() > 0) { int index = 0; - SortField[] fields = new SortField[searchParameters.getSortDefinitions().size()]; for (SearchParameters.SortDefinition sd : searchParameters.getSortDefinitions()) { switch 
(sd.getSortType()) { case FIELD: - Locale sortLocale = admLuceneSearcher.getLocale(searchParameters); + Locale sortLocale = searchParameters.getSortLocale(); String field = sd.getField(); if (field.startsWith("@")) { @@ -199,7 +176,7 @@ public class LuceneAlfrescoLuceneQueryLanguage extends AbstractLuceneQueryLangua switch (propertyDef.getIndexTokenisationMode()) { case TRUE: - requiresPostSort = true; + requiresDateTimePostSort = true; break; case BOTH: field = field + ".sort"; @@ -211,7 +188,7 @@ public class LuceneAlfrescoLuceneQueryLanguage extends AbstractLuceneQueryLangua } else { - requiresPostSort = true; + requiresDateTimePostSort = true; } } } @@ -230,30 +207,39 @@ public class LuceneAlfrescoLuceneQueryLanguage extends AbstractLuceneQueryLangua fields[index++] = new SortField(null, SortField.DOC, !sd.isAscending()); break; case SCORE: - fields[index++] = new SortField(null, SortField.SCORE, !sd.isAscending()); + // Score is naturally high to low -ie desc + fields[index++] = new SortField(null, SortField.SCORE, sd.isAscending()); break; } } - hits = searcher.search(query, new Sort(fields)); + } + + hits = searcher.search(query); + + boolean postSort = false;; + if(fields.length > 0) + { + postSort = searchParameters.usePostSort(hits.length(), admLuceneSearcher.getLuceneConfig().getUseInMemorySort(), admLuceneSearcher.getLuceneConfig().getMaxRawResultSetSizeForInMemorySort()); + if(postSort == false) + { + hits = searcher.search(query, new Sort(fields)); + } + } + ResultSet answer; + ResultSet result = new LuceneResultSet(hits, searcher, admLuceneSearcher.getNodeService(), admLuceneSearcher.getTenantService(), searchParameters, admLuceneSearcher.getLuceneConfig()); + if(postSort || (admLuceneSearcher.getLuceneConfig().getPostSortDateTime() && requiresDateTimePostSort)) + { + ResultSet sorted = new SortedResultSet(result, admLuceneSearcher.getNodeService(), searchParameters.getSortDefinitions(), admLuceneSearcher.getNamespacePrefixResolver(), 
admLuceneSearcher.getDictionaryService(), searchParameters.getSortLocale()); + answer = sorted; } else { - hits = searcher.search(query); - } - - ResultSet rs = new LuceneResultSet(hits, searcher, admLuceneSearcher.getNodeService(), admLuceneSearcher.getTenantService(), searchParameters, admLuceneSearcher.getLuceneConfig()); - rs = new PagingLuceneResultSet(rs, searchParameters, admLuceneSearcher.getNodeService()); - if (admLuceneSearcher.getLuceneConfig().getPostSortDateTime() && requiresPostSort) - { - ResultSet sorted = new SortedResultSet(rs, admLuceneSearcher.getNodeService(), searchParameters, admLuceneSearcher.getNamespacePrefixResolver()); - return sorted; - } - else - { - return rs; + answer = result; } + ResultSet rs = new PagingLuceneResultSet(answer, searchParameters, admLuceneSearcher.getNodeService()); + return rs; } catch (ParseException e) { diff --git a/source/java/org/alfresco/repo/search/impl/lucene/LuceneConfig.java b/source/java/org/alfresco/repo/search/impl/lucene/LuceneConfig.java index 31ef9a3c27..9e835131fe 100644 --- a/source/java/org/alfresco/repo/search/impl/lucene/LuceneConfig.java +++ b/source/java/org/alfresco/repo/search/impl/lucene/LuceneConfig.java @@ -278,6 +278,11 @@ public interface LuceneConfig */ long getMaxTransformationTime(); + /** + * @return + */ + public boolean getUseInMemorySort(); + /** * @param indexerBatchSize */ @@ -423,6 +428,11 @@ public interface LuceneConfig */ void setWriterMergeFactor(int writerMergeFactor); + /** + * @return + */ + public int getMaxRawResultSetSizeForInMemorySort(); + /** * @param writerMaxBufferedDocs */ diff --git a/source/java/org/alfresco/repo/search/impl/lucene/LuceneResultSet.java b/source/java/org/alfresco/repo/search/impl/lucene/LuceneResultSet.java index 9759c2ce0d..8652e6a099 100644 --- a/source/java/org/alfresco/repo/search/impl/lucene/LuceneResultSet.java +++ b/source/java/org/alfresco/repo/search/impl/lucene/LuceneResultSet.java @@ -315,5 +315,21 @@ public class LuceneResultSet 
extends AbstractResultSet { return bulkFetchSize; } + + /** + * @param index + * @return + */ + public int doc(int index) + { + try + { + return hits.id(index); + } + catch (IOException e) + { + throw new SearcherException(e); + } + } } diff --git a/source/java/org/alfresco/repo/search/impl/lucene/LuceneResultSetRow.java b/source/java/org/alfresco/repo/search/impl/lucene/LuceneResultSetRow.java index d0730d75d1..35e417b65f 100644 --- a/source/java/org/alfresco/repo/search/impl/lucene/LuceneResultSetRow.java +++ b/source/java/org/alfresco/repo/search/impl/lucene/LuceneResultSetRow.java @@ -152,4 +152,8 @@ public class LuceneResultSetRow extends AbstractResultSetRow throw new UnsupportedOperationException(); } + public int doc() + { + return ((LuceneResultSet)getResultSet()).doc(getIndex()); + } } diff --git a/source/java/org/alfresco/repo/search/impl/querymodel/impl/lucene/LuceneQueryEngine.java b/source/java/org/alfresco/repo/search/impl/querymodel/impl/lucene/LuceneQueryEngine.java index f027fc545d..8b0cd3a9a6 100644 --- a/source/java/org/alfresco/repo/search/impl/querymodel/impl/lucene/LuceneQueryEngine.java +++ b/source/java/org/alfresco/repo/search/impl/querymodel/impl/lucene/LuceneQueryEngine.java @@ -37,6 +37,7 @@ import org.alfresco.repo.search.impl.querymodel.QueryEngine; import org.alfresco.repo.search.impl.querymodel.QueryEngineResults; import org.alfresco.repo.search.impl.querymodel.QueryModelFactory; import org.alfresco.repo.search.impl.querymodel.QueryOptions; +import org.alfresco.repo.search.results.SortedResultSet; import org.alfresco.repo.tenant.TenantService; import org.alfresco.service.cmr.dictionary.DictionaryService; import org.alfresco.service.cmr.repository.NodeService; @@ -49,6 +50,7 @@ import org.alfresco.service.namespace.NamespaceService; import org.apache.lucene.queryParser.ParseException; import org.apache.lucene.search.Hits; import org.apache.lucene.search.Sort; +import org.apache.lucene.search.SortField; /** * @author andyh @@ -64,6 
+66,10 @@ public class LuceneQueryEngine implements QueryEngine private TenantService tenantService; private NamespaceService namespaceService; + + private boolean useInMemorySort = true; + + private int maxRawResultSetSizeForInMemorySort = 1000; /** * @param dictionaryService @@ -114,6 +120,38 @@ public class LuceneQueryEngine implements QueryEngine { return new LuceneQueryModelFactory(); } + + /** + * @return the useInMemorySort + */ + public boolean isUseInMemorySort() + { + return useInMemorySort; + } + + /** + * @param useInMemorySort the useInMemorySort to set + */ + public void setUseInMemorySort(boolean useInMemorySort) + { + this.useInMemorySort = useInMemorySort; + } + + /** + * @return the maxRawResultSetSizeForInMemorySort + */ + public int getMaxRawResultSetSizeForInMemorySort() + { + return maxRawResultSetSizeForInMemorySort; + } + + /** + * @param maxRawResultSetSizeForInMemorySort the maxRawResultSetSizeForInMemorySort to set + */ + public void setMaxRawResultSetSizeForInMemorySort(int maxRawResultSetSizeForInMemorySort) + { + this.maxRawResultSetSizeForInMemorySort = maxRawResultSetSizeForInMemorySort; + } public QueryEngineResults executeQuery(Query query, QueryOptions options, FunctionEvaluationContext functionContext) { @@ -159,6 +197,8 @@ public class LuceneQueryEngine implements QueryEngine { searchParameters.setLimitBy(LimitBy.UNLIMITED); } + searchParameters.setUseInMemorySort(options.getUseInMemorySort()); + searchParameters.setMaxRawResultSetSizeForInMemorySort(options.getMaxRawResultSetSizeForInMemorySort()); try { @@ -175,23 +215,43 @@ public class LuceneQueryEngine implements QueryEngine LuceneQueryBuilder builder = (LuceneQueryBuilder) query; org.apache.lucene.search.Query luceneQuery = builder.buildQuery(selectorGroup, luceneContext, functionContext); - // System.out.println(luceneQuery); Sort sort = builder.buildSort(selectorGroup, luceneContext, functionContext); - Hits hits; - - if (sort == null) + + Hits hits = 
searcher.search(luceneQuery); + + boolean postSort = false; + if(sort != null) { - hits = searcher.search(luceneQuery); + postSort = searchParameters.usePostSort(hits.length(), useInMemorySort, maxRawResultSetSizeForInMemorySort); + if(postSort == false) + { + hits = searcher.search(luceneQuery, sort); + } + } + + ResultSet answer; + ResultSet result = new LuceneResultSet(hits, searcher, nodeService, tenantService, searchParameters, indexAndSearcher); + if(postSort) + { + if(sort != null) + { + for(SortField sf : sort.getSort()) + { + searchParameters.addSort(sf.getField(), !sf.getReverse()); + } + } + + ResultSet sorted = new SortedResultSet(result, nodeService, builder.buildSortDefinitions(selectorGroup, luceneContext, functionContext), namespaceService, dictionaryService, searchParameters.getSortLocale()); + answer = sorted; } else { - hits = searcher.search(luceneQuery, sort); + answer = result; } - - LuceneResultSet result = new LuceneResultSet(hits, searcher, nodeService, tenantService, searchParameters, indexAndSearcher); - ResultSet rs = new PagingLuceneResultSet(result, searchParameters, nodeService); + ResultSet rs = new PagingLuceneResultSet(answer, searchParameters, nodeService); + Map<Set<String>, ResultSet> map = new HashMap<Set<String>, ResultSet>(1); map.put(selectorGroup, rs); return new QueryEngineResults(map); diff --git a/source/java/org/alfresco/repo/search/results/SortedResultSet.java b/source/java/org/alfresco/repo/search/results/SortedResultSet.java index 880e3a819a..d5fa3aa6f5 100644 --- a/source/java/org/alfresco/repo/search/results/SortedResultSet.java +++ b/source/java/org/alfresco/repo/search/results/SortedResultSet.java @@ -19,17 +19,27 @@ package org.alfresco.repo.search.results; import java.io.Serializable; +import java.text.Collator; import java.util.ArrayList; import java.util.Collections; import java.util.Comparator; import java.util.Iterator; import java.util.List; +import java.util.Locale; +import org.alfresco.error.AlfrescoRuntimeException; import
org.alfresco.repo.search.SearcherException; +import org.alfresco.repo.search.impl.lucene.LuceneResultSetRow; +import org.alfresco.service.cmr.dictionary.DataTypeDefinition; +import org.alfresco.service.cmr.dictionary.DictionaryService; +import org.alfresco.service.cmr.dictionary.PropertyDefinition; import org.alfresco.service.cmr.repository.ChildAssociationRef; +import org.alfresco.service.cmr.repository.ContentData; import org.alfresco.service.cmr.repository.InvalidNodeRefException; +import org.alfresco.service.cmr.repository.MLText; import org.alfresco.service.cmr.repository.NodeRef; import org.alfresco.service.cmr.repository.NodeService; +import org.alfresco.service.cmr.repository.datatype.DefaultTypeConverter; import org.alfresco.service.cmr.search.ResultSet; import org.alfresco.service.cmr.search.ResultSetMetaData; import org.alfresco.service.cmr.search.ResultSetRow; @@ -42,39 +52,55 @@ import org.alfresco.util.Pair; /** * Sorted results + * * @author andyh - * */ public class SortedResultSet implements ResultSet { - ArrayList<NodeRefAndScore> nodeRefsAndScores; + private ArrayList<NodeRefAndScore> nodeRefsAndScores; - NodeService nodeService; + private NodeService nodeService; - SearchParameters searchParameters; + private ResultSet resultSet; - ResultSet resultSet; + private DictionaryService dictionaryService; + + private Locale locale; + + private Collator collator; + + public SortedResultSet(ResultSet resultSet, NodeService nodeService, SearchParameters searchParametersx, NamespacePrefixResolver namespacePrefixResolver, + DictionaryService dictionaryService, Locale locale) + { + this(resultSet, nodeService, searchParametersx.getSortDefinitions(), namespacePrefixResolver, dictionaryService, locale); + } /** * Source and resources required to sort + * * @param resultSet * @param nodeService * @param searchParameters * @param namespacePrefixResolver */ - public SortedResultSet(ResultSet resultSet, NodeService nodeService, SearchParameters searchParameters, NamespacePrefixResolver
namespacePrefixResolver) + public SortedResultSet(ResultSet resultSet, NodeService nodeService, List<SortDefinition> sortDefinitions, NamespacePrefixResolver namespacePrefixResolver, + DictionaryService dictionaryService, Locale locale) { this.nodeService = nodeService; - this.searchParameters = searchParameters; this.resultSet = resultSet; + this.dictionaryService = dictionaryService; + this.locale = locale; + + collator = Collator.getInstance(this.locale); nodeRefsAndScores = new ArrayList<NodeRefAndScore>(resultSet.length()); for (ResultSetRow row : resultSet) { - nodeRefsAndScores.add(new NodeRefAndScore(row.getNodeRef(), row.getScore())); + LuceneResultSetRow lrow = (LuceneResultSetRow) row; + nodeRefsAndScores.add(new NodeRefAndScore(row.getNodeRef(), row.getScore(), lrow.doc())); } - ArrayList<AttributeOrder> order = new ArrayList<AttributeOrder>(); - for (SortDefinition sd : searchParameters.getSortDefinitions()) + ArrayList<OrderDefinition> order = new ArrayList<OrderDefinition>(); + for (SortDefinition sd : sortDefinitions) { switch (sd.getSortType()) { @@ -82,16 +108,62 @@ public class SortedResultSet implements ResultSet String field = sd.getField(); if (field.startsWith("@")) { + if (field.endsWith(".size")) + { + QName qname = expandAttributeFieldName(field.substring(0, field.length() - 5), namespacePrefixResolver); + if (qname != null) + { + PropertyDefinition propDef = dictionaryService.getProperty(qname); + if ((propDef != null) && propDef.getDataType().getName().equals(DataTypeDefinition.CONTENT)) + { + order.add(new ContentSizeOrder(qname, sd.isAscending(), nodeService)); + } + + } + } + if (field.endsWith(".mimetype")) + { + QName qname = expandAttributeFieldName(field.substring(0, field.length() - 9), namespacePrefixResolver); + if (qname != null) + { + PropertyDefinition propDef = dictionaryService.getProperty(qname); + if ((propDef != null) && propDef.getDataType().getName().equals(DataTypeDefinition.CONTENT)) + { + order.add(new ContentMimetypeOrder(qname, sd.isAscending(), nodeService, collator)); + } + + } + } QName qname =
expandAttributeFieldName(field, namespacePrefixResolver); - order.add(new AttributeOrder(qname, sd.isAscending())); + if (qname != null) + { + order.add(new AttributeOrder(qname, sd.isAscending(), nodeService, this.dictionaryService, collator, locale)); + } + } + else + { + if (field.equals("ID")) + { + order.add(new IdOrder(sd.isAscending(), collator)); + } + else if (field.equals("EXACTTYPE")) + { + order.add(new TypeOrder(sd.isAscending(), nodeService, collator)); + } + else if (field.equals("PARENT")) + { + order.add(new ParentIdOrder(sd.isAscending(), nodeService, collator)); + } + else + { + // SKIP UNKNOWN throw new AlfrescoRuntimeException("Property is not orderable: "+field); + } } break; case DOCUMENT: - // ignore - break; + order.add(new DocumentOrder(sd.isAscending())); + break; case SCORE: - // ignore - break; + order.add(new ScoreOrder(sd.isAscending())); + break; } } @@ -166,9 +238,9 @@ public class SortedResultSet implements ResultSet return new SortedResultSetRowIterator(this); } - private void orderNodes(List<NodeRefAndScore> answer, List<AttributeOrder> order) + private void orderNodes(List<NodeRefAndScore> answer, List<OrderDefinition> order) { - Collections.sort(answer, new NodeRefAndScoreComparator(nodeService, order)); + Collections.sort(answer, new NodeRefAndScoreComparator(order)); } private QName expandAttributeFieldName(String field, NamespacePrefixResolver namespacePrefixResolver) @@ -185,8 +257,16 @@ public class SortedResultSet implements ResultSet } else { + String prefix = field.substring(1, colonPosition); + + String uri = namespacePrefixResolver.getNamespaceURI(prefix); + if (uri == null) + { + return null; + } + // find the prefix - qname = QName.createQName(field.substring(1, colonPosition), field.substring(colonPosition + 1), namespacePrefixResolver); + qname = QName.createQName(prefix, field.substring(colonPosition + 1), namespacePrefixResolver); } } else @@ -198,86 +278,576 @@ public class SortedResultSet implements ResultSet static class NodeRefAndScoreComparator implements Comparator<NodeRefAndScore> { - List<AttributeOrder> order; +
private List order; - NodeService nodeService; - - NodeRefAndScoreComparator(NodeService nodeService, List order) + NodeRefAndScoreComparator(List order) { - this.nodeService = nodeService; this.order = order; } - @SuppressWarnings("unchecked") public int compare(NodeRefAndScore n1, NodeRefAndScore n2) { // Treat missing nodes as null for comparison - for (AttributeOrder attributeOrder : order) + for (OrderDefinition orderDefinition : order) { - Serializable o1; - try + int answer = orderDefinition.compare(n1, n2); + if (answer != 0) { - o1 = nodeService.getProperty(n1.nodeRef, attributeOrder.attribute); - } - catch(InvalidNodeRefException inre) - { - o1 = null; - } - Serializable o2; - try - { - o2 = nodeService.getProperty(n2.nodeRef, attributeOrder.attribute); - } - catch(InvalidNodeRefException inre) - { - o2 = null; - } - - if (o1 == null) - { - if (o2 == null) - { - continue; - } - else - { - return attributeOrder.ascending ? -1 : 1; - } + return answer; } else { - if (o2 == null) - { - return attributeOrder.ascending ? 1 : -1; - } - else - { - if ((o1 instanceof Comparable) && (o2 instanceof Comparable)) - { - return (attributeOrder.ascending ? 
1 : -1) * ((Comparable) o1).compareTo((Comparable) o2); - } - else - { - continue; - } - } + continue; } - } return 0; } } - private static class AttributeOrder + private static interface OrderDefinition + { + int compare(NodeRefAndScore n1, NodeRefAndScore n2); + } + + private static class AttributeOrder implements OrderDefinition { QName attribute; boolean ascending; - AttributeOrder(QName attribute, boolean ascending) + NodeService nodeService; + + DictionaryService dictionaryService; + + Collator collator; + + Locale locale; + + AttributeOrder(QName attribute, boolean ascending, NodeService nodeService, DictionaryService dictionaryService, Collator collator, Locale locale) { this.attribute = attribute; this.ascending = ascending; + this.nodeService = nodeService; + this.dictionaryService = dictionaryService; + this.collator = collator; + this.locale = locale; + } + + public int compare(NodeRefAndScore n1, NodeRefAndScore n2) + { + // Treat missing nodes as null for comparison + + Serializable o1; + try + { + o1 = nodeService.getProperty(n1.nodeRef, attribute); + } + catch (InvalidNodeRefException inre) + { + o1 = null; + } + Serializable o2; + try + { + o2 = nodeService.getProperty(n2.nodeRef, attribute); + } + catch (InvalidNodeRefException inre) + { + o2 = null; + } + + if (o1 == null) + { + if (o2 == null) + { + return 0; + } + else + { + return ascending ? -1 : 1; + } + } + else + { + if (o2 == null) + { + return ascending ? 1 : -1; + } + else + { + PropertyDefinition propertyDefinition = dictionaryService.getProperty(attribute); + if (propertyDefinition != null) + { + DataTypeDefinition dataType = propertyDefinition.getDataType(); + if (dataType.getName().equals(DataTypeDefinition.TEXT)) + { + String s1 = DefaultTypeConverter.INSTANCE.convert(String.class, o1); + String s2 = DefaultTypeConverter.INSTANCE.convert(String.class, o2); + int answer = (ascending ? 
1 : -1) * collator.compare(s1, s2); + return answer; + } + else if (dataType.getName().equals(DataTypeDefinition.MLTEXT)) + { + String s1 = DefaultTypeConverter.INSTANCE.convert(MLText.class, o1).getValue(locale); + String s2 = DefaultTypeConverter.INSTANCE.convert(MLText.class, o2).getValue(locale); + + if (s1 == null) + { + if (s2 == null) + { + return 0; + } + else + { + return ascending ? -1 : 1; + } + } + else + { + if (s2 == null) + { + return ascending ? 1 : -1; + } + else + { + int answer = (ascending ? 1 : -1) * collator.compare(s1, s2); + return answer; + } + } + } + else + { + if ((o1 instanceof Comparable) && (o2 instanceof Comparable)) + { + int answer = (ascending ? 1 : -1) * ((Comparable) o1).compareTo((Comparable) o2); + return answer; + + } + else + { + return 0; + } + } + } + else + { + if ((o1 instanceof Comparable) && (o2 instanceof Comparable)) + { + int answer = (ascending ? 1 : -1) * ((Comparable) o1).compareTo((Comparable) o2); + return answer; + + } + else + { + return 0; + } + } + } + } + + } + } + + private static class ContentSizeOrder implements OrderDefinition + { + QName attribute; + + boolean ascending; + + NodeService nodeService; + + ContentSizeOrder(QName attribute, boolean ascending, NodeService nodeService) + { + this.attribute = attribute; + this.ascending = ascending; + this.nodeService = nodeService; + + } + + public int compare(NodeRefAndScore n1, NodeRefAndScore n2) + { + // Treat missing nodes as null for comparison + + Serializable o1; + try + { + o1 = nodeService.getProperty(n1.nodeRef, attribute); + } + catch (InvalidNodeRefException inre) + { + o1 = null; + } + Serializable o2; + try + { + o2 = nodeService.getProperty(n2.nodeRef, attribute); + } + catch (InvalidNodeRefException inre) + { + o2 = null; + } + + if (o1 == null) + { + if (o2 == null) + { + return 0; + } + else + { + return ascending ? -1 : 1; + } + } + else + { + if (o2 == null) + { + return ascending ? 
1 : -1; + } + else + { + + ContentData cd1 = DefaultTypeConverter.INSTANCE.convert(ContentData.class, o1); + ContentData cd2 = DefaultTypeConverter.INSTANCE.convert(ContentData.class, o2); + + if (cd1 == null) + { + if (cd2 == null) + { + return 0; + } + else + { + return ascending ? -1 : 1; + } + } + else + { + if (cd2 == null) + { + return ascending ? 1 : -1; + } + else + { + return (ascending ? 1 : -1) * (int)(cd1.getSize() - cd2.getSize()); + } + } + } + } + + } + } + + private static class ContentMimetypeOrder implements OrderDefinition + { + QName attribute; + + boolean ascending; + + NodeService nodeService; + + Collator collator; + + ContentMimetypeOrder(QName attribute, boolean ascending, NodeService nodeService, Collator collator) + { + this.attribute = attribute; + this.ascending = ascending; + this.nodeService = nodeService; + this.collator = collator; + } + + public int compare(NodeRefAndScore n1, NodeRefAndScore n2) + { + // Treat missing nodes as null for comparison + + Serializable o1; + try + { + o1 = nodeService.getProperty(n1.nodeRef, attribute); + } + catch (InvalidNodeRefException inre) + { + o1 = null; + } + Serializable o2; + try + { + o2 = nodeService.getProperty(n2.nodeRef, attribute); + } + catch (InvalidNodeRefException inre) + { + o2 = null; + } + + if (o1 == null) + { + if (o2 == null) + { + return 0; + } + else + { + return ascending ? -1 : 1; + } + } + else + { + if (o2 == null) + { + return ascending ? 1 : -1; + } + else + { + + ContentData cd1 = DefaultTypeConverter.INSTANCE.convert(ContentData.class, o1); + ContentData cd2 = DefaultTypeConverter.INSTANCE.convert(ContentData.class, o2); + + if (cd1 == null) + { + if (cd2 == null) + { + return 0; + } + else + { + return ascending ? -1 : 1; + } + } + else + { + if (cd2 == null) + { + return ascending ? 1 : -1; + } + else + { + return (ascending ? 
1 : -1) * collator.compare(cd1.getMimetype(), cd2.getMimetype()); + } + } + } + } + + } + } + + private static class IdOrder implements OrderDefinition + { + boolean ascending; + + Collator collator; + + IdOrder(boolean ascending, Collator collator) + { + this.ascending = ascending; + this.collator = collator; + } + + public int compare(NodeRefAndScore n1, NodeRefAndScore n2) + { + // Treat missing nodes as null for comparison + + String o1 = n1.nodeRef.toString(); + String o2 = n2.nodeRef.toString(); + + if (o1 == null) + { + if (o2 == null) + { + return 0; + } + else + { + return ascending ? -1 : 1; + } + } + else + { + if (o2 == null) + { + return ascending ? 1 : -1; + } + else + { + + int answer = (ascending ? 1 : -1) * collator.compare(o1, o2); + return answer; + } + } + } + } + + private static class ScoreOrder implements OrderDefinition + { + boolean ascending; + + ScoreOrder(boolean ascending) + { + this.ascending = ascending; + } + + public int compare(NodeRefAndScore n1, NodeRefAndScore n2) + { + // Treat missing nodes as null for comparison + return (ascending ? 1 : -1) * Float.compare(n1.score, n2.score); + + } + } + + private static class DocumentOrder implements OrderDefinition + { + boolean ascending; + + DocumentOrder(boolean ascending) + { + this.ascending = ascending; + } + + public int compare(NodeRefAndScore n1, NodeRefAndScore n2) + { + // Treat missing nodes as null for comparison + return (ascending ? 
1 : -1) * Float.compare(n1.doc, n2.doc); + + } + } + + private static class TypeOrder implements OrderDefinition + { + boolean ascending; + + NodeService nodeService; + + Collator collator; + + TypeOrder(boolean ascending, NodeService nodeService, Collator collator) + { + this.ascending = ascending; + this.nodeService = nodeService; + this.collator = collator; + } + + public int compare(NodeRefAndScore n1, NodeRefAndScore n2) + { + // Treat missing nodes as null for comparison + + String o1; + try + { + o1 = nodeService.getType(n1.nodeRef).toString(); + } + catch (InvalidNodeRefException inre) + { + o1 = null; + } + String o2; + try + { + o2 = nodeService.getType(n2.nodeRef).toString(); + } + catch (InvalidNodeRefException inre) + { + o2 = null; + } + + if (o1 == null) + { + if (o2 == null) + { + return 0; + } + else + { + return ascending ? -1 : 1; + } + } + else + { + if (o2 == null) + { + return ascending ? 1 : -1; + } + else + { + + int answer = (ascending ? 1 : -1) * collator.compare(o1, o2); + return answer; + } + } + + } + } + + private static class ParentIdOrder implements OrderDefinition + { + boolean ascending; + + NodeService nodeService; + + Collator collator; + + ParentIdOrder(boolean ascending, NodeService nodeService, Collator collator) + { + this.ascending = ascending; + this.nodeService = nodeService; + this.collator = collator; + } + + public int compare(NodeRefAndScore n1, NodeRefAndScore n2) + { + // Treat missing nodes as null for comparison + + String o1 = null; + ; + try + { + ChildAssociationRef ca1 = nodeService.getPrimaryParent(n1.nodeRef); + if ((ca1 != null) && (ca1.getParentRef() != null)) + { + o1 = ca1.getParentRef().toString(); + } + } + catch (InvalidNodeRefException inre) + { + o1 = null; + } + String o2 = null; + try + { + ChildAssociationRef ca2 = nodeService.getPrimaryParent(n2.nodeRef); + if ((ca2 != null) && (ca2.getParentRef() != null)) + { + o2 = ca2.getParentRef().toString(); + } + } + catch (InvalidNodeRefException inre) + 
+            {
+                o2 = null;
+            }
+
+            if (o1 == null)
+            {
+                if (o2 == null)
+                {
+                    return 0;
+                }
+                else
+                {
+                    return ascending ? -1 : 1;
+                }
+            }
+            else
+            {
+                if (o2 == null)
+                {
+                    return ascending ? 1 : -1;
+                }
+                else
+                {
+                    int answer = (ascending ? 1 : -1) * collator.compare(o1, o2);
+                    return answer;
+                }
+            }
+        }
+    }
 }

@@ -287,10 +857,13 @@ public class SortedResultSet implements ResultSet
 
         float score;
 
-        NodeRefAndScore(NodeRef nodeRef, float score)
+        int doc;
+
+        NodeRefAndScore(NodeRef nodeRef, float score, int doc)
         {
             this.nodeRef = nodeRef;
             this.score = score;
+            this.doc = doc;
         }
     }

@@ -304,7 +877,7 @@ public class SortedResultSet implements ResultSet
     {
         throw new UnsupportedOperationException();
     }
-    
+
     /**
      * Bulk fetch results in the cache
      *
@@ -312,7 +885,7 @@ public class SortedResultSet implements ResultSet
      */
     public boolean setBulkFetch(boolean bulkFetch)
     {
-        return resultSet.setBulkFetch(bulkFetch); 
+        return resultSet.setBulkFetch(bulkFetch);
     }
 
     /**
@@ -332,7 +905,7 @@ public class SortedResultSet implements ResultSet
      */
     public int setBulkFetchSize(int bulkFetchSize)
     {
-        return resultSet.setBulkFetchSize(bulkFetchSize); 
+        return resultSet.setBulkFetchSize(bulkFetchSize);
     }
 
     /**
diff --git a/source/java/org/alfresco/repo/security/authority/AuthorityDAOImpl.java b/source/java/org/alfresco/repo/security/authority/AuthorityDAOImpl.java
index dff690f4f4..08c246f524 100644
--- a/source/java/org/alfresco/repo/security/authority/AuthorityDAOImpl.java
+++ b/source/java/org/alfresco/repo/security/authority/AuthorityDAOImpl.java
@@ -962,8 +962,16 @@ public class AuthorityDAOImpl implements AuthorityDAO, NodeServicePolicies.Befor
         }
         else
         {
-            List<ChildAssociationRef> cars = nodeService.getChildAssocs(nodeRef, RegexQNamePattern.MATCH_ALL,
-                    RegexQNamePattern.MATCH_ALL, false);
+            List<ChildAssociationRef> cars = childAuthorityCache.get(nodeRef);
+            if (cars == null)
+            {
+                cars = nodeService.getChildAssocs(nodeRef, RegexQNamePattern.MATCH_ALL,
+                        RegexQNamePattern.MATCH_ALL, false);
+                if (!cars.isEmpty() && cars.get(0).getTypeQName().equals(ContentModel.ASSOC_MEMBER))
+                {
+                    childAuthorityCache.put(nodeRef, cars);
+                }
+            }
 
             // Take advantage of the fact that the authority name is on the child association
             for (ChildAssociationRef car : cars)
diff --git a/source/java/org/alfresco/repo/version/Version2ServiceImpl.java b/source/java/org/alfresco/repo/version/Version2ServiceImpl.java
index ba519e4bab..3c72676ce7 100644
--- a/source/java/org/alfresco/repo/version/Version2ServiceImpl.java
+++ b/source/java/org/alfresco/repo/version/Version2ServiceImpl.java
@@ -658,6 +658,10 @@ public class Version2ServiceImpl extends VersionServiceImpl implements VersionSe
                 // Freeze the details of the aspect
                 dbNodeService.addAspect(versionNodeRef, aspect, nodeDetails.getProperties(aspect));
             }
+
+            // ALF-9638: Freeze the aspect specific associations
+            freezeChildAssociations(versionNodeRef, nodeDetails.getChildAssociations(aspect));
+            freezeAssociations(versionNodeRef, nodeDetails.getAssociations(aspect));
         }
     }
diff --git a/source/java/org/alfresco/repo/workflow/WorkflowInterpreter.java b/source/java/org/alfresco/repo/workflow/WorkflowInterpreter.java
index 3a970ae7b0..b1dd512e28 100644
--- a/source/java/org/alfresco/repo/workflow/WorkflowInterpreter.java
+++ b/source/java/org/alfresco/repo/workflow/WorkflowInterpreter.java
@@ -237,8 +237,7 @@ public class WorkflowInterpreter extends BaseInterpreter
      * @return The textual output of the command.
      */
     @Override
-    protected String executeCommand(String line)
-        throws IOException
+    protected String executeCommand(String line) throws IOException
     {
         String[] command = line.split(" ");
         if (command.length == 0)
@@ -956,7 +955,7 @@ public class WorkflowInterpreter extends BaseInterpreter
         {
             return "Syntax Error.\n";
         }
-        WorkflowPath path = workflowService.signal(command[1], (command.length == 3) ? command[2] : null);
+        WorkflowPath path = workflowService.signal(command[1], getTransition(command));
         out.println("signal sent - path id: " + path.getId());
         out.print(interpretCommand("show transitions"));
     }
@@ -1260,6 +1259,23 @@ public class WorkflowInterpreter extends BaseInterpreter
         out.close();
         return retVal;
     }
+
+    private String getTransition(String[] command)
+    {
+        int length = command.length;
+        if (length < 3)
+        {
+            return null;
+        }
+        // Transition name may contain spaces
+        StringBuilder builder = new StringBuilder(command[2]);
+        int i = 3;
+        while (i < length)
+        {
+            builder.append(" ").append(command[i]);
+            i++;
+        }
+        return builder.toString();
+    }
 }

-        List<WorkflowTask> cmrPooledTasks = this.services.getWorkflowService().getPooledTasks(
-                authority);
-        ArrayList<JscriptWorkflowTask> pooledTasks = new ArrayList<JscriptWorkflowTask>();
-        for (WorkflowTask cmrPooledTask : cmrPooledTasks)
-        {
-            pooledTasks.add(new JscriptWorkflowTask(cmrPooledTask, this.services, this.getScope()));
-        }
-
-        Scriptable pooledTasksScriptable = (Scriptable)new ValueConverter().convertValueForScript(
-                this.services, getScope(), null, pooledTasks);
-        return pooledTasksScriptable;
-    }
-
-    /**
-     * Get task by id
-     *
-     * @param id task id
-     * @return the task (null if not found)
-     */
-    public JscriptWorkflowTask getTask(String id)
-    {
-        WorkflowTask cmrWorkflowTask = this.services.getWorkflowService().getTaskById(id);
-        return new JscriptWorkflowTask(cmrWorkflowTask, this.services, this.getScope());
-    }
-
-    /**
-     * Get task by id. Alternative method signature to getTask(String id) for
-     * those used to the Template API
-     *
-     * @param id task id
-     * @return the task (null if not found)
-     */
-    public JscriptWorkflowTask getTaskById(String id)
-    {
-        return getTask(id);
-    }
-
-    /**
-     * Gets the latest versions of the deployed workflow definitions
-     *
-     * @return the latest versions of the deployed workflow definitions
-     */
-    public Scriptable getLatestDefinitions()
-    {
-        List<WorkflowDefinition> cmrDefinitions = this.services.getWorkflowService().getDefinitions();
-        ArrayList<JscriptWorkflowDefinition> workflowDefs = new ArrayList<JscriptWorkflowDefinition>();
-        for (WorkflowDefinition cmrDefinition : cmrDefinitions)
-        {
-            workflowDefs.add(new JscriptWorkflowDefinition(cmrDefinition, this.services, getScope()));
-        }
-
-        Scriptable workflowDefsScriptable = (Scriptable)new ValueConverter().convertValueForScript(
-                this.services, this.getScope(), null, workflowDefs);
-        return workflowDefsScriptable;
-    }
+     */
+    public Scriptable getPooledTasks(final String authority)
+    {
+        List<WorkflowTask> cmrPooledTasks = services.getWorkflowService().getPooledTasks(authority);
+        ArrayList<JscriptWorkflowTask> pooledTasks = new ArrayList<JscriptWorkflowTask>();
+        for (WorkflowTask cmrPooledTask : cmrPooledTasks)
+        {
+            pooledTasks.add(new JscriptWorkflowTask(cmrPooledTask, services, getScope()));
+        }
+        ValueConverter converter = new ValueConverter();
+        return (Scriptable)converter.convertValueForScript(services, getScope(), null, pooledTasks);
+    }
+
+    /**
+     * Get task by id
+     *
+     * @param id task id
+     * @return the task (null if not found)
+     */
+    public JscriptWorkflowTask getTask(String id)
+    {
+        WorkflowTask task = services.getWorkflowService().getTaskById(id);
+        return task == null ? null : new JscriptWorkflowTask(task, services, this.getScope());
+    }
+
+    /**
+     * Get task by id. Alternative method signature to getTask(String id) for
+     * those used to the Template API
+     *
+     * @param id task id
+     * @return the task (null if not found)
+     */
+    public JscriptWorkflowTask getTaskById(String id)
+    {
+        return getTask(id);
+    }
+
+    /**
+     * Gets the latest versions of the deployed workflow definitions
+     *
+     * @return the latest versions of the deployed workflow definitions
+     */
+    public Scriptable getLatestDefinitions()
+    {
+        List<WorkflowDefinition> cmrDefinitions = services.getWorkflowService().getDefinitions();
+        ArrayList<JscriptWorkflowDefinition> workflowDefs = new ArrayList<JscriptWorkflowDefinition>();
+        for (WorkflowDefinition cmrDefinition : cmrDefinitions)
+        {
+            workflowDefs.add(new JscriptWorkflowDefinition(cmrDefinition, services, getScope()));
+        }
+
+        return (Scriptable)new ValueConverter().convertValueForScript(services, getScope(), null, workflowDefs);
+    }
 
-    /**
-     * Gets all versions of the deployed workflow definitions
-     *
-     * @return all versions of the deployed workflow definitions
-     */
-    public Scriptable getAllDefinitions()
-    {
-        List<WorkflowDefinition> cmrDefinitions = this.services.getWorkflowService().getAllDefinitions();
-        ArrayList<JscriptWorkflowDefinition> workflowDefs = new ArrayList<JscriptWorkflowDefinition>();
-        for (WorkflowDefinition cmrDefinition : cmrDefinitions)
-        {
-            workflowDefs.add(new JscriptWorkflowDefinition(cmrDefinition, this.services, getScope()));
-        }
-
-        Scriptable workflowDefsScriptable = (Scriptable)new ValueConverter().convertValueForScript(
-                this.services, this.getScope(), null, workflowDefs);
-        return workflowDefsScriptable;
-    }
-
-    /**
-     * Create a workflow package (a container of content to route through a workflow)
-     *
-     * @return the created workflow package
-     */
-    public ScriptNode createPackage()
-    {
-        NodeRef node = this.services.getWorkflowService().createPackage(null);
-        return new ScriptNode(node, services);
-    }
+    /**
+     * Gets all versions of the deployed workflow definitions
+     *
+     * @return all versions of the deployed workflow definitions
+     */
+    public Scriptable getAllDefinitions()
+    {
+        List<WorkflowDefinition> cmrDefinitions = services.getWorkflowService().getAllDefinitions();
+        ArrayList<JscriptWorkflowDefinition> workflowDefs = new ArrayList<JscriptWorkflowDefinition>();
+        for (WorkflowDefinition cmrDefinition : cmrDefinitions)
+        {
+            workflowDefs.add(new JscriptWorkflowDefinition(cmrDefinition, services, getScope()));
+        }
+        return (Scriptable)new ValueConverter().convertValueForScript(services, getScope(), null, workflowDefs);
+    }
+
+    /**
+     * Create a workflow package (a container of content to route through a workflow)
+     *
+     * @return the created workflow package
+     */
+    public ScriptNode createPackage()
+    {
+        NodeRef node = services.getWorkflowService().createPackage(null);
+        return new ScriptNode(node, services);
+    }
 
-    /**
-     * Get tasks assigned to the current user, filtered by workflow task state.
-     * Only tasks having the specified state will be returned.
-     *
+    /**
+     * Get tasks assigned to the current user, filtered by workflow task state.
+     * Only tasks having the specified state will be returned.
+     *
      * @param state workflow task state to filter assigned tasks by
      * @return the list of assigned tasks, filtered by state
-     */
-    private Scriptable getAssignedTasksByState(WorkflowTaskState state)
-    {
-        List<WorkflowTask> cmrAssignedTasks = this.services.getWorkflowService().getAssignedTasks(
-                services.getAuthenticationService().getCurrentUserName(), state);
-        ArrayList<JscriptWorkflowTask> assignedTasks = new ArrayList<JscriptWorkflowTask>();
-        for (WorkflowTask cmrTask : cmrAssignedTasks)
-        {
-            assignedTasks.add(new JscriptWorkflowTask(cmrTask, this.services, this.getScope()));
-        }
-
-        Scriptable assignedTasksScriptable =
-                (Scriptable)new ValueConverter().convertValueForScript(this.services, getScope(), null, assignedTasks);
-
-        return assignedTasksScriptable;
-    }
+     */
+    private Scriptable getAssignedTasksByState(WorkflowTaskState state)
+    {
+        WorkflowService workflowService = services.getWorkflowService();
+        String currentUser = services.getAuthenticationService().getCurrentUserName();
+        List<WorkflowTask> cmrAssignedTasks = workflowService.getAssignedTasks(currentUser, state);
+        ArrayList<JscriptWorkflowTask> assignedTasks = new ArrayList<JscriptWorkflowTask>();
+        for (WorkflowTask cmrTask : cmrAssignedTasks)
+        {
+            assignedTasks.add(new JscriptWorkflowTask(cmrTask, services, getScope()));
+        }
+        return (Scriptable)new ValueConverter().convertValueForScript(services, getScope(), null, assignedTasks);
+    }
 }
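The TypeOrder and ParentIdOrder comparators in the SortedResultSet hunks share one null-handling skeleton: a node whose sort key cannot be resolved (an InvalidNodeRefException) is treated as null, and nulls sort first when ascending, last when descending. Below is a minimal, self-contained sketch of that pattern over plain String keys; the class name NullFirstOrder is illustrative and does not appear in the patch.

```java
import java.text.Collator;
import java.util.Arrays;
import java.util.Comparator;
import java.util.Locale;

public class NullFirstOrder implements Comparator<String>
{
    private final boolean ascending;
    private final Collator collator;

    public NullFirstOrder(boolean ascending, Collator collator)
    {
        this.ascending = ascending;
        this.collator = collator;
    }

    public int compare(String o1, String o2)
    {
        // Unresolvable keys (nulls) sort first when ascending, last when descending
        if (o1 == null)
        {
            return (o2 == null) ? 0 : (ascending ? -1 : 1);
        }
        if (o2 == null)
        {
            return ascending ? 1 : -1;
        }
        // Flip the collator's answer for descending order
        return (ascending ? 1 : -1) * collator.compare(o1, o2);
    }

    public static void main(String[] args)
    {
        String[] keys = { "cm:content", null, "cm:folder" };
        Arrays.sort(keys, new NullFirstOrder(true, Collator.getInstance(Locale.ENGLISH)));
        System.out.println(Arrays.toString(keys)); // prints [null, cm:content, cm:folder]
    }
}
```

The same shape could also be expressed with `Comparator.nullsFirst` on modern JDKs; the patch predates that API, so the explicit branches above mirror its structure.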
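The WorkflowInterpreter change replaces `(command.length == 3) ? command[2] : null` with `getTransition(command)` so that transition names containing spaces survive `line.split(" ")`. A standalone sketch of the same token re-joining idea (the class name TransitionArgs is made up for illustration):

```java
public class TransitionArgs
{
    // Re-join tokens from index 2 onward into one transition name;
    // returns null when the command carries no transition at all.
    public static String getTransition(String[] command)
    {
        if (command.length < 3)
        {
            return null;
        }
        StringBuilder builder = new StringBuilder(command[2]);
        for (int i = 3; i < command.length; i++)
        {
            builder.append(' ').append(command[i]);
        }
        return builder.toString();
    }

    public static void main(String[] args)
    {
        // "signal <path> <transition...>" as the interpreter would tokenise it
        String[] command = { "signal", "path-1", "Approve", "Request" };
        System.out.println(getTransition(command)); // prints Approve Request
    }
}
```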
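The AuthorityDAOImpl change for ALF-9208 is a cache-aside read: consult childAuthorityCache first, fall back to the NodeService query on a miss, and populate the cache only when the result is judged reusable (the first child association is an ASSOC_MEMBER). A generic sketch of that shape, assuming a plain HashMap in place of Alfresco's cache; the class and parameter names here are illustrative, not from the patch:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.function.Predicate;

public class CacheAside<K, V>
{
    private final Map<K, List<V>> cache = new HashMap<K, List<V>>();

    // Look up via the cache; on a miss, run the expensive loader and
    // cache the result only when the cacheable test says it is safe to reuse.
    public List<V> get(K key, Function<K, List<V>> loader, Predicate<List<V>> cacheable)
    {
        List<V> cached = cache.get(key);
        if (cached == null)
        {
            cached = loader.apply(key);
            if (cacheable.test(cached))
            {
                cache.put(key, cached);
            }
        }
        return cached;
    }

    public int size()
    {
        return cache.size();
    }
}
```

The guard before `put` matters: caching an unreusable result (in the patch, a child-association list that is not group membership) would serve stale or wrong answers on later hits.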