Compare commits


185 Commits

Author SHA1 Message Date
Travis CI User
02db789084 [maven-release-plugin][skip ci] prepare release 11.13 2021-04-21 13:13:43 +00:00
Davide
8035c71ece SEARCH-2748 Bump restapi from 1.57 to 1.58 (#395) 2021-04-21 14:40:05 +02:00
Travis CI User
f2dd2c898c [maven-release-plugin][skip ci] prepare for next development iteration 2021-04-21 10:09:40 +00:00
Travis CI User
ecf4b9bfff [maven-release-plugin][skip ci] prepare release 11.12 2021-04-21 10:09:35 +00:00
Cristian Turlica
a7d8789ff4 Revert "MNT-22186: propTablesCleanupJobDetail v2 can cause Out of Memory errors (CleanAlfPropTablesV2.sql ) (#390)" (#394)
This reverts commit 5e38be6f7d.
2021-04-21 10:22:16 +03:00
Travis CI User
6c69e45b1e [maven-release-plugin][skip ci] prepare for next development iteration 2021-04-20 15:36:28 +00:00
Travis CI User
c35ee50125 [maven-release-plugin][skip ci] prepare release 11.11 2021-04-20 15:36:23 +00:00
Davide
9d711213cc SEARCH-2782 commit time as event time (#377) 2021-04-20 16:32:41 +02:00
Travis CI User
8a6a76d191 [maven-release-plugin][skip ci] prepare for next development iteration 2021-04-19 23:27:19 +00:00
Travis CI User
bd6850d012 [maven-release-plugin][skip ci] prepare release 11.10 2021-04-19 23:27:14 +00:00
dependabot-preview[bot]
fdfb7d170d Bump woodstox-core from 6.2.5 to 6.2.6 (#393) 2021-04-19 22:49:55 +00:00
Travis CI User
33b2a23dfd [maven-release-plugin][skip ci] prepare for next development iteration 2021-04-19 15:48:24 +00:00
Travis CI User
682917b948 [maven-release-plugin][skip ci] prepare release 11.9 2021-04-19 15:48:19 +00:00
Cristian Turlica
5e38be6f7d MNT-22186: propTablesCleanupJobDetail v2 can cause Out of Memory errors (CleanAlfPropTablesV2.sql ) (#390)
- added dialect check to set MySQL specific fetch size limitation (Integer.MIN_VALUE). fetchSize activates result set streaming.
- updated tests
2021-04-19 18:02:19 +03:00
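The MNT-22186 fix above hinges on a MySQL quirk: the Connector/J driver only streams a result set row-by-row when the statement fetch size is exactly `Integer.MIN_VALUE`. A minimal sketch of the dialect check follows; the class and method names are illustrative, not Alfresco's actual code.

```java
// Hypothetical sketch of the dialect-specific fetch-size selection described in
// MNT-22186. Names are illustrative, not the actual Alfresco implementation.
public class FetchSizeConfig
{
    // MySQL's JDBC driver streams results only when fetchSize == Integer.MIN_VALUE;
    // other databases use a normal positive batch size.
    public static int fetchSizeFor(String dialect, int defaultBatchSize)
    {
        if (dialect != null && dialect.toLowerCase().contains("mysql"))
        {
            return Integer.MIN_VALUE; // activates result-set streaming on MySQL
        }
        return defaultBatchSize;
    }

    public static void main(String[] args)
    {
        System.out.println(fetchSizeFor("MySQLInnoDBDialect", 1000));
        System.out.println(fetchSizeFor("PostgreSQLDialect", 1000));
    }
}
```

The selected value would then be passed to `Statement.setFetchSize(...)` before executing the cleanup query, so the driver does not buffer the entire property-table result set in memory.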
Nithin Nambiar
ef441fc2c8 ACS-276 tas test for rendition version (#358)
* ACS-276 tas test for rendition version
2021-04-15 11:24:41 +01:00
Tom Page
b9e4557973 Add unit test to look for unreferenced test classes. (#368)
* Add unit test to look for unreferenced test classes.

It's very unusual that we write a test class which should not also be added to a test suite.

* Utilise the hierarchy of NonBuildTests markers to help exclude false positives.

* Update all omitted tests.

Either add the test to AllUnitTestsSuite if it passes and runs quickly, mark the test with the appropriate NonBuildTests interface or mark the class as abstract (if appropriate).

Mark one test (RemoteTransformerClientTest) as a NeverRunTest even though it passes because it takes 12 seconds to run and is marked as deprecated.

* Mark two Camel tests as 'never run' because they failed on the CI.
2021-04-15 11:23:26 +01:00
Travis CI User
5d67d39323 [maven-release-plugin][skip ci] prepare for next development iteration 2021-04-14 08:55:55 +00:00
Travis CI User
b32e3fc0b3 [maven-release-plugin][skip ci] prepare release 11.8 2021-04-14 08:55:51 +00:00
Angel Borroy
20dd0efc6f Feature/search 2802 shared secret auth (#382)
* SEARCH-2802: Filter HTTP requests (now "none" and "secret" communication methods are available) from X509 Web Filter.

* SEARCH-2802: HttpClientFactory (for Repository and Search Services clients) support for Shared Secret communication.

* SEARCH-2802: Fix HttpClientFactory base unit tests.
2021-04-14 10:25:45 +02:00
Travis CI User
2a8811a109 [maven-release-plugin][skip ci] prepare for next development iteration 2021-04-09 13:20:38 +00:00
Travis CI User
14902b536a [maven-release-plugin][skip ci] prepare release 11.7 2021-04-09 13:20:34 +00:00
Nana Insaidoo
046116ddf0 Bugfix/repo 5610 events are not actually sent to activemq (#360)
* Add events tests

* Polished put test: connects to JMS via TCP and validates that the event sent is also received back

* Now the tests provide a simple main() that listens on the topic, useful for quick debug sessions

* Now the user name is collected in the calling thread, so that sendEvent does not silently fail

* Apply changes following review

* Now using queue system to guarantee events order

* Add license

* Updated logs and corrected comments

* Remove empty methods

* Now catering for spurious events at startup when database is bootstrapped

* Now preserving the txn-id in all events

* Moved up definitions in events2.xml after PR feedback

Co-authored-by: Bruno Bossola <bruno@meterian.com>
2021-04-09 13:34:05 +01:00
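The "collect the user name in the calling thread" point above is a general pattern: thread-local state must be captured before handing work to an executor, because the worker thread has no access to the caller's context. A minimal sketch, with hypothetical names (this is not Alfresco's actual event service):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical sketch of capturing thread-local context before async dispatch.
public class EventDispatchSketch
{
    private static final ThreadLocal<String> CURRENT_USER = new ThreadLocal<>();
    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    public Future<String> sendEvent(String eventType)
    {
        // Capture on the CALLING thread: reading CURRENT_USER inside the task
        // would run on the executor thread and return null, so the event would
        // silently carry the wrong (missing) user.
        final String user = CURRENT_USER.get();
        return executor.submit(() -> eventType + " by " + user);
    }

    public static String demo()
    {
        try
        {
            EventDispatchSketch s = new EventDispatchSketch();
            CURRENT_USER.set("admin");
            String result = s.sendEvent("NodeCreated").get();
            s.executor.shutdown();
            return result;
        }
        catch (Exception e)
        {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args)
    {
        System.out.println(demo());
    }
}
```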
Travis CI User
0ca611dcfd [maven-release-plugin][skip ci] prepare for next development iteration 2021-04-08 14:15:36 +00:00
Travis CI User
d9daeae665 [maven-release-plugin][skip ci] prepare release 11.6 2021-04-08 14:15:31 +00:00
Davide
65675b9a1d Revert "SEARCH-2782 use current transaction timestamp instead of current time as event time (#371)" (#376)
This reverts commit 28f1429a
2021-04-08 15:02:06 +02:00
dependabot-preview[bot]
dd93088c72 Bump mockito-core from 3.8.0 to 3.9.0 (#375) 2021-04-07 21:42:36 +00:00
Travis CI User
968cae0ee7 [maven-release-plugin][skip ci] prepare for next development iteration 2021-04-07 14:30:30 +00:00
Travis CI User
c28950843a [maven-release-plugin][skip ci] prepare release 11.5 2021-04-07 14:30:25 +00:00
Davide
28f1429a13 SEARCH-2782 use current transaction timestamp instead of current time as event time (#371) 2021-04-07 15:59:24 +02:00
dependabot-preview[bot]
9bd54efc10 Bump woodstox-core from 6.2.4 to 6.2.5 (#374) 2021-04-06 21:56:51 +00:00
Travis CI User
3c81ec949e [maven-release-plugin][skip ci] prepare for next development iteration 2021-04-06 12:52:40 +00:00
Travis CI User
53704a2c58 [maven-release-plugin][skip ci] prepare release 11.4 2021-04-06 12:52:36 +00:00
Alan Davis
b4455f7d60 version.schema=15000 as we will use a 1000 gap for minor versions rather than just 100 2021-04-06 10:25:53 +01:00
dependabot-preview[bot]
0617fbb0bf Bump maven-artifact from 3.6.3 to 3.8.1 (#369) 2021-04-05 21:43:17 +00:00
Travis CI User
f748334f1e [maven-release-plugin][skip ci] prepare for next development iteration 2021-04-05 18:37:13 +00:00
Travis CI User
08748e8af5 [maven-release-plugin][skip ci] prepare release 11.3 2021-04-05 18:37:09 +00:00
evasques
ce62fb1da3 MNT-20500 - Admin console breaks with serialised objects (#291)
* Added macro convertToJSON to recursively parse hashes and enumerables
* Added attempt/recover in macros to handle errors and not break the page
* Changed the output of serialized objects to JSON format
2021-04-05 17:52:15 +01:00
Nicolas Barithel
34f360211f Externalize the nodeServiceCleanup CRON expression (#326) 2021-03-31 10:17:58 +03:00
Travis CI User
b559e78827 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-30 14:29:34 +00:00
Travis CI User
1add8a0f08 [maven-release-plugin][skip ci] prepare release 11.2 2021-03-30 14:29:29 +00:00
dependabot-preview[bot]
5eb8767f4c Bump commons-lang3 from 3.11 to 3.12.0 (#324) 2021-03-30 12:52:05 +00:00
dependabot-preview[bot]
7fb3386413 Bump dependency.jackson.version from 2.12.1 to 2.12.2 (#329) 2021-03-30 12:44:26 +00:00
dependabot-preview[bot]
22945a30ea Bump commons-net from 3.7.2 to 3.8.0 (#304) 2021-03-30 12:14:48 +00:00
dependabot-preview[bot]
fc531f64ed Bump dependency.camel.version from 3.7.0 to 3.7.1 (#303) 2021-03-30 12:14:33 +00:00
dependabot-preview[bot]
98090ac48c Bump dependency.webscripts.version from 8.17 to 8.18 (#337) 2021-03-30 11:54:17 +00:00
dependabot-preview[bot]
33b64f483d Bump dependency.cxf.version from 3.4.2 to 3.4.3 (#354) 2021-03-30 11:50:35 +00:00
dependabot-preview[bot]
5e2d939f4e Bump json from 20201115 to 20210307 (#343) 2021-03-30 11:28:18 +00:00
dependabot-preview[bot]
26dbcd3e79 Bump cmis from 1.27 to 1.29 (#365) 2021-03-30 11:12:32 +00:00
dependabot-preview[bot]
8ec9fa5f5e Bump utility from 3.0.43 to 3.0.44 (#364) 2021-03-30 10:46:44 +00:00
dependabot-preview[bot]
578becd586 Bump joda-time from 2.10.9 to 2.10.10 (#277) 2021-03-30 10:35:59 +00:00
dependabot[bot]
1f6774f2ef Bump junit from 4.13 to 4.13.1 (#271) 2021-03-30 10:19:38 +00:00
dependabot-preview[bot]
9ff5f3b843 Bump dependency.tika.version from 1.25 to 1.26 (#366) 2021-03-29 21:57:43 +00:00
Travis CI User
9e3bb59067 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-29 13:44:23 +00:00
Travis CI User
db47063830 [maven-release-plugin][skip ci] prepare release 11.1 2021-03-29 13:44:19 +00:00
Tom Page
562479bde4 SEARCH-2768 Add callback feature for asynchronous ACL updates. (#357)
* SEARCH-2768 Add callback feature for asynchronous ACL updates.

Change default for user filter to empty, as changes from all users could affect metadata
or permissions which might need to be indexed by the enterprise Elasticsearch Connector.

* SEARCH-2768 Add unit test for new listeners.

* SEARCH-2768 Rename listener callback function.

* SEARCH-2768 Add unit test to test suite.
2021-03-29 11:31:36 +01:00
Travis CI User
e738e0a0fb [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-27 15:37:29 +00:00
Travis CI User
eb703c19aa [maven-release-plugin][skip ci] prepare release 11.0 2021-03-27 15:37:25 +00:00
alandavis
b772205539 Switch master to support ACS 7.1.0
* incremented pom versions to 11.0 so that 9 may be used by 7.0.1 and 10 for 7.0.2
* version.schema not changed this time as it had already been incremented by 100 to 14100
2021-03-27 09:45:27 +00:00
Travis CI User
b9c6b59129 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-26 18:20:03 +00:00
Travis CI User
cf7ce72209 [maven-release-plugin][skip ci] prepare release 9.6 2021-03-26 18:19:57 +00:00
alandavis
7a58014c32 Remove whitesource token as we don't use whitesource any more 2021-03-26 17:40:31 +00:00
Travis CI User
e964aab211 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-26 14:36:33 +00:00
Travis CI User
2d3cac3c27 [maven-release-plugin][skip ci] prepare release 9.5 2021-03-26 14:36:29 +00:00
evasques
3a495f7b3f MNT-22295 - FixedACLJob not processing all nodes due to unordered results (#359)
* Added method selectNodesWithAspects that accepts a boolean as param to order values
* Added param ordered to IdsEntity class
* Added optional ordered param to the query template that orders the results by node id in asc order
* Added method getNodesWithAspects that accepts a boolean as param to order values in nodeDAO
* FixedACLUpdater Job calls the new getNodesWithAspects, with the ordered param as true
2021-03-26 13:11:32 +00:00
Travis CI User
949e257f19 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-25 23:35:22 +00:00
Travis CI User
dfb34140ac [maven-release-plugin][skip ci] prepare release 9.4 2021-03-25 23:35:17 +00:00
dependabot-preview[bot]
c1f78b1a17 Bump restapi from 1.56 to 1.57 (#361) 2021-03-25 22:58:09 +00:00
Travis CI User
8b70483aa0 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-24 13:30:20 +00:00
Travis CI User
3ebeac2eb4 [maven-release-plugin][skip ci] prepare release 9.3 2021-03-24 13:30:16 +00:00
Denis Ungureanu
d91a552925 ACS-1252 : Add tests for the JMS/ActiveMQ/Camel configuration in *alfresco-community-repo* (#356)
- fix messaging context for tests by adding missing mocked beans and properties
- add tests for amqp, jms, activemq protocols
- add travis job to run messaging tests
2021-03-24 14:13:37 +02:00
Bruno Bossola
8c91145b39 Revert "[ACS-1291] Asynchronous mechanism to send events (#351)" as no events are sent to AMQ with this change
This reverts commit f446031069.
2021-03-24 11:53:12 +00:00
Travis CI User
86fcf67016 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-19 16:57:50 +00:00
Travis CI User
48c85ec24f [maven-release-plugin][skip ci] prepare release 9.2 2021-03-19 16:57:46 +00:00
Bruno Bossola
f446031069 [ACS-1291] Asynchronous mechanism to send events (#351)
* Performance optimisation spike

* Event2 is now sending event asynchronously

* Now forcing synchronous calls in tests for events2

* Now qualifying the event service used in tests

Co-authored-by: Nana Insaidoo <insaidoo.nana@yahoo.it>
2021-03-19 11:08:13 +00:00
Travis CI User
a9dabb0e99 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-12 16:47:18 +00:00
Travis CI User
bdc95019ae [maven-release-plugin][skip ci] prepare release 9.1 2021-03-12 16:47:13 +00:00
evasques
ace87c9c3b MNT-21898 Unexpected ACLs when job runs Fix (#344)
* On move node, verify whether the parent has the pending ACL aspect applied and consider the pending shared ACL when updating inheritance, to avoid ending up with mixed permissions: children of pending-ACL nodes do not have the correct ACLs and, when moved, keep their wrong ACL.
* Add public method setInheritanceForChildren that receives an additional param: forceSharedACL. If an unexpected ACL occurs in a child, it can be overridden by setting it.
* Implement method setInheritanceForChildren that receives an additional parameter: forceSharedACL
* Add method setFixedAcls that receives an additional parameter: forceSharedACL. When a child node has an unexpected ACL, setting this parameter to true forces it to assume the new shared ACL instead of throwing a concurrency exception. When the shared ACL is forced, a warning is logged stating exactly which node the ACL is being forced on. This is only possible when the child ACL is of type SHARED and has an unexpected ACL.
* All methods that call setFixedAcls without the new parameter continue to operate as normal, with forceSharedACL=false
* Added property forceSharedACL to the FixedACLUpdaterJob. If set to true it will force the shared ACL to propagate through children even if there is an unexpected ACL
* When an exception is detected while doing setInheritanceForChildren in the job, catch and log the error, but do not roll back the entire batch
* In the copy and move unit tests, changed the ACL of the target folders so that the old shared ACL was never the same for origin and target folders, as happens when performing these operations between sites
* Added unit test to verify fix for MNT-21898 - testAsyncWithNodeMoveChildToChildPendingFolder
* Added unit test to verify the system property for the job: forceSharedACL - testAsyncWithErrorsForceSharedACL
2021-03-12 14:25:58 +00:00
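The forceSharedACL behaviour described above can be sketched as a three-way decision; the enum and method below are illustrative only, not Alfresco's actual setFixedAcls signature.

```java
// Hypothetical sketch of the MNT-21898 forceSharedACL decision on each child node.
public class ForceSharedAclSketch
{
    enum AclType { SHARED, DEFINING }

    enum Outcome { PROPAGATE, FORCE_AND_WARN, CONCURRENCY_ERROR }

    static Outcome onChild(boolean aclAsExpected, AclType childAclType, boolean forceSharedACL)
    {
        if (aclAsExpected)
        {
            return Outcome.PROPAGATE; // normal case: apply the new shared ACL
        }
        // Forcing is only possible when the child ACL is of type SHARED.
        if (forceSharedACL && childAclType == AclType.SHARED)
        {
            return Outcome.FORCE_AND_WARN; // override, logging a warning for the node
        }
        return Outcome.CONCURRENCY_ERROR; // previous behaviour: concurrency exception
    }

    public static void main(String[] args)
    {
        System.out.println(onChild(false, AclType.SHARED, true));
        System.out.println(onChild(false, AclType.DEFINING, true));
    }
}
```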
Travis CI User
af8b556bf8 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-10 22:59:09 +00:00
Travis CI User
5990fadb3b [maven-release-plugin][skip ci] prepare release 9.0 2021-03-10 22:59:05 +00:00
alandavis
08b62afb10 Missed setting <acs.version.revision>1</acs.version.revision> 2021-03-10 21:51:09 +00:00
alandavis
d4bae73b86 Use master for ACS 7.0.1
* version.schema=14100 (100 more than used for ACS 7.0.0 originally)
* Reset pom.xml version to next major number (9), as 8.x will be used for 7.0.0 HFs
2021-03-10 21:15:34 +00:00
Travis CI User
8ae2009c13 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-09 20:11:25 +00:00
Travis CI User
df92479664 [maven-release-plugin][skip ci] prepare release 8.423 2021-03-09 20:11:20 +00:00
Gloria Camino
40133c350e Fixed typo for CPUs in SPANISH (#342) 2021-03-09 18:24:23 +00:00
Travis CI User
f0c95819ad [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-09 16:14:43 +00:00
Travis CI User
1b3ae47b98 [maven-release-plugin][skip ci] prepare release 8.422 2021-03-09 16:14:38 +00:00
Alan Davis
6a017abf3e ATS-876 Update to T-Engine (#341) 2021-03-09 15:16:01 +00:00
dependabot-preview[bot]
19767d2fc7 Bump dependency.transform.model.version from 1.3.0 to 1.3.1 (#340) 2021-03-08 23:23:51 +00:00
Travis CI User
77935da9df [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-08 17:41:39 +00:00
Travis CI User
80612db4e3 [maven-release-plugin][skip ci] prepare release 8.421 2021-03-08 17:41:34 +00:00
araschitor
49b652f696 feature/APPS-update-dependencies: Removed suffix google-drive and aos version from pom (#339) 2021-03-08 18:57:31 +02:00
Travis CI User
999ce58b43 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-08 08:26:37 +00:00
Travis CI User
2bf41ccfea [maven-release-plugin][skip ci] prepare release 8.420 2021-03-08 08:26:33 +00:00
Denis Ungureanu
f485581d5e ACS-855 : Long running patch in ACS 7.0.0.A2 upgrade (#338)
- revert changes (sql and schema reference files) done in REPO-4547
2021-03-08 09:49:38 +02:00
Travis CI User
a4bf9b5e47 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-05 12:35:24 +00:00
Travis CI User
1fde058fc4 [maven-release-plugin][skip ci] prepare release 8.419 2021-03-05 12:35:20 +00:00
Alan Davis
d60cd5ed1c ACS-1180 ACS 7 Stacks: MySQL 8 (#336) 2021-03-05 11:37:47 +00:00
Travis CI User
f77ceb2072 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-05 11:09:00 +00:00
Travis CI User
b072d935aa [maven-release-plugin][skip ci] prepare release 8.418 2021-03-05 11:08:55 +00:00
Ancuta Morarasu
2b03e2bbf0 ACS-1201: Model integrity violation saving properties (#332)
- Fix the name property persistence in ContentModelFormProcessor to only save when the property value is actually changed. This prevents the FilenameFilteringInterceptor from being called when there are no changes to the file name.
2021-03-05 12:32:10 +02:00
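The ACS-1201 fix above boils down to a change-detection guard before persisting. A minimal sketch under that assumption (the method name is hypothetical, not the ContentModelFormProcessor API):

```java
// Hypothetical sketch of the ACS-1201 guard: persist cm:name only when the
// submitted value actually differs, so downstream interceptors such as the
// filename filter are not triggered by no-op saves.
public class NamePersistenceSketch
{
    static boolean shouldPersistName(String currentName, String submittedName)
    {
        return submittedName != null && !submittedName.equals(currentName);
    }

    public static void main(String[] args)
    {
        System.out.println(shouldPersistName("report.pdf", "report.pdf"));
        System.out.println(shouldPersistName("report.pdf", "report-v2.pdf"));
    }
}
```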
Travis CI User
748272bcde [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-05 09:20:48 +00:00
Travis CI User
3aad844812 [maven-release-plugin][skip ci] prepare release 8.417 2021-03-05 09:20:44 +00:00
Alan Davis
1e5188a4a7 ACS-1071 revision not set in version.properties (#335) 2021-03-05 08:42:28 +00:00
Andreea Nechifor
f5e5093ead Update webscript version (#333) 2021-03-05 10:05:07 +02:00
Stefan Kopf
fef8cc9256 ACS-1217 Additional option for SOLR to authenticate with a shared secret (#334)
Co-authored-by: Alex Mukha <alex.mukha@alfresco.com>
2021-03-04 18:49:49 +01:00
Travis CI User
2e6b40d8c7 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-04 15:59:43 +00:00
Travis CI User
4be06a5e20 [maven-release-plugin][skip ci] prepare release 8.416 2021-03-04 15:59:36 +00:00
Nithin Nambiar
f7ecb45991 MNT-22184 Add security header for admin console (#323) 2021-03-04 15:21:35 +00:00
Travis CI User
6349b6ff7b [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-04 08:35:16 +00:00
Travis CI User
16c998ca94 [maven-release-plugin][skip ci] prepare release 8.415 2021-03-04 08:35:09 +00:00
Simona C
a37bf29faa Upgrade TAS restapi (#327) 2021-03-04 09:58:20 +02:00
dependabot-preview[bot]
42d14c2abe Bump acs-event-model from 0.0.11 to 0.0.12 (#328)
Bumps [acs-event-model](https://github.com/Alfresco/acs-event-model) from 0.0.11 to 0.0.12.
- [Release notes](https://github.com/Alfresco/acs-event-model/releases)
- [Commits](https://github.com/Alfresco/acs-event-model/commits)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2021-03-04 07:58:35 +01:00
dependabot-preview[bot]
8544f6f90e Bump restapi from 1.55 to 1.56 (#330) 2021-03-03 22:56:39 +00:00
alandavis
ee7936334e Remove commented out section [skip ci] 2021-03-03 17:15:14 +00:00
Travis CI User
50a1e87962 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-03 08:40:10 +00:00
Travis CI User
0266dfec6a [maven-release-plugin][skip ci] prepare release 8.414 2021-03-03 08:40:04 +00:00
Simona C
e0ce4ddf42 Upgrade TAS restapi (#322) 2021-03-03 10:02:15 +02:00
Travis CI User
783f0fad55 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-02 14:55:32 +00:00
Travis CI User
6ea080d819 [maven-release-plugin][skip ci] prepare release 8.413 2021-03-02 14:55:26 +00:00
dhrn
1bb233405b [REPO-5552] TAS aspect/type api (#319)
* [REPO-5552] TAS initial commit

* [REPO-5552] * model added

* [REPO-5552] * revert comments

* [REPO-5552] * minor conventions fixes

* [REPO-5552] * minor update
2021-03-02 19:48:34 +05:30
Travis CI User
9af54e1dc4 [maven-release-plugin][skip ci] prepare for next development iteration 2021-03-02 00:22:34 +00:00
Travis CI User
3a2119cbd2 [maven-release-plugin][skip ci] prepare release 8.412 2021-03-02 00:22:28 +00:00
dependabot-preview[bot]
c3476a725f Bump restapi from 1.52 to 1.53 (#321) 2021-03-01 23:47:05 +00:00
Lucian Tuca
30baf81b44 REPO-5571 (#320)
- fixed minor stuff around logging
2021-03-01 16:42:35 +02:00
Travis CI User
c60268eae4 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-27 10:46:59 +00:00
Travis CI User
5911315c4d [maven-release-plugin][skip ci] prepare release 8.411 2021-02-27 10:46:53 +00:00
alandavis
2294a87908 REPO-5376 Query Accelerator add simple timings, having removed temporary timing code 2021-02-27 09:10:58 +00:00
Travis CI User
75d6722efd [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-26 10:04:16 +00:00
Travis CI User
bcd72f35d0 [maven-release-plugin][skip ci] prepare release 8.410 2021-02-26 10:04:07 +00:00
CezarLeahu
3a5cedd418 Improve the ACS build scripts (#316)
- update build_functions.sh
- remove unnecessary checks during the build
- update the build.sh script in ACS packaging to match a simpler pom.xml
2021-02-26 09:10:40 +00:00
dhrn
5a4fbbe095 * fix random failing errors while calling aspect api (#313) 2021-02-26 08:18:47 +00:00
Travis CI User
fc89ed17f2 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-26 08:08:22 +00:00
Travis CI User
064e87a4aa [maven-release-plugin][skip ci] prepare release 8.409 2021-02-26 08:08:16 +00:00
CezarLeahu
4e50446c4e ACS-1253 Remove ignored JMX namespace argument on Camel context (#314) 2021-02-26 09:33:05 +02:00
Travis CI User
e5ca36936c [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-25 21:21:31 +00:00
Travis CI User
6f0c21662e [maven-release-plugin][skip ci] prepare release 8.408 2021-02-25 21:21:25 +00:00
Andreea Nechifor
1a5aa34d3d APPS-692: after properties removed. (#315) 2021-02-25 20:45:46 +02:00
Travis CI User
24f2737255 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-25 14:35:15 +00:00
Travis CI User
0cbbe42369 [maven-release-plugin][skip ci] prepare release 8.407 2021-02-25 14:35:09 +00:00
Lucian Tuca
1f4666076d REPO-5571 : Add tempFileCleanerTrigger configurations to be able to limit the job duration/quantity
- fixed log order
2021-02-25 15:26:02 +02:00
Travis CI User
d2f21bcea1 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-24 22:05:27 +00:00
Travis CI User
5d3c4730ec [maven-release-plugin][skip ci] prepare release 8.406 2021-02-24 22:05:20 +00:00
Alan Davis
299003c0c7 ACS-1183 ACS 7 Stacks: PostgreSQL (#312)
Upgraded the version of PostgreSQL to the latest (13.1) in all tests that are not db-specific
Switched to the new Jira URL for git bugtraq
2021-02-24 21:34:29 +00:00
Travis CI User
d0c93dc170 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-24 16:20:17 +00:00
Travis CI User
c1002b8684 [maven-release-plugin][skip ci] prepare release 8.405 2021-02-24 16:20:11 +00:00
Lucian Tuca
43410785a5 REPO-5571 : Add tempFileCleanerTrigger configurations to be able to limit the job duration/quantity
- fixed log order
2021-02-24 17:30:13 +02:00
Travis CI User
2cbd2d98d4 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-24 13:09:43 +00:00
Travis CI User
1b768f7c1f [maven-release-plugin][skip ci] prepare release 8.404 2021-02-24 13:09:37 +00:00
araschitor
cdec3f2f93 fix/APPS-update-webscripts-version to 8.17 (#309) 2021-02-24 12:54:32 +02:00
Travis CI User
aaf2b32d02 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-23 17:28:05 +00:00
Travis CI User
2ce490042a [maven-release-plugin][skip ci] prepare release 8.403 2021-02-23 17:27:59 +00:00
Alan Davis
e88aab47f7 REPO-5376 Query Accelerator Remove all temporary code (#308)
* REPO-5376 Remove all temporary code

* Remove DBStats, SingleTaskRestartableWatch
* Remove propertiesCache and aspectsCache from DBQueryEngine as they were marked as temporary

* Remove further temporary code

Co-authored-by: Nana Insaidoo <insaidoo.nana@yahoo.it>
2021-02-23 16:54:58 +00:00
Travis CI User
36937eaad7 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-23 14:11:56 +00:00
Travis CI User
2f043eac24 [maven-release-plugin][skip ci] prepare release 8.402 2021-02-23 14:11:50 +00:00
Lucian Tuca
fd7adefe27 REPO-5571 : Change the tempFileCleanerTrigger to work at scale (#293)
* REPO-5571 : Change the tempFileCleanerTrigger to work at scale
    - added nr of files and time limits
2021-02-23 15:41:06 +02:00
dhrn
cf91e0afe0 [REPO-5552] more filtering capabilities for aspect/type api (#301)
* * more filter implementation

* * aspect test case added

* * fixed comments and more tests added

* * comments fixed

* * import and data fixed

* * removed unnecessary filter
2021-02-23 09:56:37 +00:00
Travis CI User
1394442d93 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-22 23:22:54 +00:00
Travis CI User
9f00a2b561 [maven-release-plugin][skip ci] prepare release 8.401 2021-02-22 23:22:47 +00:00
dependabot-preview[bot]
07891b2765 Bump mockito-core from 3.7.7 to 3.8.0 (#307) 2021-02-22 22:51:55 +00:00
Travis CI User
2a781e364b [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-19 13:24:49 +00:00
Travis CI User
95ab60263c [maven-release-plugin][skip ci] prepare release 8.400 2021-02-19 13:24:43 +00:00
CezarLeahu
bb19c61253 ACS-1253 Enable Camel JMX management (#295)
- enable Camel JMX management
- disable JMS connection idle timeout
- update ActiveMQ broker URLs to use NIO
2021-02-19 14:52:28 +02:00
dependabot-preview[bot]
6f17779e71 Bump postgresql from 42.2.18 to 42.2.19 (#300) 2021-02-19 11:24:41 +00:00
Andrea Gazzarini
1ff90242b0 [SEARCH-2677] Extract SearchEngineResultSet and SearchEngineResultMetadata interfaces (#286)
* [SEARCH-2677] Extract SearchEngineResultSet and SearchEngineResultMetadata interfaces

* [SEARCH-2677] Scan the ResultSet decorator chain for a maximum of 3 nested levels
2021-02-19 12:09:40 +01:00
Travis CI User
de18900d90 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-19 08:18:49 +00:00
Travis CI User
6360842e06 [maven-release-plugin][skip ci] prepare release 8.399 2021-02-19 08:18:43 +00:00
Cristian Turlica
cee63b31f6 ACS-1264: Content Model changes dynamically updated to node caches across a cluster deadlock (#289)
Re-enabling the changes disabled in ACS-936, we can see a deadlock in asynchronouslyRefreshedCacheThreadPool (AbstractAsynchronouslyRefreshedCache).

There should be no reason to call the dictionary destroy method before doCall() finishes, and it is the use of the destroy method that carries the risk of deadlock.

Proposed fix: the liveLock is used for the doCall() method; this stops deadlocks caused by external calls to dictionary destroy() while doCall() is in progress.

Removed the cache-invalidation fix (i.e. the fix for the issue where some cluster nodes could have null values while others had the default value, so caches were not properly invalidated on node startup). This fix was moved to ClusteringBootstrap, as the initial one was causing issues.
2021-02-19 09:46:36 +02:00
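The proposed fix above is classic mutual exclusion: guard both the refresh and the teardown with the same lock so destroy() cannot interleave with an in-progress doCall(). A minimal sketch, assuming a simplified shape (this mirrors the description, not the actual AbstractAsynchronouslyRefreshedCache class):

```java
import java.util.concurrent.locks.ReentrantLock;

// Hypothetical sketch of the ACS-1264 fix: doCall() and destroy() share the
// liveLock, so destroy() blocks until any in-progress refresh completes.
public class RefreshedCacheSketch
{
    private final ReentrantLock liveLock = new ReentrantLock();
    private volatile boolean destroyed;

    public String doCall()
    {
        liveLock.lock();
        try
        {
            // Rebuild the cache; destroy() cannot run until the lock is released.
            return destroyed ? "skipped" : "refreshed";
        }
        finally
        {
            liveLock.unlock();
        }
    }

    public void destroy()
    {
        liveLock.lock();
        try
        {
            destroyed = true; // tear down only when no refresh is in flight
        }
        finally
        {
            liveLock.unlock();
        }
    }

    public static void main(String[] args)
    {
        RefreshedCacheSketch cache = new RefreshedCacheSketch();
        System.out.println(cache.doCall());
        cache.destroy();
        System.out.println(cache.doCall());
    }
}
```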
Travis CI User
a74cdea223 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-19 00:12:19 +00:00
Travis CI User
e53b61d2bf [maven-release-plugin][skip ci] prepare release 8.398 2021-02-19 00:12:14 +00:00
alandavis
cad18795fd ATS-862 2.3.8 T-Engines
* T-Engines 2.3.8
2021-02-18 23:41:21 +00:00
Travis CI User
14e43ed825 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-18 20:21:58 +00:00
Travis CI User
2ee3b2085a [maven-release-plugin][skip ci] prepare release 8.397 2021-02-18 20:21:52 +00:00
Alan Davis
a7935b0d08 ATS-862 2.3.8 T-Engines (#297)
* alfresco-transform-model 1.3.0
2021-02-18 19:50:59 +00:00
Travis CI User
557a666fd3 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-18 19:50:25 +00:00
Travis CI User
87b7d524ea [maven-release-plugin][skip ci] prepare release 8.396 2021-02-18 19:50:19 +00:00
pieCit87
43daee3529 Feature/apps 703 bumps gdrive 3.2.1-A2 (#296) 2021-02-18 21:19:59 +02:00
Travis CI User
c9638187a1 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-18 09:49:44 +00:00
Travis CI User
659e2baef8 [maven-release-plugin][skip ci] prepare release 8.395 2021-02-18 09:49:38 +00:00
pieCit87
fb2035e82e bump of gdrive and aos (#294) 2021-02-18 11:16:40 +02:00
Travis CI User
366e4b23bf [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-17 17:18:52 +00:00
Travis CI User
5f6734728e [maven-release-plugin][skip ci] prepare release 8.394 2021-02-17 17:18:46 +00:00
David Edwards
371791a0bb ACS-1185 Update activemq image to 5.16.1 (#292) 2021-02-17 16:49:12 +00:00
Travis CI User
b2d65d6ac1 [maven-release-plugin][skip ci] prepare for next development iteration 2021-02-15 16:57:56 +00:00
Travis CI User
69dca8f852 [maven-release-plugin][skip ci] prepare release 8.393 2021-02-15 16:57:50 +00:00
dhrn
4d73b11d12 show model in the aspect/type api (#285) 2021-02-15 11:29:23 +00:00
176 changed files with 6616 additions and 2052 deletions


@@ -1,4 +1,4 @@
 # For SmartGit
 [bugtraq "jira"]
-    url = https://issues.alfresco.com/jira/browse/%BUGID%
+    url = https://alfresco.atlassian.net/browse/%BUGID%
     logRegex = ([A-Z]+-\\d+)


@@ -25,10 +25,6 @@
         <url>https://artifacts.alfresco.com/nexus/content/groups/public</url>
       </pluginRepository>
     </pluginRepositories>
-    <properties>
-      <!-- WhiteSource token -->
-      <org.whitesource.orgToken>${env.WHITESOURCE_API_KEY}</org.whitesource.orgToken>
-    </properties>
   </profile>
 </profiles>


@@ -33,7 +33,7 @@ stages:
   - name: test
     if: commit_message !~ /\[skip tests\]/
   - name: release
-    if: fork = false AND (branch = master OR branch =~ /release\/.*/ OR branch =~/fix\/.*/) AND type != pull_request AND commit_message !~ /\[no release\]/
+    if: fork = false AND (branch = master OR branch =~ /release\/.*/) AND type != pull_request AND commit_message !~ /\[no release\]/
   - name: update_downstream
     if: fork = false AND (branch = master OR branch =~ /release\/.*/) AND type != pull_request AND commit_message !~ /\[no downstream\]/
   - name: trigger_downstream
@@ -45,13 +45,6 @@ install: travis_retry travis_wait 40 bash scripts/travis/build.sh
 jobs:
   include:
-    # - name: "Source Clear Scan"
-    #   # only on release branches or master and if it is not a PR
-    #   if: fork = false AND (branch = master OR branch =~ /release\/.*/) AND type != pull_request
-    #   script: skip
-    #   addons:
-    #     srcclr: true
     - name: "Core, Data-Model, Repository - AllUnitTestsSuite - Build and test"
       script:
         - travis_retry mvn -B test -pl core,data-model
@@ -59,35 +52,35 @@ jobs:
     - name: "Repository - AppContext01TestSuite"
       before_script:
-        - docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
-        - docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
-        - docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.6
+        - docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
+        - docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
+        - docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.10
       script: travis_wait 20 mvn -B test -pl repository -Dtest=AppContext01TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
     - name: "Repository - AppContext02TestSuite"
       before_script:
-        - docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
-        - docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
+        - docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
+        - docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
       script: travis_wait 20 mvn -B test -pl repository -Dtest=AppContext02TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
     - name: "Repository - AppContext03TestSuite"
       before_script:
-        - docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
-        - docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
-        - docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.6
+        - docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
+        - docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
+        - docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.10
       script: travis_wait 20 mvn -B test -pl repository -Dtest=AppContext03TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
     - name: "Repository - AppContext04TestSuite"
       before_script:
-        - docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
-        - docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
-        - docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.6
+        - docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.10
script: travis_wait 20 mvn -B test -pl repository -Dtest=AppContext04TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - AppContext05TestSuite"
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- mkdir -p "${HOME}/tmp"
- cp repository/src/test/resources/realms/alfresco-realm.json "${HOME}/tmp"
- export HOST_IP=$(hostname -I | cut -f1 -d' ')
@@ -96,126 +89,130 @@ jobs:
- name: "Repository - AppContext06TestSuite"
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.6
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.10
script: travis_wait 20 mvn -B test -pl repository -Dtest=AppContext06TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - AppContextExtraTestSuite"
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.6
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.10
script: travis_wait 20 mvn -B test -pl repository -Dtest=AppContextExtraTestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - MiscContextTestSuite"
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.6
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.10
script: travis_wait 20 mvn -B test -pl repository -Dtest=MiscContextTestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - SearchTestSuite"
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=SearchTestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco -Dindex.subsystem.name=solr6
- name: "Repository - MariaDB 10.2.18 tests"
if: commit_message !~ /\[skip db\]/
before_script:
- docker run -d -p 3307:3306 --name mariadb -e MYSQL_ROOT_PASSWORD=alfresco -e MYSQL_USER=alfresco -e MYSQL_DATABASE=alfresco -e MYSQL_PASSWORD=alfresco mariadb:10.2.18 --transaction-isolation=READ-COMMITTED --max-connections=300 --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=AllDBTestsTestSuite -Ddb.name=alfresco -Ddb.url=jdbc:mariadb://localhost:3307/alfresco?useUnicode=yes\&characterEncoding=UTF-8 -Ddb.username=alfresco -Ddb.password=alfresco -Ddb.driver=org.mariadb.jdbc.Driver
- name: "Repository - MariaDB 10.4 tests"
if: commit_message !~ /\[skip db\]/
before_script:
- docker run -d -p 3307:3306 --name mariadb -e MYSQL_ROOT_PASSWORD=alfresco -e MYSQL_USER=alfresco -e MYSQL_DATABASE=alfresco -e MYSQL_PASSWORD=alfresco mariadb:10.4 --transaction-isolation=READ-COMMITTED --max-connections=300 --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=AllDBTestsTestSuite -Ddb.name=alfresco -Ddb.url=jdbc:mariadb://localhost:3307/alfresco?useUnicode=yes\&characterEncoding=UTF-8 -Ddb.username=alfresco -Ddb.password=alfresco -Ddb.driver=org.mariadb.jdbc.Driver
- name: "Repository - MariaDB 10.5 tests"
if: commit_message !~ /\[skip db\]/
before_script:
- docker run -d -p 3307:3306 --name mariadb -e MYSQL_ROOT_PASSWORD=alfresco -e MYSQL_USER=alfresco -e MYSQL_DATABASE=alfresco -e MYSQL_PASSWORD=alfresco mariadb:10.5 --transaction-isolation=READ-COMMITTED --max-connections=300 --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=AllDBTestsTestSuite -Ddb.name=alfresco -Ddb.url=jdbc:mariadb://localhost:3307/alfresco?useUnicode=yes\&characterEncoding=UTF-8 -Ddb.username=alfresco -Ddb.password=alfresco -Ddb.driver=org.mariadb.jdbc.Driver
- name: "Repository - MySQL 5.7.23 tests"
if: commit_message !~ /\[skip db\]/
before_script:
- docker run -d -p 3307:3306 -e MYSQL_ROOT_PASSWORD=alfresco -e MYSQL_USER=alfresco -e MYSQL_DATABASE=alfresco -e MYSQL_PASSWORD=alfresco mysql:5.7.23 --transaction-isolation='READ-COMMITTED'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=AllDBTestsTestSuite -Ddb.driver=com.mysql.jdbc.Driver -Ddb.name=alfresco -Ddb.url=jdbc:mysql://localhost:3307/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
# One failing test to do with the schema reference files. ACS-1180
# - name: "Repository - MySQL 8 tests"
# if: commit_message !~ /\[skip db\]/
# before_script:
# - docker run -d -p 3307:3306 -e MYSQL_ROOT_PASSWORD=alfresco -e MYSQL_USER=alfresco -e MYSQL_DATABASE=alfresco -e MYSQL_PASSWORD=alfresco mysql:8 --transaction-isolation='READ-COMMITTED'
# - docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
# script: travis_wait 20 mvn -B test -pl repository -Dtest=AllDBTestsTestSuite -Ddb.driver=com.mysql.jdbc.Driver -Ddb.name=alfresco -Ddb.url=jdbc:mysql://localhost:3307/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - MySQL 8 tests"
if: commit_message !~ /\[skip db\]/
before_script:
- docker run -d -p 3307:3306 -e MYSQL_ROOT_PASSWORD=alfresco -e MYSQL_USER=alfresco -e MYSQL_DATABASE=alfresco -e MYSQL_PASSWORD=alfresco mysql:8 --transaction-isolation='READ-COMMITTED'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=AllDBTestsTestSuite -Ddb.driver=com.mysql.jdbc.Driver -Ddb.name=alfresco -Ddb.url=jdbc:mysql://localhost:3307/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - PostgreSQL 10.9 tests"
if: commit_message !~ /\[skip db\]/
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:10.9 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=AllDBTestsTestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - PostgreSQL 11.7 tests"
if: commit_message !~ /\[skip db\]/
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=AllDBTestsTestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - PostgreSQL 12.4 tests"
if: commit_message !~ /\[skip db\]/
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:12.4 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=AllDBTestsTestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - PostgreSQL 13.1 tests"
if: commit_message !~ /\[skip db\]/
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=AllDBTestsTestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - Messaging tests"
before_script:
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=CamelRoutesTest,CamelComponentsTest
- name: "Remote-api - AppContext01TestSuite"
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl remote-api -Dtest=AppContext01TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Remote-api - AppContext02TestSuite"
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.6
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.10
script: travis_wait 20 mvn -B test -pl remote-api -Dtest=AppContext02TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Remote-api - AppContext03TestSuite"
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.6
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.10
script: travis_wait 20 mvn -B test -pl remote-api -Dtest=AppContext03TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Remote-api - AppContext04TestSuite"
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.5
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.3.10
script: travis_wait 20 mvn -B test -pl remote-api -Dtest=AppContext04TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Remote-api - AppContextExtraTestSuite"
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:11.7 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.15.8
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.1 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl remote-api -Dtest=AppContextExtraTestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "REST API TAS tests part1"


@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<dependencies>


@@ -21,7 +21,6 @@ package org.alfresco.httpclient;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.AlgorithmParameters;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
@@ -32,14 +31,11 @@ import java.util.concurrent.locks.ReentrantReadWriteLock;
import org.alfresco.encryption.AlfrescoKeyStore;
import org.alfresco.encryption.AlfrescoKeyStoreImpl;
import org.alfresco.encryption.EncryptionUtils;
import org.alfresco.encryption.Encryptor;
import org.alfresco.encryption.KeyProvider;
import org.alfresco.encryption.KeyResourceLoader;
import org.alfresco.encryption.KeyStoreParameters;
import org.alfresco.encryption.ssl.AuthSSLProtocolSocketFactory;
import org.alfresco.encryption.ssl.SSLEncryptionParameters;
import org.alfresco.error.AlfrescoRuntimeException;
import org.alfresco.util.Pair;
import org.apache.commons.httpclient.DefaultHttpMethodRetryHandler;
import org.apache.commons.httpclient.HostConfiguration;
import org.apache.commons.httpclient.HttpClient;
@@ -53,8 +49,6 @@ import org.apache.commons.httpclient.SimpleHttpConnectionManager;
import org.apache.commons.httpclient.URI;
import org.apache.commons.httpclient.URIException;
import org.apache.commons.httpclient.cookie.CookiePolicy;
import org.apache.commons.httpclient.methods.ByteArrayRequestEntity;
import org.apache.commons.httpclient.methods.PostMethod;
import org.apache.commons.httpclient.params.DefaultHttpParams;
import org.apache.commons.httpclient.params.DefaultHttpParamsFactory;
import org.apache.commons.httpclient.params.HttpClientParams;
@@ -75,23 +69,25 @@ import org.apache.commons.logging.LogFactory;
*/
public class HttpClientFactory
{
/**
* Communication type for HttpClient:
* - NONE is plain http
* - SECRET is plain http with a shared secret via request header
* - HTTPS is mTLS with client authentication (certificates are required)
*/
public static enum SecureCommsType
{
HTTPS, NONE;
HTTPS, NONE, SECRET;
public static SecureCommsType getType(String type)
{
if(type.equalsIgnoreCase("https"))
switch (type.toLowerCase())
{
return HTTPS;
}
else if(type.equalsIgnoreCase("none"))
{
return NONE;
}
else
{
throw new IllegalArgumentException("Invalid communications type");
case "https": return HTTPS;
case "none": return NONE;
case "secret": return SECRET;
default: throw new IllegalArgumentException("Invalid communications type");
}
}
};
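The old if/else chain and the new switch in `getType` implement the same case-insensitive lookup, now extended with `SECRET`. A minimal standalone sketch of that behaviour (the `CommsType` name here is a hypothetical stand-in for the shipped nested enum):

```java
// Standalone sketch of the case-insensitive lookup performed by
// SecureCommsType.getType. Hypothetical reduction for illustration only;
// the real enum is nested inside HttpClientFactory.
enum CommsType
{
    HTTPS, NONE, SECRET;

    static CommsType getType(String type)
    {
        switch (type.toLowerCase())
        {
            case "https": return HTTPS;
            case "none": return NONE;
            case "secret": return SECRET;
            default: throw new IllegalArgumentException("Invalid communications type");
        }
    }
}

public class CommsTypeDemo
{
    public static void main(String[] args)
    {
        // Mixed-case input resolves thanks to the toLowerCase normalisation
        System.out.println(CommsType.getType("Secret")); // prints SECRET
    }
}
```

Switching on the lowercased string keeps the lookup table flat and makes the unsupported-value case a single `default` branch instead of a trailing `else`.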
@@ -122,14 +118,24 @@ public class HttpClientFactory
private int connectionTimeout = 0;
// Shared secret parameters
private String sharedSecret;
private String sharedSecretHeader = DEFAULT_SHAREDSECRET_HEADER;
// Default name for HTTP Request Header when using shared secret communication
public static final String DEFAULT_SHAREDSECRET_HEADER = "X-Alfresco-Search-Secret";
public HttpClientFactory()
{
}
/**
* Constructor for legacy subsystems that do not use shared secret communication.
*/
public HttpClientFactory(SecureCommsType secureCommsType, SSLEncryptionParameters sslEncryptionParameters,
KeyResourceLoader keyResourceLoader, KeyStoreParameters keyStoreParameters,
MD5EncryptionParameters encryptionParameters, String host, int port, int sslPort, int maxTotalConnections,
int maxHostConnections, int socketTimeout)
KeyResourceLoader keyResourceLoader, KeyStoreParameters keyStoreParameters,
MD5EncryptionParameters encryptionParameters, String host, int port, int sslPort,
int maxTotalConnections, int maxHostConnections, int socketTimeout)
{
this.secureCommsType = secureCommsType;
this.sslEncryptionParameters = sslEncryptionParameters;
@@ -145,6 +151,21 @@ public class HttpClientFactory
init();
}
/**
* Recommended constructor for subsystems supporting Shared Secret communication.
* This constructor supports the Shared Secret ("secret") communication method in addition to the legacy ones: "none" and "https".
*/
public HttpClientFactory(SecureCommsType secureCommsType, SSLEncryptionParameters sslEncryptionParameters,
KeyResourceLoader keyResourceLoader, KeyStoreParameters keyStoreParameters,
MD5EncryptionParameters encryptionParameters, String sharedSecret, String sharedSecretHeader,
String host, int port, int sslPort, int maxTotalConnections, int maxHostConnections, int socketTimeout)
{
this(secureCommsType, sslEncryptionParameters, keyResourceLoader, keyStoreParameters, encryptionParameters,
host, port, sslPort, maxTotalConnections, maxHostConnections, socketTimeout);
this.sharedSecret = sharedSecret;
this.sharedSecretHeader = sharedSecretHeader;
}
public void init()
{
this.sslKeyStore = new AlfrescoKeyStoreImpl(sslEncryptionParameters.getKeyStoreParameters(), keyResourceLoader);
@@ -272,10 +293,44 @@ public class HttpClientFactory
this.connectionTimeout = connectionTimeout;
}
protected HttpClient constructHttpClient()
/**
* Shared secret used for SECRET communication
* @param sharedSecret the shared secret word
*/
public void setSharedSecret(String sharedSecret)
{
this.sharedSecret = sharedSecret;
}
/**
* @return Shared secret used for SECRET communication
*/
public String getSharedSecret()
{
return sharedSecret;
}
/**
* HTTP Request header used for SECRET communication
* @param sharedSecretHeader HTTP Request header
*/
public void setSharedSecretHeader(String sharedSecretHeader)
{
this.sharedSecretHeader = sharedSecretHeader;
}
/**
* @return HTTP Request header used for SECRET communication
*/
public String getSharedSecretHeader()
{
return sharedSecretHeader;
}
protected RequestHeadersHttpClient constructHttpClient()
{
MultiThreadedHttpConnectionManager connectionManager = new MultiThreadedHttpConnectionManager();
HttpClient httpClient = new HttpClient(connectionManager);
RequestHeadersHttpClient httpClient = new RequestHeadersHttpClient(connectionManager);
HttpClientParams params = httpClient.getParams();
params.setBooleanParameter(HttpConnectionParams.TCP_NODELAY, true);
params.setBooleanParameter(HttpConnectionParams.STALE_CONNECTION_CHECK, true);
@@ -291,15 +346,15 @@ public class HttpClientFactory
return httpClient;
}
protected HttpClient getHttpsClient()
protected RequestHeadersHttpClient getHttpsClient()
{
return getHttpsClient(host, sslPort);
}
protected HttpClient getHttpsClient(String httpsHost, int httpsPort)
protected RequestHeadersHttpClient getHttpsClient(String httpsHost, int httpsPort)
{
// Configure a custom SSL socket factory that will enforce mutual authentication
HttpClient httpClient = constructHttpClient();
RequestHeadersHttpClient httpClient = constructHttpClient();
// The HostFactory defaults to port 443; when a customised port (like 8983) is used, the port is omitted from the "getHostURL" string
HttpHostFactory hostFactory = new HttpHostFactory(new Protocol("https", sslSocketFactory, HttpsURL.DEFAULT_PORT));
httpClient.setHostConfiguration(new HostConfigurationWithHostFactory(hostFactory));
@@ -307,28 +362,54 @@ public class HttpClientFactory
return httpClient;
}
protected HttpClient getDefaultHttpClient()
protected RequestHeadersHttpClient getDefaultHttpClient()
{
return getDefaultHttpClient(host, port);
}
protected HttpClient getDefaultHttpClient(String httpHost, int httpPort)
protected RequestHeadersHttpClient getDefaultHttpClient(String httpHost, int httpPort)
{
HttpClient httpClient = constructHttpClient();
RequestHeadersHttpClient httpClient = constructHttpClient();
httpClient.getHostConfiguration().setHost(httpHost, httpPort);
return httpClient;
}
/**
* Builds an HTTP client preconfigured with default headers
* @return RequestHeadersHttpClient including the default header for the shared secret method
*/
protected RequestHeadersHttpClient constructSharedSecretHttpClient()
{
RequestHeadersHttpClient client = constructHttpClient();
client.setDefaultHeaders(Map.of(sharedSecretHeader, sharedSecret));
return client;
}
protected RequestHeadersHttpClient getSharedSecretHttpClient()
{
return getSharedSecretHttpClient(host, port);
}
protected RequestHeadersHttpClient getSharedSecretHttpClient(String httpHost, int httpPort)
{
RequestHeadersHttpClient httpClient = constructSharedSecretHttpClient();
httpClient.getHostConfiguration().setHost(httpHost, httpPort);
return httpClient;
}
protected AlfrescoHttpClient getAlfrescoHttpsClient()
{
AlfrescoHttpClient repoClient = new HttpsClient(getHttpsClient());
return repoClient;
return new HttpsClient(getHttpsClient());
}
protected AlfrescoHttpClient getAlfrescoHttpClient()
{
AlfrescoHttpClient repoClient = new DefaultHttpClient(getDefaultHttpClient());
return repoClient;
return new DefaultHttpClient(getDefaultHttpClient());
}
protected AlfrescoHttpClient getAlfrescoSharedSecretClient()
{
return new DefaultHttpClient(getSharedSecretHttpClient());
}
protected HttpClient getMD5HttpClient(String host, int port)
@@ -341,66 +422,37 @@ public class HttpClientFactory
public AlfrescoHttpClient getRepoClient(String host, int port)
{
AlfrescoHttpClient repoClient = null;
if(secureCommsType == SecureCommsType.HTTPS)
switch (secureCommsType)
{
repoClient = getAlfrescoHttpsClient();
case HTTPS: return getAlfrescoHttpsClient();
case NONE: return getAlfrescoHttpClient();
case SECRET: return getAlfrescoSharedSecretClient();
default: throw new AlfrescoRuntimeException("Invalid Solr secure communications type configured in [solr|alfresco].secureComms, should be 'ssl', 'none' or 'secret'");
}
else if(secureCommsType == SecureCommsType.NONE)
}
public RequestHeadersHttpClient getHttpClient()
{
switch (secureCommsType)
{
repoClient = getAlfrescoHttpClient();
case HTTPS: return getHttpsClient();
case NONE: return getDefaultHttpClient();
case SECRET: return getSharedSecretHttpClient();
default: throw new AlfrescoRuntimeException("Invalid Solr secure communications type configured in [solr|alfresco].secureComms, should be 'ssl', 'none' or 'secret'");
}
else
{
throw new AlfrescoRuntimeException("Invalid Solr secure communications type configured in alfresco.secureComms, should be 'ssl' or 'none'");
}
return repoClient;
}
public HttpClient getHttpClient()
public RequestHeadersHttpClient getHttpClient(String host, int port)
{
HttpClient httpClient = null;
if(secureCommsType == SecureCommsType.HTTPS)
switch (secureCommsType)
{
httpClient = getHttpsClient();
case HTTPS: return getHttpsClient(host, port);
case NONE: return getDefaultHttpClient(host, port);
case SECRET: return getSharedSecretHttpClient(host, port);
default: throw new AlfrescoRuntimeException("Invalid Solr secure communications type configured in [solr|alfresco].secureComms, should be 'ssl', 'none' or 'secret'");
}
else if(secureCommsType == SecureCommsType.NONE)
{
httpClient = getDefaultHttpClient();
}
else
{
throw new AlfrescoRuntimeException("Invalid Solr secure communications type configured in alfresco.secureComms, should be 'ssl' or 'none'");
}
return httpClient;
}
public HttpClient getHttpClient(String host, int port)
{
HttpClient httpClient = null;
if(secureCommsType == SecureCommsType.HTTPS)
{
httpClient = getHttpsClient(host, port);
}
else if(secureCommsType == SecureCommsType.NONE)
{
httpClient = getDefaultHttpClient(host, port);
}
else
{
throw new AlfrescoRuntimeException("Invalid Solr secure communications type configured in alfresco.secureComms, should be 'ssl' or 'none'");
}
return httpClient;
}
/**
* A secure client connection to the repository.
*


@@ -0,0 +1,87 @@
/*
* Copyright (C) 2005-2021 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
*/
package org.alfresco.httpclient;
import java.io.IOException;
import java.util.Map;
import org.apache.commons.httpclient.HostConfiguration;
import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.HttpException;
import org.apache.commons.httpclient.HttpMethod;
import org.apache.commons.httpclient.HttpState;
import org.apache.commons.httpclient.MultiThreadedHttpConnectionManager;
/**
* Since Apache HttpClient 3.1 doesn't support including custom headers by default,
this class adds those custom headers every time a method is invoked.
*/
public class RequestHeadersHttpClient extends HttpClient
{
private Map<String, String> defaultHeaders;
public RequestHeadersHttpClient(MultiThreadedHttpConnectionManager connectionManager)
{
super(connectionManager);
}
public Map<String, String> getDefaultHeaders()
{
return defaultHeaders;
}
public void setDefaultHeaders(Map<String, String> defaultHeaders)
{
this.defaultHeaders = defaultHeaders;
}
private void addDefaultHeaders(HttpMethod method)
{
if (defaultHeaders != null)
{
defaultHeaders.forEach((k,v) -> {
method.addRequestHeader(k, v);
});
}
}
@Override
public int executeMethod(HttpMethod method) throws IOException, HttpException
{
addDefaultHeaders(method);
return super.executeMethod(method);
}
@Override
public int executeMethod(HostConfiguration hostConfiguration, HttpMethod method) throws IOException, HttpException
{
addDefaultHeaders(method);
return super.executeMethod(hostConfiguration, method);
}
@Override
public int executeMethod(HostConfiguration hostconfig, HttpMethod method, HttpState state)
throws IOException, HttpException
{
addDefaultHeaders(method);
return super.executeMethod(hostconfig, method, state);
}
}
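A stdlib-only sketch of the header-merging behavior above, with a mutable map standing in for the `HttpMethod`'s request headers (the header names and values below are hypothetical, not from the source):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of addDefaultHeaders(): before each execute, copy the configured
// default headers onto the outgoing request, mirroring executeMethod's first step.
public class DefaultHeadersSketch {
    static Map<String, String> defaultHeaders = new LinkedHashMap<>();

    static void addDefaultHeaders(Map<String, String> requestHeaders) {
        if (defaultHeaders != null) {
            defaultHeaders.forEach(requestHeaders::put);
        }
    }

    public static void main(String[] args) {
        defaultHeaders.put("X-Custom-Header", "repo-secret"); // hypothetical header
        Map<String, String> requestHeaders = new LinkedHashMap<>();
        requestHeaders.put("Content-Type", "text/plain");
        addDefaultHeaders(requestHeaders);
        System.out.println(requestHeaders);
    }
}
```

Note that `HttpMethod.addRequestHeader` appends rather than replaces, so a header already set on the method would be sent twice; the map sketch above replaces instead, which is the main simplification.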


@@ -24,8 +24,10 @@ import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.time.Duration;
import java.util.concurrent.atomic.AtomicLong;
import org.alfresco.api.AlfrescoPublicApi;
import org.alfresco.error.AlfrescoRuntimeException;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
@@ -348,6 +350,17 @@ public class TempFileProvider
{
public static final String KEY_PROTECT_HOURS = "protectHours";
public static final String KEY_DIRECTORY_NAME = "directoryName";
public static final String KEY_MAX_FILES_TO_DELETE = "maxFilesToDelete";
public static final String KEY_MAX_TIME_TO_RUN = "maxTimeToRun";
/** The time when the job has actually started */
private static long jobStartTime;
/** The maximum number of files that can be deleted when the cleaning job runs */
private static AtomicLong maxFilesToDelete;
/** The maximum time allowed for the cleaning job to run */
private static Duration maxTimeToRun;
/**
* Gets a list of all files in the {@link TempFileProvider#ALFRESCO_TEMP_FILE_DIR temp directory}
@@ -376,24 +389,59 @@ public class TempFileProvider
}
String directoryName = (String) context.getJobDetail().getJobDataMap().get(KEY_DIRECTORY_NAME);
try
{
final Object oMaxFilesToDelete = context.getJobDetail().getJobDataMap().get(KEY_MAX_FILES_TO_DELETE);
if (oMaxFilesToDelete != null)
{
final String strMaxFilesToDelete = (String) oMaxFilesToDelete;
maxFilesToDelete = new AtomicLong(Long.parseLong(strMaxFilesToDelete));
logger.debug("Set the maximum number of temp files to be deleted to: " + maxFilesToDelete.get());
}
else
{
logger.debug("No maximum number of files was configured for the temp file clean job.");
}
}
catch (Exception e)
{
logger.warn(e);
throw new JobExecutionException("Invalid job data, maxFilesToDelete");
}
try
{
final Object oMaxTimeToRun = context.getJobDetail().getJobDataMap().get(KEY_MAX_TIME_TO_RUN);
if (oMaxTimeToRun != null)
{
final String strMaxTimeToRun = (String) oMaxTimeToRun;
maxTimeToRun = Duration.parse(strMaxTimeToRun);
logger.debug("Set the maximum duration time of the temp file clean job to: " + maxTimeToRun);
}
else
{
logger.debug("No maximum duration was configured for the temp file clean job.");
}
}
catch (Exception e)
{
logger.warn(e);
throw new JobExecutionException("Invalid job data, maxTimeToRun");
}
if (directoryName == null)
{
directoryName = ALFRESCO_TEMP_FILE_DIR;
}
long now = System.currentTimeMillis();
long aFewHoursBack = now - (3600L * 1000L * protectHours);
long aLongTimeBack = now - (24 * 3600L * 1000L);
jobStartTime = System.currentTimeMillis();
long aFewHoursBack = jobStartTime - (3600L * 1000L * protectHours);
long aLongTimeBack = jobStartTime - (24 * 3600L * 1000L);
File tempDir = TempFileProvider.getTempDir(directoryName);
int count = removeFiles(tempDir, aFewHoursBack, aLongTimeBack, false); // don't delete this directory
// done
if (logger.isDebugEnabled())
{
logger.debug("Removed " + count + " files from temp directory: " + tempDir);
}
logger.debug("Removed " + count + " files from temp directory: " + tempDir);
}
/**
@@ -429,29 +477,23 @@ public class TempFileProvider
}
// list all files
File[] files = directory.listFiles();
File[] filesToIterate = files != null ? files : new File[0];
int count = 0;
for (File file : files)
for (File file : filesToIterate)
{
if (shouldTheDeletionStop())
{
break;
}
if (file.isDirectory())
{
if(isLongLifeTempDir(file))
{
// long life for this folder and its children
int countRemoved = removeFiles(file, longLifeBefore, longLifeBefore, true);
if (logger.isDebugEnabled())
{
logger.debug("Removed " + countRemoved + " files from temp directory: " + file);
}
}
else
{
// enter subdirectory and clean it out and remove it
int countRemoved = removeFiles(file, removeBefore, longLifeBefore, true);
if (logger.isDebugEnabled())
{
logger.debug("Removed " + countRemoved + " files from directory: " + file);
}
}
// long life for this folder and its children
// OR
// enter subdirectory and clean it out and remove it
int countRemoved = removeFiles(file,
isLongLifeTempDir(file) ? longLifeBefore : removeBefore, longLifeBefore,
true);
logger.debug("Removed " + countRemoved + " files from " + (isLongLifeTempDir(file) ? "temp " : " ") + "directory: " + file);
}
else
{
@@ -464,11 +506,19 @@ public class TempFileProvider
// it is a file - attempt a delete
try
{
if(logger.isDebugEnabled())
{
logger.debug("Deleting temp file: " + file);
}
logger.debug("Deleting temp file: " + file);
file.delete();
if (maxFilesToDelete != null)
{
maxFilesToDelete.decrementAndGet();
logger.debug(maxFilesToDelete.get() + " files left to delete.");
}
if (maxTimeToRun != null)
{
logger.debug((jobStartTime + maxTimeToRun.toMillis() - System.currentTimeMillis()) + " millis left to delete.");
}
count++;
}
catch (Throwable e)
@@ -487,10 +537,8 @@ public class TempFileProvider
if(listing != null && listing.length == 0)
{
// directory is empty
if(logger.isDebugEnabled())
{
logger.debug("Deleting empty directory: " + directory);
}
logger.debug("Deleting empty directory: " + directory);
// ignore the limits for empty directories that just need cleanup
directory.delete();
}
}
@@ -499,8 +547,21 @@ public class TempFileProvider
logger.info("Failed to remove temp directory: " + directory, e);
}
}
// done
return count;
}
/**
* Decides whether the cleanup job should stop iterating through the temp files.
* It returns true when the configured maximum number of files has already been
* deleted, or when the job has exceeded its configured maximum run time.
*
* @return true if the deletion should stop, false otherwise
*/
private static boolean shouldTheDeletionStop()
{
return maxFilesToDelete != null && maxFilesToDelete.get() <= 0
|| maxTimeToRun != null && ((jobStartTime + maxTimeToRun.toMillis()) < System
.currentTimeMillis());
}
}
}
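The `maxTimeToRun` value is read with `java.time.Duration.parse`, so the job data string must be an ISO-8601 duration. A small stdlib-only sketch of the deadline arithmetic used by `shouldTheDeletionStop` (the `"PT2H30M"` value is a hypothetical configuration, not from the source):

```java
import java.time.Duration;

public class DeadlineSketch {
    public static void main(String[] args) {
        // KEY_MAX_TIME_TO_RUN is parsed as an ISO-8601 duration string.
        Duration maxTimeToRun = Duration.parse("PT2H30M"); // hypothetical config value
        long jobStartTime = System.currentTimeMillis();

        // Mirrors shouldTheDeletionStop(): stop once the deadline has passed.
        boolean stop = (jobStartTime + maxTimeToRun.toMillis()) < System.currentTimeMillis();

        System.out.println(maxTimeToRun.toMillis()); // 9000000 ms = 2.5 hours
        System.out.println(stop);                    // false right after the job starts
    }
}
```

A malformed string (e.g. `"2h30m"`) makes `Duration.parse` throw `DateTimeParseException`, which the job wraps in the `JobExecutionException("Invalid job data, maxTimeToRun")` shown above.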


@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<properties>
@@ -167,7 +167,7 @@
<dependency>
<groupId>com.fasterxml.woodstox</groupId>
<artifactId>woodstox-core</artifactId>
<version>6.2.4</version>
<version>6.2.6</version>
</dependency>
<!-- the cxf libs were updated, see dependencyManagement section -->


@@ -9,6 +9,6 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-packaging</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
</project>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-packaging</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<properties>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<profiles>


@@ -1,4 +1,4 @@
TRANSFORMERS_TAG=2.3.6
TRANSFORMERS_TAG=2.3.10
SOLR6_TAG=2.0.1
POSTGRES_TAG=11.7
ACTIVEMQ_TAG=5.15.8
POSTGRES_TAG=13.1
ACTIVEMQ_TAG=5.16.1


@@ -38,7 +38,7 @@ services:
-Dftp.dataPortTo=30099
-Dshare.host=localhost
-Daos.baseUrlOverwrite=http://localhost:8082/alfresco/aos
-Dmessaging.broker.url=\"failover:(tcp://activemq:61616)?timeout=3000&jms.useCompression=true\"
-Dmessaging.broker.url=\"failover:(nio://activemq:61616)?timeout=3000&jms.useCompression=true\"
-DlocalTransform.core-aio.url=http://transform-core-aio:8090/
-Dimap.server.port=1143
-Dftp.port=1221


@@ -38,7 +38,7 @@ services:
-Dftp.dataPortTo=30099
-Dshare.host=localhost
-Daos.baseUrlOverwrite=http://localhost:8082/alfresco/aos
-Dmessaging.broker.url=\"failover:(tcp://activemq:61616)?timeout=3000&jms.useCompression=true\"
-Dmessaging.broker.url=\"failover:(nio://activemq:61616)?timeout=3000&jms.useCompression=true\"
-Dlocal.transform.service.enabled=false
-Dlegacy.transform.service.enabled=false
-Dimap.server.port=1143


@@ -6,7 +6,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-packaging</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<modules>


@@ -9,7 +9,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-tests</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<developers>


@@ -9,7 +9,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-tests</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<developers>


@@ -9,7 +9,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-tests</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<developers>


@@ -9,7 +9,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-tests</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<developers>


@@ -0,0 +1,54 @@
package org.alfresco.rest.models.aspects;
import org.alfresco.rest.RestTest;
import org.alfresco.rest.model.RestAspectModel;
import org.alfresco.rest.model.RestErrorModel;
import org.alfresco.utility.model.TestGroup;
import org.alfresco.utility.testrail.ExecutionType;
import org.alfresco.utility.testrail.annotation.TestRail;
import org.springframework.http.HttpStatus;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;
public class GetAspectTests extends RestTest
{
@BeforeClass(alwaysRun=true)
public void dataPreparation() throws Exception
{
restClient.authenticateUser(dataUser.createRandomTestUser());
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = { TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Verify inexistent aspect and status code is Not Found (404)")
public void getInexistentAspect() throws Exception
{
String unknownAspect = "unknown:aspect";
restClient.withModelAPI().getAspect(unknownAspect);
restClient.assertStatusCodeIs(HttpStatus.NOT_FOUND)
.assertLastError().containsSummary(String.format(RestErrorModel.ENTITY_WAS_NOT_FOUND, unknownAspect));
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = { TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Verify Aspect Info and status code is OK (200)")
public void getAspect() throws Exception
{
RestAspectModel aspect = restClient.withModelAPI().getAspect("cm:titled");
restClient.assertStatusCodeIs(HttpStatus.OK);
aspect.assertThat().field("associations").isEmpty().and()
.field("mandatoryAspects").isEmpty().and()
.field("properties").isNotEmpty().and()
.field("includedInSupertypeQuery").is(true).and()
.field("isContainer").is(false).and()
.field("id").is("cm:titled").and()
.field("description").is("Titled").and()
.field("title").is("Titled").and()
.field("model.id").is("cm:contentmodel").and()
.field("model.author").is("Alfresco").and()
.field("model.description").is("Alfresco Content Domain Model").and()
.field("model.namespaceUri").is("http://www.alfresco.org/model/content/1.0").and()
.field("model.namespacePrefix").is("cm");
}
}


@@ -0,0 +1,199 @@
package org.alfresco.rest.models.aspects;
import org.alfresco.rest.RestTest;
import org.alfresco.rest.model.RestAbstractClassModel;
import org.alfresco.rest.model.RestAspectsCollection;
import org.alfresco.utility.model.TestGroup;
import org.alfresco.utility.model.UserModel;
import org.alfresco.utility.testrail.ExecutionType;
import org.alfresco.utility.testrail.annotation.TestRail;
import org.springframework.http.HttpStatus;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;
public class GetAspectsTests extends RestTest
{
private UserModel regularUser;
@BeforeClass(alwaysRun=true)
public void dataPreparation() throws Exception
{
regularUser = dataUser.createRandomTestUser();
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Verify user get aspects and gets status code OK (200)")
public void getAspects() throws Exception
{
RestAspectsCollection aspects = restClient.authenticateUser(regularUser).withModelAPI()
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.OK);
aspects.assertThat()
.entriesListCountIs(100)
.and().entriesListContains("id", "cm:classifiable")
.and().entriesListContains("id", "cm:author")
.and().entriesListContains("id", "cm:checkedOut");
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Should filter aspects using namespace uri and gets status code OK (200)")
public void getAspectByNamespaceUri() throws Exception
{
RestAspectsCollection aspects = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(namespaceUri matches('http://www.alfresco.org/model.*'))")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.OK);
aspects.assertThat().entriesListCountIs(100);
aspects = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(not namespaceUri matches('http://www.alfresco.org/model.*'))")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.OK);
aspects.assertThat().entriesListCountIs(0);
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Should filter aspects using modelId and gets status code OK (200)")
public void getAspectByModelsIds() throws Exception
{
RestAspectsCollection aspects = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(modelId in ('cm:contentmodel', 'smf:smartFolder'))")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.OK);
aspects.getPagination().assertThat().fieldsCount().is(5).and()
.field("totalItems").isLessThan(65).and()
.field("maxItems").is(100).and()
.field("skipCount").isGreaterThan(0).and()
.field("hasMoreItems").is(false);
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Should filter aspects using modelId with subaspects and gets status code OK (200)")
public void getAspectByModelsIdsWithIncludeSubAspects() throws Exception
{
RestAspectsCollection aspects = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(modelId in ('cm:contentmodel INCLUDESUBASPECTS', 'smf:smartFolder INCLUDESUBASPECTS'))")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.OK);
aspects.getPagination().assertThat().fieldsCount().is(5).and()
.field("totalItems").isGreaterThan(65).and()
.field("maxItems").is(100).and()
.field("skipCount").isGreaterThan(0).and()
.field("hasMoreItems").is(false);
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Should filter aspects using parentId and gets status code OK (200)")
public void getAspectByParentId() throws Exception
{
RestAspectsCollection aspects = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(parentId in ('cm:titled'))")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.OK);
aspects.getPagination().assertThat().fieldsCount().is(5).and()
.field("totalItems").is(5).and()
.field("hasMoreItems").is(false);
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Should get Aspects with associations, properties and mandatory aspects and gets status code OK (200)")
public void getAspectIncludeParams() throws Exception
{
RestAspectsCollection aspects = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("include=properties,mandatoryAspects,associations")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.OK);
for (RestAbstractClassModel aspect : aspects.getEntries())
{
aspect.onModel().assertThat()
.field("associations").isNotNull().and()
.field("properties").isNotNull().and()
.field("mandatoryAspects").isNotNull();
}
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Should verify the query errors with possible options")
public void verifyAspectsQueryError()
{
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(modelId in (' ')")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(modelId in ('cm:contentmodel INCLUDESUBASPECTS',))")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(modelId in ('cm:contentmodel INCLUDESUBTYPES'))")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(parentId in (' ')")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(parentId in ('cm:content',))")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(parentId in ('cm:content',))&include=properties")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(namespaceUri matches('*'))")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(parentId in ('cm:content'))&include=properties")
.getAspects();
restClient.assertStatusCodeIs(HttpStatus.OK);
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section={TestGroup.REST_API, TestGroup.MODEL}, executionType= ExecutionType.REGRESSION,
description= "Verify if any user gets aspects with high skipCount and maxItems parameter applied")
public void getPaginationParameter() throws Exception
{
RestAspectsCollection aspects = restClient.authenticateUser(regularUser)
.withModelAPI()
.usingParams("maxItems=10&skipCount=10")
.getAspects();
aspects.assertThat().entriesListCountIs(10);
aspects.assertThat().paginationField("hasMoreItems").is("true");
aspects.assertThat().paginationField("skipCount").is("10");
aspects.assertThat().paginationField("maxItems").is("10");
restClient.assertStatusCodeIs(HttpStatus.OK);
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section={TestGroup.REST_API, TestGroup.MODEL}, executionType= ExecutionType.REGRESSION,
description= "Verify if any user gets aspects with hasMoreItems applied based on skipCount and maxItems")
public void getHighPaginationQuery() throws Exception
{
RestAspectsCollection aspects = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("maxItems=10&skipCount=150")
.getAspects();
aspects.assertThat().entriesListCountIs(0);
aspects.assertThat().paginationField("hasMoreItems").is("false");
aspects.assertThat().paginationField("skipCount").is("150");
aspects.assertThat().paginationField("maxItems").is("10");
restClient.assertStatusCodeIs(HttpStatus.OK);
}
}


@@ -0,0 +1,55 @@
package org.alfresco.rest.models.types;
import org.alfresco.rest.RestTest;
import org.alfresco.rest.model.RestErrorModel;
import org.alfresco.rest.model.RestTypeModel;
import org.alfresco.utility.model.TestGroup;
import org.alfresco.utility.testrail.ExecutionType;
import org.alfresco.utility.testrail.annotation.TestRail;
import org.springframework.http.HttpStatus;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;
public class GetTypeTests extends RestTest
{
@BeforeClass(alwaysRun=true)
public void dataPreparation() throws Exception
{
restClient.authenticateUser(dataUser.createRandomTestUser());
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = { TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Verify inexistent type and status code is Not Found (404)")
public void getInexistentType() throws Exception
{
String unknownType = "unknown:type";
restClient.withModelAPI().getType(unknownType);
restClient.assertStatusCodeIs(HttpStatus.NOT_FOUND)
.assertLastError().containsSummary(String.format(RestErrorModel.ENTITY_WAS_NOT_FOUND, unknownType));
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = { TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Verify Type Info and status code is OK (200)")
public void getType() throws Exception
{
RestTypeModel type = restClient.withModelAPI().getType("cm:content");
restClient.assertStatusCodeIs(HttpStatus.OK);
type.assertThat().field("associations").isEmpty().and()
.field("mandatoryAspects").isNotEmpty().and()
.field("properties").isNotEmpty().and()
.field("includedInSupertypeQuery").is(true).and()
.field("isArchive").is(true).and()
.field("isContainer").is(false).and()
.field("id").is("cm:content").and()
.field("description").is("Base Content Object").and()
.field("title").is("Content").and()
.field("model.id").is("cm:contentmodel").and()
.field("model.author").is("Alfresco").and()
.field("model.description").is("Alfresco Content Domain Model").and()
.field("model.namespaceUri").is("http://www.alfresco.org/model/content/1.0").and()
.field("model.namespacePrefix").is("cm");
}
}


@@ -0,0 +1,199 @@
package org.alfresco.rest.models.types;
import org.alfresco.rest.RestTest;
import org.alfresco.rest.model.RestAbstractClassModel;
import org.alfresco.rest.model.RestTypesCollection;
import org.alfresco.utility.model.TestGroup;
import org.alfresco.utility.model.UserModel;
import org.alfresco.utility.testrail.ExecutionType;
import org.alfresco.utility.testrail.annotation.TestRail;
import org.springframework.http.HttpStatus;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;
public class GetTypesTests extends RestTest
{
private UserModel regularUser;
@BeforeClass(alwaysRun=true)
public void dataPreparation() throws Exception
{
regularUser = dataUser.createRandomTestUser();
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Verify user get types and gets status code OK (200)")
public void getTypes() throws Exception
{
RestTypesCollection types = restClient.authenticateUser(regularUser).withModelAPI()
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.OK);
types.assertThat()
.entriesListCountIs(100)
.and().entriesListContains("id", "cm:content")
.and().entriesListContains("id", "cm:systemfolder")
.and().entriesListContains("id", "cm:folder");
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Should filter types using namespace uri and gets status code OK (200)")
public void getTypeByNamespaceUri() throws Exception
{
RestTypesCollection types = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(namespaceUri matches('http://www.alfresco.org/model.*'))")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.OK);
types.assertThat().entriesListCountIs(100);
types = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(not namespaceUri matches('http://www.alfresco.org/model.*'))")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.OK);
types.assertThat().entriesListCountIs(0);
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Should filter types using modelId and gets status code OK (200)")
public void getTypeByModelsIds() throws Exception
{
RestTypesCollection types = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(modelId in ('cm:contentmodel', 'smf:smartFolder'))")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.OK);
types.getPagination().assertThat().fieldsCount().is(5).and()
.field("totalItems").isLessThan(65).and()
.field("maxItems").is(100).and()
.field("skipCount").isGreaterThan(0).and()
.field("hasMoreItems").is(false);
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Should filter types using modelId with subtypes and gets status code OK (200)")
public void getTypeByModelsIdsWithIncludeSubTypes() throws Exception
{
RestTypesCollection types = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(modelId in ('cm:contentmodel INCLUDESUBTYPES', 'smf:smartFolder INCLUDESUBTYPES'))")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.OK);
types.getPagination().assertThat().fieldsCount().is(5).and()
.field("totalItems").isGreaterThan(65).and()
.field("maxItems").is(100).and()
.field("skipCount").isGreaterThan(0).and()
.field("hasMoreItems").is(false);
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Should filter types using parentId and gets status code OK (200)")
public void getTypeByParentId() throws Exception
{
RestTypesCollection types = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(parentId in ('cm:content'))")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.OK);
types.getPagination().assertThat().fieldsCount().is(5).and()
.field("totalItems").isGreaterThan(40).and()
.field("hasMoreItems").is(false);
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Should get Type with association, properties and mandatory types and gets status code OK (200)")
public void getTypeIncludeParams() throws Exception
{
RestTypesCollection types = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("include=properties,mandatoryAspects,associations")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.OK);
for (RestAbstractClassModel type : types.getEntries())
{
type.onModel().assertThat()
.field("associations").isNotNull().and()
.field("properties").isNotNull().and()
.field("mandatoryAspects").isNotNull();
}
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section = {TestGroup.REST_API, TestGroup.MODEL }, executionType = ExecutionType.REGRESSION,
description = "Should verify the query errors with possible options")
public void verifyTypesQueryError() throws Exception
{
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(modelId in (' ')")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(modelId in ('cm:contentmodel INCLUDESUBTYPES',))")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(modelId in ('cm:contentmodel INCLUDESUBASPECTS'))")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(parentId in (' ')")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(parentId in ('cm:titled',))")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(parentId in ('cm:titled',))&include=properties")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(namespaceUri matches('*'))")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.BAD_REQUEST);
restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("where=(parentId in ('cm:titled'))&include=properties")
.getTypes();
restClient.assertStatusCodeIs(HttpStatus.OK);
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section={TestGroup.REST_API, TestGroup.MODEL}, executionType= ExecutionType.REGRESSION,
description= "Verify if any user gets types with high skipCount and maxItems parameter applied")
public void getPaginationParameter() throws Exception
{
RestTypesCollection types = restClient.authenticateUser(regularUser)
.withModelAPI()
.usingParams("maxItems=10&skipCount=10")
.getTypes();
types.assertThat().entriesListCountIs(10);
types.assertThat().paginationField("hasMoreItems").is("true");
types.assertThat().paginationField("skipCount").is("10");
types.assertThat().paginationField("maxItems").is("10");
restClient.assertStatusCodeIs(HttpStatus.OK);
}
@Test(groups = { TestGroup.REST_API, TestGroup.MODEL, TestGroup.REGRESSION })
@TestRail(section={TestGroup.REST_API, TestGroup.MODEL}, executionType= ExecutionType.REGRESSION,
description= "Verify if any user gets types with hasMoreItems applied based on skipCount and maxItems")
public void getHighPaginationQuery() throws Exception
{
RestTypesCollection types = restClient.authenticateUser(regularUser).withModelAPI()
.usingParams("maxItems=10&skipCount=150")
.getTypes();
types.assertThat().entriesListCountIs(0);
types.assertThat().paginationField("hasMoreItems").is("false");
types.assertThat().paginationField("skipCount").is("150");
types.assertThat().paginationField("maxItems").is("10");
restClient.assertStatusCodeIs(HttpStatus.OK);
}
}


@@ -0,0 +1,105 @@
/*
* Copyright (C) 2005-2017 Alfresco Software Limited.
* This file is part of Alfresco
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
*/
package org.alfresco.rest.renditions;
/**
* Handles tests related to api-explorer/#!/versions/createVersionRendition
*/
import org.alfresco.dataprep.CMISUtil.DocumentType;
import org.alfresco.rest.RestTest;
import org.alfresco.rest.core.RestResponse;
import org.alfresco.rest.model.RestRenditionInfoModel;
import org.alfresco.rest.model.RestRenditionInfoModelCollection;
import org.alfresco.utility.Utility;
import org.alfresco.utility.model.FileModel;
import org.alfresco.utility.model.SiteModel;
import org.alfresco.utility.model.TestGroup;
import org.alfresco.utility.model.UserModel;
import org.alfresco.utility.testrail.ExecutionType;
import org.alfresco.utility.testrail.annotation.TestRail;
import org.springframework.http.HttpStatus;
import org.testng.Assert;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;
import java.io.File;
import static org.alfresco.utility.report.log.Step.STEP;
@Test(groups = { TestGroup.RENDITIONS })
public class RenditionVersionTests extends RestTest
{
private UserModel user;
private SiteModel site;
private FileModel file;
@BeforeClass(alwaysRun = true)
public void dataPreparation() throws Exception
{
user = dataUser.createRandomTestUser();
site = dataSite.usingUser(user).createPublicRandomSite();
file = dataContent.usingUser(user).usingSite(site).createContent(DocumentType.TEXT_PLAIN);
}
/**
* Sanity test for the following endpoints:
* POST /nodes/{nodeId}/versions/{versionId}/rendition
* GET /nodes/{nodeId}/versions/{versionId}/renditions
* GET /nodes/{nodeId}/versions/{versionId}/renditions/{renditionId}
* GET /nodes/{nodeId}/versions/{versionId}/renditions/{renditionId}/content
* @throws Exception
*/
@Test(groups = { TestGroup.REST_API, TestGroup.RENDITIONS, TestGroup.SANITY })
@TestRail(section = { TestGroup.REST_API, TestGroup.RENDITIONS }, executionType = ExecutionType.SANITY,
        description = "Verify that the rendition can be created using POST /nodes/{nodeId}/versions/{versionId}/rendition")
public void testRenditionForNodeVersions() throws Exception
{
File sampleFile = Utility.getResourceTestDataFile("sampleContent.txt");
STEP("1. Update the node content in order to increase version, PUT /nodes/{nodeId}/content.");
// version update
restClient.authenticateUser(user).withCoreAPI().usingNode(file).updateNodeContent(sampleFile);
restClient.assertStatusCodeIs(HttpStatus.OK);
STEP("2. Create the pdf rendition of txt file using RESTAPI");
restClient.withCoreAPI().usingNode(file).createNodeVersionRendition("pdf", "1.1");
restClient.assertStatusCodeIs(HttpStatus.ACCEPTED);
STEP("3. Verify pdf rendition of txt file is created");
restClient.withCoreAPI().usingNode(file).getNodeVersionRenditionUntilIsCreated("pdf", "1.1").assertThat()
.field("status").is("CREATED");
STEP("4. Verify pdf rendition of txt file is listed");
RestRenditionInfoModelCollection renditionInfoModelCollection = restClient.withCoreAPI().usingNode(file)
.getNodeVersionRenditionsInfo("1.1");
restClient.assertStatusCodeIs(HttpStatus.OK);
for (RestRenditionInfoModel restRenditionInfoModel : renditionInfoModelCollection.getEntries())
{
RestRenditionInfoModel renditionInfo = restRenditionInfoModel.onModel();
String renditionId = renditionInfo.getId();
if ("pdf".equals(renditionId))
{
renditionInfo.assertThat().field("status").is("CREATED");
}
}
STEP("5. Verify pdf rendition of txt file has content");
RestResponse restResponse = restClient.withCoreAPI().usingNode(file)
.getNodeVersionRenditionContentUntilIsCreated("pdf", "1.1");
restClient.assertStatusCodeIs(HttpStatus.OK);
restClient.assertHeaderValueContains("Content-Type", "application/pdf;charset=UTF-8");
Assert.assertTrue(restResponse.getResponse().body().asInputStream().available() > 0);
}
}


@@ -15,6 +15,7 @@
<package name="org.alfresco.rest.tags.*"/>
<package name="org.alfresco.rest.trashcan.*"/>
<package name="org.alfresco.rest.workflow.*"/>
<package name="org.alfresco.rest.models.*"/>
</packages>
</test>
</suite>


@@ -9,7 +9,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-tests</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<developers>


@@ -7,12 +7,12 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-packaging</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<properties>
<scm-path>${project.parent.parent.scm.url}</scm-path>
<scm-revision>${build-number}</scm-revision>
<scm-revision>${buildNumber}</scm-revision>
</properties>
<dependencies>
@@ -140,6 +140,23 @@
</resource>
</resources>
<plugins>
<!-- Gets the scm revision and stores it in the ${buildNumber} variable -->
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>buildnumber-maven-plugin</artifactId>
<version>1.4</version>
<executions>
<execution>
<phase>validate</phase>
<goals>
<goal>create</goal>
</goals>
</execution>
</executions>
<configuration>
<shortRevisionLength>8</shortRevisionLength>
</configuration>
</plugin>
<plugin>
<artifactId>maven-dependency-plugin</artifactId>
<executions>


@@ -66,6 +66,8 @@
<bean id="SOLRAuthenticationFilter" class="org.alfresco.repo.web.scripts.solr.SOLRAuthenticationFilter">
<property name="secureComms" value="${solr.secureComms}"/>
<property name="sharedSecret" value="${solr.sharedSecret}"/>
<property name="sharedSecretHeader" value="${solr.sharedSecret.header}"/>
</bean>
<bean id="WebscriptAuthenticationFilter" class="org.alfresco.repo.management.subsystems.ChainingSubsystemProxyFactory">


@@ -184,5 +184,15 @@
</filter>
</config>
<!--
A set of HTTP response headers that instructs the browser to behave in certain ways to improve security
-->
<config evaluator="string-compare" condition="SecurityHeadersPolicy">
<headers>
<header>
<name>X-Frame-Options</name>
<value>SAMEORIGIN</value>
</header>
</headers>
</config>
</alfresco-config>


@@ -104,6 +104,12 @@
<filter-class>org.springframework.extensions.webscripts.servlet.CSRFFilter</filter-class>
</filter>
<filter>
<description>Security Headers filter. Adds security response headers based on config.</description>
<filter-name>Security Headers Filter</filter-name>
<filter-class>org.springframework.extensions.webscripts.servlet.SecurityHeadersFilter</filter-class>
</filter>
<!-- Enterprise filter placeholder -->
<filter-mapping>
<filter-name>Clear security context filter</filter-name>
@@ -225,6 +231,11 @@
<url-pattern>/wcs/admin/*</url-pattern>
</filter-mapping>
<filter-mapping>
<filter-name>Security Headers Filter</filter-name>
<url-pattern>/*</url-pattern>
</filter-mapping>
<!-- Enterprise filter-mapping placeholder -->
<!-- Spring Context Loader listener - can disable loading of context if runtime config changes are needed -->

pom.xml

@@ -2,7 +2,7 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>alfresco-community-repo</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
<packaging>pom</packaging>
<name>Alfresco Community Repo Parent</name>
@@ -22,7 +22,7 @@
<properties>
<acs.version.major>7</acs.version.major>
<acs.version.minor>0</acs.version.minor>
<acs.version.minor>1</acs.version.minor>
<acs.version.revision>0</acs.version.revision>
<acs.version.label />
@@ -49,20 +49,20 @@
<dependency.alfresco-log-sanitizer.version>0.2</dependency.alfresco-log-sanitizer.version>
<dependency.activiti-engine.version>5.23.0</dependency.activiti-engine.version>
<dependency.activiti.version>5.23.0</dependency.activiti.version>
<dependency.transform.model.version>1.0.2.12</dependency.transform.model.version>
<dependency.transform.model.version>1.3.1</dependency.transform.model.version>
<dependency.alfresco-greenmail.version>6.2</dependency.alfresco-greenmail.version>
<dependency.acs-event-model.version>0.0.11</dependency.acs-event-model.version>
<dependency.acs-event-model.version>0.0.12</dependency.acs-event-model.version>
<dependency.spring.version>5.3.3</dependency.spring.version>
<dependency.antlr.version>3.5.2</dependency.antlr.version>
<dependency.jackson.version>2.12.1</dependency.jackson.version>
<dependency.jackson.version>2.12.2</dependency.jackson.version>
<dependency.jackson-databind.version>${dependency.jackson.version}</dependency.jackson-databind.version>
<dependency.cxf.version>3.4.2</dependency.cxf.version>
<dependency.cxf.version>3.4.3</dependency.cxf.version>
<dependency.opencmis.version>1.0.0</dependency.opencmis.version>
<dependency.webscripts.version>8.15</dependency.webscripts.version>
<dependency.webscripts.version>8.18</dependency.webscripts.version>
<dependency.bouncycastle.version>1.68</dependency.bouncycastle.version>
<dependency.mockito-core.version>3.7.7</dependency.mockito-core.version>
<dependency.org-json.version>20201115</dependency.org-json.version>
<dependency.mockito-core.version>3.9.0</dependency.mockito-core.version>
<dependency.org-json.version>20210307</dependency.org-json.version>
<dependency.commons-dbcp.version>1.4-DBCP330</dependency.commons-dbcp.version>
<dependency.commons-io.version>2.8.0</dependency.commons-io.version>
<dependency.gson.version>2.8.5</dependency.gson.version>
@@ -73,14 +73,14 @@
<dependency.slf4j.version>1.7.30</dependency.slf4j.version>
<dependency.gytheio.version>0.12</dependency.gytheio.version>
<dependency.groovy.version>2.5.9</dependency.groovy.version>
<dependency.tika.version>1.25</dependency.tika.version>
<dependency.tika.version>1.26</dependency.tika.version>
<dependency.spring-security.version>5.4.1</dependency.spring-security.version>
<dependency.truezip.version>7.7.10</dependency.truezip.version>
<dependency.poi.version>4.1.2</dependency.poi.version>
<dependency.ooxml-schemas.version>1.4</dependency.ooxml-schemas.version>
<dependency.keycloak.version>11.0.0-alfresco-001</dependency.keycloak.version>
<dependency.jboss.logging.version>3.4.1.Final</dependency.jboss.logging.version>
<dependency.camel.version>3.7.0</dependency.camel.version>
<dependency.camel.version>3.7.1</dependency.camel.version>
<dependency.activemq.version>5.16.1</dependency.activemq.version>
<dependency.apache.taglibs.version>1.2.5</dependency.apache.taglibs.version>
<dependency.awaitility.version>4.0.3</dependency.awaitility.version>
@@ -96,16 +96,16 @@
<dependency.jakarta-json-api.version>1.1.6</dependency.jakarta-json-api.version>
<dependency.jakarta-rpc-api.version>1.1.4</dependency.jakarta-rpc-api.version>
<alfresco.googledrive.version>3.2.0</alfresco.googledrive.version>
<alfresco.aos-module.version>1.4.0-M1</alfresco.aos-module.version>
<alfresco.googledrive.version>3.2.1</alfresco.googledrive.version>
<alfresco.aos-module.version>1.4.0</alfresco.aos-module.version>
<dependency.postgresql.version>42.2.18</dependency.postgresql.version>
<dependency.postgresql.version>42.2.19</dependency.postgresql.version>
<dependency.mysql.version>8.0.23</dependency.mysql.version>
<dependency.mariadb.version>2.7.2</dependency.mariadb.version>
<dependency.tas-utility.version>3.0.42</dependency.tas-utility.version>
<dependency.tas-utility.version>3.0.44</dependency.tas-utility.version>
<dependency.rest-assured.version>3.3.0</dependency.rest-assured.version>
<dependency.tas-restapi.version>1.52</dependency.tas-restapi.version>
<dependency.tas-cmis.version>1.27</dependency.tas-cmis.version>
<dependency.tas-restapi.version>1.58</dependency.tas-restapi.version>
<dependency.tas-cmis.version>1.29</dependency.tas-cmis.version>
<dependency.tas-email.version>1.8</dependency.tas-email.version>
<dependency.tas-webdav.version>1.6</dependency.tas-webdav.version>
<dependency.tas-ftp.version>1.5</dependency.tas-ftp.version>
@@ -116,7 +116,7 @@
<connection>scm:git:https://github.com/Alfresco/alfresco-community-repo.git</connection>
<developerConnection>scm:git:https://github.com/Alfresco/alfresco-community-repo.git</developerConnection>
<url>https://github.com/Alfresco/alfresco-community-repo</url>
<tag>repo-5439v2-c1</tag>
<tag>11.13</tag>
</scm>
<distributionManagement>
@@ -679,7 +679,7 @@
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.10.9</version>
<version>2.10.10</version>
</dependency>
<!-- provided dependencies -->
@@ -694,7 +694,7 @@
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.13</version>
<version>4.13.1</version>
<scope>test</scope>
</dependency>
<dependency>
@@ -777,6 +777,11 @@
<artifactId>camel-direct</artifactId>
<version>${dependency.camel.version}</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-management</artifactId>
<version>${dependency.camel.version}</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-mock</artifactId>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<dependencies>


@@ -1,62 +1,61 @@
/*
* #%L
* Alfresco Remote API
* %%
* Copyright (C) 2005 - 2016 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.web.scripts.solr;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.FilterChain;
import javax.servlet.ServletContext;
import javax.servlet.ServletException;
import javax.servlet.ServletOutputStream;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpServletResponseWrapper;
import org.alfresco.error.AlfrescoRuntimeException;
import org.alfresco.httpclient.HttpClientFactory;
import org.alfresco.repo.web.filter.beans.DependencyInjectedFilter;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.InitializingBean;
/**
* This filter protects the solr callback urls by verifying MACs on requests and encrypting responses
* and generating MACs on responses, if the secureComms property is set to "md5". If it is set to "https"
* or "none", the filter does nothing to the request and response.
*
* This filter protects the solr callback urls by verifying a shared secret on the request header if
* the secureComms property is set to "secret". If it is set to "https", it will just verify
* that the request came in through a "secure" tomcat connector (it will not validate the certificate
* on the request; that is done in a different filter).
*
* @since 4.0
*
*/
public class SOLRAuthenticationFilter implements DependencyInjectedFilter
public class SOLRAuthenticationFilter implements DependencyInjectedFilter, InitializingBean
{
public static enum SecureCommsType
{
HTTPS, NONE;
HTTPS, SECRET, NONE;
public static SecureCommsType getType(String type)
{
@@ -64,6 +63,10 @@ public class SOLRAuthenticationFilter implements DependencyInjectedFilter
{
return HTTPS;
}
else if(type.equalsIgnoreCase("secret"))
{
return SECRET;
}
else if(type.equalsIgnoreCase("none"))
{
return NONE;
@@ -79,7 +82,11 @@ public class SOLRAuthenticationFilter implements DependencyInjectedFilter
private static Log logger = LogFactory.getLog(SOLRAuthenticationFilter.class);
private SecureCommsType secureComms = SecureCommsType.HTTPS;
private String sharedSecret;
private String sharedSecretHeader = HttpClientFactory.DEFAULT_SHAREDSECRET_HEADER;
public void setSecureComms(String type)
{
try
@@ -92,6 +99,33 @@ public class SOLRAuthenticationFilter implements DependencyInjectedFilter
}
}
public void setSharedSecret(String sharedSecret)
{
this.sharedSecret = sharedSecret;
}
public void setSharedSecretHeader(String sharedSecretHeader)
{
this.sharedSecretHeader = sharedSecretHeader;
}
@Override
public void afterPropertiesSet() throws Exception
{
if(secureComms == SecureCommsType.SECRET)
{
if(sharedSecret == null || sharedSecret.length()==0)
{
logger.fatal("Missing value for solr.sharedSecret configuration property. If solr.secureComms is set to \"secret\", a value for solr.sharedSecret is required. See https://docs.alfresco.com/search-services/latest/install/options/");
throw new AlfrescoRuntimeException("Missing value for solr.sharedSecret configuration property");
}
if(sharedSecretHeader == null || sharedSecretHeader.length()==0)
{
throw new AlfrescoRuntimeException("Missing value for sharedSecretHeader");
}
}
}
public void doFilter(ServletContext context, ServletRequest request,
ServletResponse response, FilterChain chain) throws IOException,
ServletException
@@ -99,52 +133,22 @@ public class SOLRAuthenticationFilter implements DependencyInjectedFilter
HttpServletRequest httpRequest = (HttpServletRequest)request;
HttpServletResponse httpResponse = (HttpServletResponse)response;
/* if(secureComms == SecureCommsType.ALFRESCO)
if(secureComms == SecureCommsType.SECRET)
{
// Need to get as a byte array because we need to read the request twice, once for authentication
// and again by the web service.
SOLRHttpServletRequestWrapper requestWrapper = new SOLRHttpServletRequestWrapper(httpRequest, encryptionUtils);
if(logger.isDebugEnabled())
if(sharedSecret.equals(httpRequest.getHeader(sharedSecretHeader)))
{
logger.debug("Authenticating " + httpRequest.getRequestURI());
}
if(encryptionUtils.authenticate(httpRequest, requestWrapper.getDecryptedBody()))
{
try
{
OutputStream out = response.getOutputStream();
GenericResponseWrapper responseWrapper = new GenericResponseWrapper(httpResponse);
// TODO - do I need to chain to other authenticating filters - probably not?
// Could also remove sending of credentials with http request
chain.doFilter(requestWrapper, responseWrapper);
Pair<byte[], AlgorithmParameters> pair = encryptor.encrypt(KeyProvider.ALIAS_SOLR, null, responseWrapper.getData());
encryptionUtils.setResponseAuthentication(httpRequest, httpResponse, responseWrapper.getData(), pair.getSecond());
httpResponse.setHeader("Content-Length", Long.toString(pair.getFirst().length));
out.write(pair.getFirst());
out.close();
}
catch(Exception e)
{
throw new AlfrescoRuntimeException("", e);
}
chain.doFilter(request, response);
}
else
{
httpResponse.setStatus(401);
httpResponse.sendError(HttpServletResponse.SC_FORBIDDEN, "Authentication failure");
}
}
else */if(secureComms == SecureCommsType.HTTPS)
else if(secureComms == SecureCommsType.HTTPS)
{
if(httpRequest.isSecure())
{
// https authentication
// https authentication; cert got verified in X509 filter
chain.doFilter(request, response);
}
else
@@ -158,128 +162,4 @@ public class SOLRAuthenticationFilter implements DependencyInjectedFilter
}
}
protected boolean validateTimestamp(String timestampStr)
{
if(timestampStr == null || timestampStr.equals(""))
{
throw new AlfrescoRuntimeException("Missing timestamp on request");
}
long timestamp = -1;
try
{
timestamp = Long.valueOf(timestampStr);
}
catch(NumberFormatException e)
{
throw new AlfrescoRuntimeException("Invalid timestamp on request");
}
if(timestamp == -1)
{
throw new AlfrescoRuntimeException("Invalid timestamp on request");
}
long currentTime = System.currentTimeMillis();
return((currentTime - timestamp) < 30 * 1000); // 30s
}
/* private static class SOLRHttpServletRequestWrapper extends HttpServletRequestWrapper
{
private byte[] body;
SOLRHttpServletRequestWrapper(HttpServletRequest req, EncryptionUtils encryptionUtils) throws IOException
{
super(req);
this.body = encryptionUtils.decryptBody(req);
}
byte[] getDecryptedBody()
{
return body;
}
public ServletInputStream getInputStream()
{
final InputStream in = (body != null ? new ByteArrayInputStream(body) : null);
return new ServletInputStream()
{
public int read() throws IOException
{
if(in == null)
{
return -1;
}
else
{
int i = in.read();
if(i == -1)
{
in.close();
}
return i;
}
}
};
}
}*/
private static class ByteArrayServletOutputStream extends ServletOutputStream
{
private ByteArrayOutputStream out = new ByteArrayOutputStream();
ByteArrayServletOutputStream()
{
}
public byte[] getData()
{
return out.toByteArray();
}
@Override
public void write(int b) throws IOException
{
out.write(b);
}
}
public static class GenericResponseWrapper extends HttpServletResponseWrapper {
private ByteArrayServletOutputStream output;
private int contentLength;
private String contentType;
public GenericResponseWrapper(HttpServletResponse response) {
super(response);
output = new ByteArrayServletOutputStream();
}
public byte[] getData() {
return output.getData();
}
public ServletOutputStream getOutputStream() {
return output;
}
public PrintWriter getWriter() {
return new PrintWriter(getOutputStream(),true);
}
public void setContentLength(int length) {
this.contentLength = length;
super.setContentLength(length);
}
public int getContentLength() {
return contentLength;
}
public void setContentType(String type) {
this.contentType = type;
super.setContentType(type);
}
public String getContentType() {
return contentType;
}
}
}
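The removed `validateTimestamp` helper above enforced a 30-second freshness window on request timestamps. A minimal standalone sketch of that check (class and method names are illustrative, not the filter's API):

```java
// Hypothetical mirror of the freshness check removed from SOLRAuthenticationFilter
public class TimestampCheck {
    static boolean isFresh(String timestampStr) {
        if (timestampStr == null || timestampStr.isEmpty()) {
            throw new IllegalArgumentException("Missing timestamp on request");
        }
        final long timestamp;
        try {
            timestamp = Long.parseLong(timestampStr);
        } catch (NumberFormatException e) {
            throw new IllegalArgumentException("Invalid timestamp on request");
        }
        // accept only requests stamped within the last 30 seconds
        return (System.currentTimeMillis() - timestamp) < 30 * 1000L;
    }

    public static void main(String[] args) {
        System.out.println(isFresh(Long.toString(System.currentTimeMillis()))); // true
        System.out.println(isFresh("0")); // false
    }
}
```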


@@ -30,8 +30,22 @@ import org.alfresco.rest.api.model.Aspect;
import org.alfresco.rest.framework.resource.parameters.CollectionWithPagingInfo;
import org.alfresco.rest.framework.resource.parameters.Parameters;
/**
* Aspect API
*/
public interface Aspects
{
/**
* Lists aspects
* @param params query and paging parameters
* @return Collection of aspects
*/
CollectionWithPagingInfo<Aspect> listAspects(Parameters params);
Aspect getAspectById(String aspectId);
/**
* Gets an aspect by id
* @param aspectId
* @return an aspect
*/
Aspect getAspect(String aspectId);
}


@@ -30,8 +30,24 @@ import org.alfresco.rest.api.model.Type;
import org.alfresco.rest.framework.resource.parameters.CollectionWithPagingInfo;
import org.alfresco.rest.framework.resource.parameters.Parameters;
/**
* Types API
*/
public interface Types
{
/**
* Lists types
*
* @param params query and paging parameters
* @return Collection of types
*/
CollectionWithPagingInfo<Type> listTypes(Parameters params);
Type getType(String aspectId);
/**
* Gets a type by id
*
* @param typeId
* @return type
*/
Type getType(String typeId);
}


@@ -61,6 +61,6 @@ public class AspectEntityResource implements EntityResourceAction.ReadById<Aspec
@Override
public Aspect readById(String id, Parameters parameters)
{
return aspects.getAspectById(id);
return aspects.getAspect(id);
}
}


@@ -26,15 +26,25 @@
package org.alfresco.rest.api.impl;
import com.google.common.collect.ImmutableList;
import org.alfresco.rest.api.ClassDefinitionMapper;
import org.alfresco.rest.api.model.AssociationSource;
import org.alfresco.rest.api.model.Association;
import org.alfresco.rest.api.model.AbstractClass;
import org.alfresco.rest.api.model.PropertyDefinition;
import org.alfresco.rest.api.model.ClassDefinition;
import org.alfresco.rest.framework.core.exceptions.InvalidArgumentException;
import org.alfresco.rest.framework.resource.parameters.CollectionWithPagingInfo;
import org.alfresco.rest.framework.resource.parameters.Paging;
import org.alfresco.rest.framework.resource.parameters.where.Query;
import org.alfresco.rest.framework.resource.parameters.where.QueryHelper;
import org.alfresco.rest.workflow.api.impl.MapBasedQueryWalker;
import org.alfresco.service.cmr.dictionary.AssociationDefinition;
import org.alfresco.service.cmr.dictionary.DictionaryService;
import org.alfresco.service.namespace.NamespacePrefixResolver;
import org.alfresco.service.namespace.NamespaceService;
import org.alfresco.service.namespace.QName;
import org.alfresco.util.Pair;
import org.apache.commons.lang3.StringUtils;
import java.util.Arrays;
@@ -42,12 +52,36 @@ import java.util.List;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import java.util.Map;
import java.util.Collection;
import java.util.ArrayList;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;
import java.util.function.Predicate;
public class AbstractClassImpl<T extends AbstractClass> {
static String PARAM_MODEL_IDS = "modelIds";
static String PARAM_PARENT_IDS = "parentIds";
static String PARAM_MODEL_IDS = "modelId";
static String PARAM_PARENT_IDS = "parentId";
static String PARAM_NAMESPACE_URI = "namespaceUri";
static String PARAM_INCLUDE_SUBASPECTS = "INCLUDESUBASPECTS";
static String PARAM_INCLUDE_SUBTYPES = "INCLUDESUBTYPES";
static String PARAM_INCLUDE_PROPERTIES = "properties";
static String PARAM_INCLUDE_MANDATORY_ASPECTS = "mandatoryAspects";
static String PARAM_INCLUDE_ASSOCIATIONS = "associations";
static List<String> ALL_PROPERTIES = ImmutableList.of(PARAM_INCLUDE_PROPERTIES, PARAM_INCLUDE_MANDATORY_ASPECTS, PARAM_INCLUDE_ASSOCIATIONS);
private DictionaryService dictionaryService;
private NamespacePrefixResolver namespaceService;
private ClassDefinitionMapper classDefinitionMapper;
AbstractClassImpl(DictionaryService dictionaryService, NamespacePrefixResolver namespaceService, ClassDefinitionMapper classDefinitionMapper)
{
this.dictionaryService = dictionaryService;
this.namespaceService = namespaceService;
this.classDefinitionMapper = classDefinitionMapper;
}
public CollectionWithPagingInfo<T> createPagedResult(List<T> list, Paging paging)
{
@@ -95,7 +129,13 @@ public class AbstractClassImpl<T extends AbstractClass> {
{
ClassQueryWalker propertyWalker = new ClassQueryWalker();
QueryHelper.walk(queryParameters, propertyWalker);
return new ModelApiFilter(propertyWalker.getModelIds(), propertyWalker.getParentIds(), propertyWalker.getMatchedPrefix(), propertyWalker.getNotMatchedPrefix());
return ModelApiFilter.builder()
.withModelId(propertyWalker.getModelIds())
.withParentIds(propertyWalker.getParentIds())
.withMatchPrefix(propertyWalker.getMatchedPrefix())
.withNotMatchPrefix(propertyWalker.getNotMatchedPrefix())
.build();
}
return null;
}
@@ -108,13 +148,137 @@ public class AbstractClassImpl<T extends AbstractClass> {
}
listParam.stream()
.filter(String::isEmpty)
.filter(StringUtils::isBlank)
.findAny()
.ifPresent(qName -> {
throw new IllegalArgumentException(StringUtils.capitalize(paramName) + " cannot be empty (i.e. '')");
});
}
protected Set<Pair<QName,Boolean>> parseModelIds(Set<String> modelIds, String apiSuffix)
{
return modelIds.stream().map(modelId ->
{
QName qName = null;
boolean filterIncludeSubClass = false;
int idx = modelId.lastIndexOf(' ');
if (idx > 0)
{
String suffix = modelId.substring(idx);
if (suffix.equalsIgnoreCase(" " + apiSuffix))
{
filterIncludeSubClass = true;
modelId = modelId.substring(0, idx);
}
}
try
{
qName = QName.createQName(modelId, this.namespaceService);
}
catch (Exception ex)
{
throw new InvalidArgumentException(modelId + " isn't a valid QName. " + ex.getMessage());
}
if (qName == null)
throw new InvalidArgumentException(modelId + " isn't a valid QName. ");
return new Pair<>(qName, filterIncludeSubClass);
}).collect(Collectors.toSet());
}
public T constructFromFilters(T abstractClass, org.alfresco.service.cmr.dictionary.ClassDefinition classDefinition, List<String> includes) {
if (includes != null && includes.contains(PARAM_INCLUDE_PROPERTIES))
{
List<PropertyDefinition> properties = Collections.emptyList();
ClassDefinition _classDefinition = this.classDefinitionMapper.fromDictionaryClassDefinition(classDefinition, dictionaryService);
if (_classDefinition.getProperties() != null)
{
properties = _classDefinition.getProperties();
}
abstractClass.setProperties(properties);
}
if (includes != null && includes.contains(PARAM_INCLUDE_ASSOCIATIONS))
{
List<Association> associations = getAssociations(classDefinition.getAssociations());
abstractClass.setAssociations(associations);
}
if (includes != null && includes.contains(PARAM_INCLUDE_MANDATORY_ASPECTS))
{
if (classDefinition.getDefaultAspectNames() != null)
{
List<String> aspects = classDefinition.getDefaultAspectNames().stream().map(QName::toPrefixString).collect(Collectors.toList());
abstractClass.setMandatoryAspects(aspects);
}
}
abstractClass.setIsContainer(classDefinition.isContainer());
abstractClass.setIsArchive(classDefinition.getArchive());
abstractClass.setIncludedInSupertypeQuery(classDefinition.getIncludedInSuperTypeQuery());
return abstractClass;
}
List<Association> getAssociations(Map<QName, AssociationDefinition> associationDefinitionMap)
{
Collection<AssociationDefinition> associationDefinitions = associationDefinitionMap.values();
if (associationDefinitions.size() == 0)
return Collections.emptyList();
List<Association> associations = new ArrayList<Association>();
for (AssociationDefinition definition : associationDefinitions)
{
Association association = new Association();
association.setId(definition.getName().toPrefixString());
association.setTitle(definition.getTitle());
association.setDescription(definition.getDescription());
association.setIsChild(definition.isChild());
association.setIsProtected(definition.isProtected());
AssociationSource source = new AssociationSource();
String sourceRole = definition.getSourceRoleName() != null ? definition.getSourceRoleName().toPrefixString() : null;
source.setRole(sourceRole);
String sourceClass = definition.getSourceClass() != null ? definition.getSourceClass().getName().toPrefixString() : null;
source.setCls(sourceClass);
source.setIsMany(definition.isSourceMany());
source.setIsMandatory(definition.isSourceMandatory());
AssociationSource target = new AssociationSource();
String targetRole = definition.getTargetRoleName() != null ? definition.getTargetRoleName().toPrefixString() : null;
target.setRole(targetRole);
String targetClass = definition.getTargetClass() != null ? definition.getTargetClass().getName().toPrefixString() : null;
target.setCls(targetClass);
target.setIsMany(definition.isTargetMany());
target.setIsMandatory(definition.isTargetMandatory());
target.setIsMandatoryEnforced(definition.isTargetMandatoryEnforced());
association.setSource(source);
association.setTarget(target);
associations.add(association);
}
return associations;
}
public static <T> Predicate<T> distinctByKey(Function<? super T, ?> keyExtractor) {
Map<Object, Boolean> seen = new ConcurrentHashMap<>();
return t -> seen.putIfAbsent(keyExtractor.apply(t), Boolean.TRUE) == null;
}
public static class ClassQueryWalker extends MapBasedQueryWalker
{
private Set<String> modelIds = null;
@@ -187,12 +351,8 @@ public class AbstractClassImpl<T extends AbstractClass> {
private String matchedPrefix;
private String notMatchedPrefix;
public ModelApiFilter(Set<String> modelIds, Set<String> parentIds, String matchedPrefix, String notMatchedPrefix)
public ModelApiFilter()
{
this.modelIds = modelIds;
this.parentIds = parentIds;
this.matchedPrefix = matchedPrefix;
this.notMatchedPrefix = notMatchedPrefix;
}
public Set<String> getModelIds()
@@ -214,5 +374,52 @@ public class AbstractClassImpl<T extends AbstractClass> {
{
return parentIds;
}
public static ModelApiFilterBuilder builder()
{
return new ModelApiFilterBuilder();
}
public static class ModelApiFilterBuilder
{
private Set<String> modelIds;
private Set<String> parentIds;
private String matchedPrefix;
private String notMatchedPrefix;
public ModelApiFilterBuilder withModelId(Set<String> modelIds)
{
this.modelIds = modelIds;
return this;
}
public ModelApiFilterBuilder withParentIds(Set<String> parentIds)
{
this.parentIds = parentIds;
return this;
}
public ModelApiFilterBuilder withMatchPrefix(String matchedPrefix)
{
this.matchedPrefix = matchedPrefix;
return this;
}
public ModelApiFilterBuilder withNotMatchPrefix(String notMatchedPrefix)
{
this.notMatchedPrefix = notMatchedPrefix;
return this;
}
public ModelApiFilter build()
{
ModelApiFilter modelApiFilter = new ModelApiFilter();
modelApiFilter.modelIds = modelIds;
modelApiFilter.parentIds = parentIds;
modelApiFilter.matchedPrefix = matchedPrefix;
modelApiFilter.notMatchedPrefix = notMatchedPrefix;
return modelApiFilter;
}
}
}
}
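`ModelApiFilter.builder()` above follows the standard fluent-builder pattern: each `with…` method stages a value and returns `this`, and `build()` copies the staged fields into a fresh instance. A simplified, self-contained stand-in for the same shape (class and field names here are illustrative, not the real API):

```java
import java.util.Set;

public class FilterBuilderDemo
{
    // Simplified stand-in for ModelApiFilter with only two fields.
    static class Filter
    {
        Set<String> modelIds;
        String matchedPrefix;

        static Builder builder()
        {
            return new Builder();
        }

        static class Builder
        {
            private Set<String> modelIds;
            private String matchedPrefix;

            Builder withModelIds(Set<String> modelIds)
            {
                this.modelIds = modelIds;
                return this; // fluent: enables chained calls
            }

            Builder withMatchPrefix(String matchedPrefix)
            {
                this.matchedPrefix = matchedPrefix;
                return this;
            }

            Filter build()
            {
                // Copy staged values into the product instance.
                Filter f = new Filter();
                f.modelIds = modelIds;
                f.matchedPrefix = matchedPrefix;
                return f;
            }
        }
    }

    public static void main(String[] args)
    {
        Filter f = Filter.builder()
                .withModelIds(Set.of("mycompany:model"))
                .withMatchPrefix("cm")
                .build();
        System.out.println(f.matchedPrefix); // cm
    }
}
```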


@@ -26,10 +26,10 @@
package org.alfresco.rest.api.impl;
import org.alfresco.error.AlfrescoRuntimeException;
import org.alfresco.rest.api.Aspects;
import org.alfresco.rest.api.ClassDefinitionMapper;
import org.alfresco.rest.api.model.Aspect;
import org.alfresco.rest.api.model.PropertyDefinition;
import org.alfresco.rest.framework.core.exceptions.EntityNotFoundException;
import org.alfresco.rest.framework.core.exceptions.InvalidArgumentException;
import org.alfresco.rest.framework.resource.parameters.CollectionWithPagingInfo;
@@ -41,10 +41,12 @@ import org.alfresco.service.cmr.dictionary.ModelDefinition;
import org.alfresco.service.namespace.NamespaceException;
import org.alfresco.service.namespace.NamespacePrefixResolver;
import org.alfresco.service.namespace.QName;
import org.alfresco.util.Pair;
import org.alfresco.util.PropertyCheck;
import java.util.List;
import java.util.Collection;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.Stream;
@@ -76,37 +78,44 @@ public class AspectsImpl extends AbstractClassImpl<Aspect> implements Aspects
PropertyCheck.mandatory(this, "classDefinitionMapper", classDefinitionMapper);
}
AspectsImpl(DictionaryService dictionaryService, NamespacePrefixResolver namespaceService, ClassDefinitionMapper classDefinitionMapper)
{
super(dictionaryService, namespaceService, classDefinitionMapper);
}
@Override
public CollectionWithPagingInfo<Aspect> listAspects(Parameters params)
{
Paging paging = params.getPaging();
ModelApiFilter query = getQuery(params.getQuery());
Stream<QName> aspectList = null;
Stream<QName> aspectStream = null;
if (query != null && query.getModelIds() != null)
{
validateListParam(query.getModelIds(), PARAM_MODEL_IDS);
aspectList = query.getModelIds().parallelStream().map(this::getModelAspects).flatMap(Collection::parallelStream);
Set<Pair<QName, Boolean>> modelsFilter = parseModelIds(query.getModelIds(), PARAM_INCLUDE_SUBASPECTS);
aspectStream = modelsFilter.stream().map(this::getModelAspects).flatMap(Collection::stream);
}
else if (query != null && query.getParentIds() != null)
{
validateListParam(query.getParentIds(), PARAM_PARENT_IDS);
aspectList = query.getParentIds().parallelStream().map(this::getChildAspects).flatMap(Collection::parallelStream);
aspectStream = query.getParentIds().stream().map(this::getChildAspects).flatMap(Collection::stream);
}
else
{
aspectList = this.dictionaryService.getAllAspects().parallelStream();
aspectStream = this.dictionaryService.getAllAspects().stream();
}
List<Aspect> allAspects = aspectList.filter((qName) -> filterByNamespace(query, qName))
.map((qName) -> this.convertToAspect(dictionaryService.getAspect(qName)))
List<Aspect> allAspects = aspectStream.filter((qName) -> filterByNamespace(query, qName))
.filter(distinctByKey(QName::getPrefixString))
.map((qName) -> this.convertToAspect(dictionaryService.getAspect(qName), params.getInclude()))
.collect(Collectors.toList());
return createPagedResult(allAspects, paging);
}
@Override
public Aspect getAspectById(String aspectId)
public Aspect getAspect(String aspectId)
{
if (aspectId == null)
throw new InvalidArgumentException("Invalid parameter: unknown scheme specified");
@@ -125,32 +134,50 @@ public class AspectsImpl extends AbstractClassImpl<Aspect> implements Aspects
if (aspectDefinition == null)
throw new EntityNotFoundException(aspectId);
return this.convertToAspect(aspectDefinition);
return this.convertToAspect(aspectDefinition, ALL_PROPERTIES);
}
public Aspect convertToAspect(AspectDefinition aspectDefinition)
public Aspect convertToAspect(AspectDefinition aspectDefinition, List<String> includes)
{
List<PropertyDefinition> properties = this.classDefinitionMapper.fromDictionaryClassDefinition(aspectDefinition, dictionaryService).getProperties();
return new Aspect(aspectDefinition, dictionaryService, properties);
try
{
Aspect aspect = new Aspect(aspectDefinition, dictionaryService);
constructFromFilters(aspect, aspectDefinition, includes);
return aspect;
}
catch (Exception ex)
{
throw new AlfrescoRuntimeException("Failed to parse Aspect: " + aspectDefinition.getName() + ". " + ex.getMessage());
}
}
private Collection<QName> getModelAspects(String modelId)
private Collection<QName> getModelAspects(Pair<QName,Boolean> model)
{
ModelDefinition modelDefinition = null;
if (modelId == null)
throw new InvalidArgumentException("modelId is null");
try
{
modelDefinition = this.dictionaryService.getModel(QName.createQName(modelId, this.namespaceService));
modelDefinition = this.dictionaryService.getModel(model.getFirst());
}
catch (NamespaceException exception)
catch (Exception exception)
{
throw new InvalidArgumentException(exception.getMessage());
}
return this.dictionaryService.getAspects(modelDefinition.getName());
if (modelDefinition == null)
throw new EntityNotFoundException("model");
Collection<QName> aspects = this.dictionaryService.getAspects(modelDefinition.getName());
if (!model.getSecond()) // look for model aspects alone
return aspects;
Stream<QName> aspectStream = aspects.stream();
Stream<QName> childrenStream = aspects.stream()
.map(aspect -> this.dictionaryService.getSubAspects(aspect, false))
.flatMap(Collection::stream);
return Stream.concat(aspectStream, childrenStream).collect(Collectors.toList());
}
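`getModelAspects` above widens the result to sub-aspects by concatenating two streams: the model's own aspects, and each aspect's children flat-mapped into one stream. The same shape in a self-contained sketch (the parent/child map is hypothetical, standing in for `dictionaryService.getSubAspects`):

```java
import java.util.Collection;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ConcatChildrenDemo
{
    // Joins each parent with its children, parents first, mirroring the
    // Stream.concat(aspectStream, childrenStream) pattern above.
    static List<String> withChildren(Collection<String> parents, Map<String, List<String>> children)
    {
        Stream<String> parentStream = parents.stream();
        Stream<String> childStream = parents.stream()
                .map(p -> children.getOrDefault(p, List.of()))
                .flatMap(Collection::stream);
        return Stream.concat(parentStream, childStream).collect(Collectors.toList());
    }

    public static void main(String[] args)
    {
        // Hypothetical parent -> sub-aspect mapping; not the real hierarchy.
        Map<String, List<String>> subAspects = Map.of("cm:titled", List.of("cm:auditable"));
        System.out.println(withChildren(List.of("cm:titled", "cm:emailed"), subAspects));
        // [cm:titled, cm:emailed, cm:auditable]
    }
}
```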
private Collection<QName> getChildAspects(String aspectId)


@@ -26,10 +26,10 @@
package org.alfresco.rest.api.impl;
import org.alfresco.rest.api.Types;
import org.alfresco.error.AlfrescoRuntimeException;
import org.alfresco.rest.api.ClassDefinitionMapper;
import org.alfresco.rest.api.Types;
import org.alfresco.rest.api.model.Type;
import org.alfresco.rest.api.model.PropertyDefinition;
import org.alfresco.rest.framework.core.exceptions.EntityNotFoundException;
import org.alfresco.rest.framework.core.exceptions.InvalidArgumentException;
import org.alfresco.rest.framework.resource.parameters.CollectionWithPagingInfo;
@@ -41,10 +41,12 @@ import org.alfresco.service.cmr.dictionary.TypeDefinition;
import org.alfresco.service.namespace.NamespaceException;
import org.alfresco.service.namespace.NamespacePrefixResolver;
import org.alfresco.service.namespace.QName;
import org.alfresco.util.Pair;
import org.alfresco.util.PropertyCheck;
import java.util.List;
import java.util.Collection;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.Stream;
@@ -76,32 +78,40 @@ public class TypesImpl extends AbstractClassImpl<Type> implements Types
PropertyCheck.mandatory(this, "classDefinitionMapper", classDefinitionMapper);
}
TypesImpl(DictionaryService dictionaryService, NamespacePrefixResolver namespaceService, ClassDefinitionMapper classDefinitionMapper)
{
super(dictionaryService, namespaceService, classDefinitionMapper);
}
@Override
public CollectionWithPagingInfo<Type> listTypes(Parameters params)
{
Paging paging = params.getPaging();
ModelApiFilter query = getQuery(params.getQuery());
Stream<QName> typeList = null;
Stream<QName> typeStream = null;
if (query != null && query.getModelIds() != null)
{
validateListParam(query.getModelIds(), PARAM_MODEL_IDS);
typeList = query.getModelIds().parallelStream().map(this::getModelTypes).flatMap(Collection::parallelStream);
Set<Pair<QName, Boolean>> modelsFilter = parseModelIds(query.getModelIds(), PARAM_INCLUDE_SUBTYPES);
typeStream = modelsFilter.stream().map(this::getModelTypes).flatMap(Collection::stream);
}
else if (query != null && query.getParentIds() != null)
{
validateListParam(query.getParentIds(), PARAM_PARENT_IDS);
typeList = query.getParentIds().parallelStream().map(this::getChildTypes).flatMap(Collection::parallelStream);
typeStream = query.getParentIds().stream().map(this::getChildTypes).flatMap(Collection::stream);
}
else
{
typeList = this.dictionaryService.getAllTypes().parallelStream();
typeStream = this.dictionaryService.getAllTypes().stream();
}
List<Type> allTypes = typeList.filter((qName) -> filterByNamespace(query, qName))
.map((qName) -> this.convertToType(dictionaryService.getType(qName)))
List<Type> allTypes = typeStream
.filter((qName) -> filterByNamespace(query, qName))
.filter(distinctByKey(QName::getPrefixString))
.map((qName) -> this.convertToType(dictionaryService.getType(qName), params.getInclude()))
.collect(Collectors.toList());
return createPagedResult(allTypes, paging);
}
@@ -125,32 +135,50 @@ public class TypesImpl extends AbstractClassImpl<Type> implements Types
if (typeDefinition == null)
throw new EntityNotFoundException(typeId);
return this.convertToType(typeDefinition);
return this.convertToType(typeDefinition, ALL_PROPERTIES);
}
public Type convertToType(TypeDefinition typeDefinition)
public Type convertToType(TypeDefinition typeDefinition, List<String> includes)
{
List<PropertyDefinition> properties = this.classDefinitionMapper.fromDictionaryClassDefinition(typeDefinition, dictionaryService).getProperties();
return new Type(typeDefinition, dictionaryService, properties);
try
{
Type type = new Type(typeDefinition, dictionaryService);
constructFromFilters(type, typeDefinition, includes);
return type;
}
catch (Exception ex)
{
throw new AlfrescoRuntimeException("Failed to parse Type: " + typeDefinition.getName() + ". " + ex.getMessage());
}
}
private Collection<QName> getModelTypes(String modelId)
private Collection<QName> getModelTypes(Pair<QName,Boolean> model)
{
ModelDefinition modelDefinition = null;
if (modelId == null)
throw new InvalidArgumentException("modelId is null");
try
{
modelDefinition = this.dictionaryService.getModel(QName.createQName(modelId, this.namespaceService));
modelDefinition = this.dictionaryService.getModel(model.getFirst());
}
catch (NamespaceException exception)
catch (Exception exception)
{
throw new InvalidArgumentException(exception.getMessage());
}
return this.dictionaryService.getTypes(modelDefinition.getName());
if (modelDefinition == null)
throw new EntityNotFoundException("model");
Collection<QName> types = this.dictionaryService.getTypes(modelDefinition.getName());
if (!model.getSecond()) // look for model types alone
return types;
Stream<QName> typeStream = types.stream();
Stream<QName> childrenStream = types.stream()
.map(type -> this.dictionaryService.getSubTypes(type, false))
.flatMap(Collection::stream);
return Stream.concat(typeStream, childrenStream).collect(Collectors.toList());
}
private Collection<QName> getChildTypes(String typeId)


@@ -26,18 +26,26 @@
package org.alfresco.rest.api.model;
import org.alfresco.service.cmr.dictionary.ModelDefinition;
import org.alfresco.service.cmr.dictionary.NamespaceDefinition;
import org.alfresco.service.cmr.i18n.MessageLookup;
import org.alfresco.service.namespace.QName;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Objects;
public abstract class AbstractClass extends ClassDefinition implements Comparable<AbstractClass>
{
String id;
String title;
String description;
String parentId;
protected String id;
protected String title;
protected String description;
protected String parentId;
protected Boolean isContainer = null;
protected Boolean isArchive = null;
protected Boolean includedInSupertypeQuery = null;
protected List<String> mandatoryAspects = null;
protected List<Association> associations = null;
protected Model model;
public String getId()
{
@@ -79,13 +87,64 @@ public abstract class AbstractClass extends ClassDefinition implements Comparabl
this.parentId = parentId;
}
<T> List<T> setList(List<T> sourceList)
public Model getModel()
{
if (sourceList == null)
{
return Collections.<T> emptyList();
}
return new ArrayList<>(sourceList);
return model;
}
public void setModel(Model model)
{
this.model = model;
}
public Boolean getIsContainer()
{
return isContainer;
}
public void setIsContainer(Boolean isContainer)
{
this.isContainer = isContainer;
}
public Boolean getIsArchive()
{
return isArchive;
}
public void setIsArchive(Boolean isArchive)
{
this.isArchive = isArchive;
}
public Boolean getIncludedInSupertypeQuery()
{
return includedInSupertypeQuery;
}
public void setIncludedInSupertypeQuery(Boolean includedInSupertypeQuery)
{
this.includedInSupertypeQuery = includedInSupertypeQuery;
}
public List<String> getMandatoryAspects()
{
return mandatoryAspects;
}
public void setMandatoryAspects(List<String> mandatoryAspects)
{
this.mandatoryAspects = mandatoryAspects;
}
public List<Association> getAssociations()
{
return associations;
}
public void setAssociations(List<Association> associations)
{
this.associations = associations;
}
String getParentNameAsString(QName parentQName)
@@ -97,13 +156,27 @@ public abstract class AbstractClass extends ClassDefinition implements Comparabl
return null;
}
Model getModelInfo(org.alfresco.service.cmr.dictionary.ClassDefinition classDefinition, MessageLookup messageLookup)
{
final ModelDefinition modelDefinition = classDefinition.getModel();
final String prefix = classDefinition.getName().toPrefixString().split(":")[0];
final NamespaceDefinition namespaceDefinition = modelDefinition.getNamespaces().stream()
.filter(definition -> definition.getPrefix().equals(prefix))
.findFirst()
.get();
final String modelId = modelDefinition.getName().toPrefixString();
final String author = modelDefinition.getAuthor();
final String description = modelDefinition.getDescription(messageLookup);
return new Model(modelId, author, description, namespaceDefinition.getUri(), namespaceDefinition.getPrefix());
}
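`getModelInfo` resolves the namespace by prefix with `findFirst().get()`, which throws a bare `NoSuchElementException` if no namespace carries the prefix. A self-contained sketch of the `orElseThrow` alternative that names the missing prefix (the `Namespace` record is a hypothetical stand-in for `NamespaceDefinition`):

```java
import java.util.List;
import java.util.NoSuchElementException;

public class PrefixLookupDemo
{
    record Namespace(String prefix, String uri) {}

    static Namespace byPrefix(List<Namespace> namespaces, String prefix)
    {
        return namespaces.stream()
                .filter(ns -> ns.prefix().equals(prefix))
                .findFirst()
                // Same failure mode as Optional.get(), but the message
                // identifies which prefix could not be resolved.
                .orElseThrow(() -> new NoSuchElementException(
                        "No namespace registered for prefix: " + prefix));
    }

    public static void main(String[] args)
    {
        List<Namespace> namespaces = List.of(
                new Namespace("cm", "http://www.alfresco.org/model/content/1.0"));
        System.out.println(byPrefix(namespaces, "cm").uri());
    }
}
```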
@Override
public int hashCode()
{
final int prime = 31;
int result = 1;
result = prime * result + ((this.id == null) ? 0 : this.id.hashCode());
return result;
return Objects.hash(id, title, description, parentId, properties, isContainer, isArchive, includedInSupertypeQuery, mandatoryAspects, associations, model);
}
@Override


@@ -29,21 +29,19 @@ package org.alfresco.rest.api.model;
import org.alfresco.service.cmr.dictionary.AspectDefinition;
import org.alfresco.service.cmr.i18n.MessageLookup;
import java.util.List;
public class Aspect extends AbstractClass
{
public Aspect()
{
}
public Aspect(AspectDefinition aspectDefinition, MessageLookup messageLookup, List<PropertyDefinition> properties)
public Aspect(AspectDefinition aspectDefinition, MessageLookup messageLookup)
{
this.id = aspectDefinition.getName().toPrefixString();
this.title = aspectDefinition.getTitle(messageLookup);
this.description = aspectDefinition.getDescription(messageLookup);
this.parentId = getParentNameAsString(aspectDefinition.getParentName());
this.properties = setList(properties);
this.model = getModelInfo(aspectDefinition, messageLookup);
}
@Override
@@ -55,6 +53,12 @@ public class Aspect extends AbstractClass
.append(", description=").append(this.description)
.append(", parentId=").append(parentId)
.append(", properties=").append(properties)
.append(", mandatoryAspects=").append(mandatoryAspects)
.append(", isContainer=").append(isContainer)
.append(", isArchive=").append(isArchive)
.append(", associations=").append(associations)
.append(", model=").append(model)
.append(", includedInSupertypeQuery=").append(includedInSupertypeQuery)
.append(']');
return builder.toString();
}


@@ -0,0 +1,158 @@
/*
* #%L
* Alfresco Remote API
* %%
* Copyright (C) 2005 - 2021 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.rest.api.model;
import java.util.Objects;
public class Association
{
private String id;
private String title;
private String description;
private Boolean isChild;
private Boolean isProtected;
private AssociationSource source = null;
private AssociationSource target = null;
public Association()
{
}
public Association(String id, String title, String description, Boolean isChild, Boolean isProtected, AssociationSource source, AssociationSource target)
{
this.id = id;
this.title = title;
this.description = description;
this.isChild = isChild;
this.isProtected = isProtected;
this.source = source;
this.target = target;
}
public String getId()
{
return id;
}
public void setId(String id)
{
this.id = id;
}
public String getTitle()
{
return title;
}
public void setTitle(String title)
{
this.title = title;
}
public String getDescription()
{
return description;
}
public void setDescription(String description)
{
this.description = description;
}
public Boolean getIsChild()
{
return isChild;
}
public void setIsChild(Boolean isChild)
{
this.isChild = isChild;
}
public Boolean getIsProtected()
{
return isProtected;
}
public void setIsProtected(Boolean isProtected)
{
this.isProtected = isProtected;
}
public AssociationSource getSource()
{
return source;
}
public void setSource(AssociationSource source)
{
this.source = source;
}
public AssociationSource getTarget()
{
return target;
}
public void setTarget(AssociationSource target)
{
this.target = target;
}
@Override
public boolean equals(Object obj)
{
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
Association other = (Association) obj;
return Objects.equals(id, other.getId()) &&
Objects.equals(title, other.getTitle()) &&
Objects.equals(description, other.getDescription()) &&
Objects.equals(isChild, other.getIsChild()) &&
Objects.equals(isProtected, other.getIsProtected()) &&
Objects.equals(source, other.getSource()) &&
Objects.equals(target, other.getTarget());
}
@Override
public String toString()
{
StringBuilder builder = new StringBuilder(512);
builder.append("Association [id=").append(this.id)
.append(", title=").append(this.title)
.append(", description=").append(this.description)
.append(", isChild=").append(isChild)
.append(", isProtected=").append(isProtected)
.append(", source=").append(source)
.append(", target=").append(target)
.append(']');
return builder.toString();
}
}
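The excerpt shows an `equals` override but no matching `hashCode`; the `equals`/`hashCode` contract requires both whenever instances may land in hash-based collections (or pass through `distinctByKey`-style filters). Assuming the full file does not already provide one, a minimal sketch of the pairing, with illustrative field names:

```java
import java.util.Objects;

public class EqualsHashCodeDemo
{
    // Minimal value class mirroring two Association fields.
    static final class Pair
    {
        final String id;
        final String title;

        Pair(String id, String title)
        {
            this.id = id;
            this.title = title;
        }

        @Override
        public boolean equals(Object obj)
        {
            if (this == obj)
                return true;
            if (obj == null || getClass() != obj.getClass())
                return false;
            Pair other = (Pair) obj;
            return Objects.equals(id, other.id) && Objects.equals(title, other.title);
        }

        // Required companion to equals: equal objects must hash equally,
        // or HashSet/HashMap lookups silently miss.
        @Override
        public int hashCode()
        {
            return Objects.hash(id, title);
        }
    }

    public static void main(String[] args)
    {
        Pair a = new Pair("cm:contains", "Contains");
        Pair b = new Pair("cm:contains", "Contains");
        System.out.println(a.equals(b) && a.hashCode() == b.hashCode()); // true
    }
}
```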


@@ -0,0 +1,129 @@
/*
* #%L
* Alfresco Remote API
* %%
* Copyright (C) 2005 - 2021 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.rest.api.model;
import java.util.Objects;
public class AssociationSource
{
private String role = null;
private String cls = null;
private Boolean isMany = null;
private Boolean isMandatory = null;
private Boolean isMandatoryEnforced = null;
public AssociationSource()
{
}
public AssociationSource(String role, String cls, Boolean isMany, Boolean isMandatory, Boolean isMandatoryEnforced)
{
this.role = role;
this.cls = cls;
this.isMany = isMany;
this.isMandatory = isMandatory;
this.isMandatoryEnforced = isMandatoryEnforced;
}
public String getRole()
{
return role;
}
public void setRole(String role)
{
this.role = role;
}
public String getCls()
{
return cls;
}
public void setCls(String cls)
{
this.cls = cls;
}
public Boolean getIsMany()
{
return isMany;
}
public void setIsMany(Boolean isMany)
{
this.isMany = isMany;
}
public Boolean getIsMandatory()
{
return isMandatory;
}
public void setIsMandatory(Boolean isMandatory)
{
this.isMandatory = isMandatory;
}
public Boolean getIsMandatoryEnforced()
{
return isMandatoryEnforced;
}
public void setIsMandatoryEnforced(Boolean isMandatoryEnforced)
{
this.isMandatoryEnforced = isMandatoryEnforced;
}
@Override
public boolean equals(Object obj)
{
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
AssociationSource other = (AssociationSource) obj;
return Objects.equals(role, other.getRole()) &&
Objects.equals(cls, other.getCls()) &&
Objects.equals(isMany, other.getIsMany()) &&
Objects.equals(isMandatory, other.getIsMandatory()) &&
Objects.equals(isMandatoryEnforced, other.getIsMandatoryEnforced());
}
@Override
public String toString()
{
StringBuilder builder = new StringBuilder(512);
builder.append("AssociationSource [role=").append(this.role)
.append(", cls=").append(this.cls)
.append(", isMany=").append(this.isMany)
.append(", isMandatory=").append(isMandatory)
.append(", isMandatoryEnforced=").append(isMandatoryEnforced)
.append(']');
return builder.toString();
}
}


@@ -0,0 +1,105 @@
/*
* #%L
* Alfresco Remote API
* %%
* Copyright (C) 2005 - 2021 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.rest.api.model;
public class Model implements Comparable<Model>
{
private String id;
private String author;
private String description;
private String namespaceUri;
private String namespacePrefix;
public Model()
{
}
public Model(String name, String author, String description, String namespaceUri, String namespacePrefix)
{
this.id = name;
this.author = author;
this.description = description;
this.namespaceUri = namespaceUri;
this.namespacePrefix = namespacePrefix;
}
public String getId()
{
return id;
}
public void setId(String id)
{
this.id = id;
}
public String getAuthor()
{
return author;
}
public void setAuthor(String author)
{
this.author = author;
}
public String getDescription()
{
return description;
}
public void setDescription(String description)
{
this.description = description;
}
public String getNamespaceUri()
{
return namespaceUri;
}
public void setNamespaceUri(String namespaceUri)
{
this.namespaceUri = namespaceUri;
}
public String getNamespacePrefix()
{
return namespacePrefix;
}
public void setNamespacePrefix(String namespacePrefix)
{
this.namespacePrefix = namespacePrefix;
}
@Override
public int compareTo(Model model)
{
return this.id.compareTo(model.getId());
}
}


@@ -29,21 +29,19 @@ package org.alfresco.rest.api.model;
import org.alfresco.service.cmr.dictionary.TypeDefinition;
import org.alfresco.service.cmr.i18n.MessageLookup;
import java.util.List;
public class Type extends AbstractClass
{
public Type()
{
}
public Type(TypeDefinition typeDefinition, MessageLookup messageLookup, List<PropertyDefinition> properties)
public Type(TypeDefinition typeDefinition, MessageLookup messageLookup)
{
this.id = typeDefinition.getName().toPrefixString();
this.title = typeDefinition.getTitle(messageLookup);
this.description = typeDefinition.getDescription(messageLookup);
this.parentId = getParentNameAsString(typeDefinition.getParentName());
this.properties = setList(properties);
this.model = getModelInfo(typeDefinition, messageLookup);
}
@Override
@@ -55,6 +53,12 @@ public class Type extends AbstractClass
.append(", description=").append(this.description)
.append(", parentId=").append(parentId)
.append(", properties=").append(properties)
.append(", mandatoryAspects=").append(mandatoryAspects)
.append(", isContainer=").append(isContainer)
.append(", isArchive=").append(isArchive)
.append(", associations=").append(associations)
.append(", model=").append(model)
.append(", includedInSupertypeQuery=").append(includedInSupertypeQuery)
.append(']');
return builder.toString();
}


@@ -25,12 +25,6 @@
*/
package org.alfresco.rest.api.search;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.alfresco.repo.search.impl.querymodel.impl.db.DBStats;
import org.alfresco.repo.search.impl.querymodel.impl.db.SingleTaskRestartableWatch;
import org.alfresco.rest.api.model.Node;
import org.alfresco.rest.api.search.context.SearchRequestContext;
import org.alfresco.rest.api.search.impl.ResultMapper;
@@ -51,14 +45,14 @@ import org.alfresco.service.cmr.search.SearchParameters;
import org.alfresco.service.cmr.search.SearchService;
import org.alfresco.util.ParameterCheck;
import org.alfresco.util.PropertyCheck;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.extensions.webscripts.AbstractWebScript;
import org.springframework.extensions.webscripts.WebScriptRequest;
import org.springframework.extensions.webscripts.WebScriptResponse;
import org.springframework.util.StopWatch;
import org.springframework.util.StopWatch.TaskInfo;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
/**
* An implementation of the {{baseUrl}}/{{networkId}}/public/search/versions/1/search endpoint
@@ -68,15 +62,12 @@ import org.springframework.util.StopWatch.TaskInfo;
public class SearchApiWebscript extends AbstractWebScript implements RecognizedParamsExtractor, RequestReader, ResponseWriter,
InitializingBean
{
protected static final Log logger = LogFactory.getLog(SearchApiWebscript.class);
private ServiceRegistry serviceRegistry;
private SearchService searchService;
private SearchMapper searchMapper;
private ResultMapper resultMapper;
protected ApiAssistant assistant;
protected ResourceWebScriptHelper helper;
private boolean statsEnabled;
@Override
public void afterPropertiesSet()
@@ -91,7 +82,6 @@ public class SearchApiWebscript extends AbstractWebScript implements RecognizedP
@Override
public void execute(WebScriptRequest webScriptRequest, WebScriptResponse webScriptResponse) throws IOException
{
StopWatch apiStopWatch = new StopWatch();
try {
// Turn JSON into a Java object representation
SearchQuery searchQuery = extractJsonContent(webScriptRequest, assistant.getJsonHelper(), SearchQuery.class);
@@ -106,43 +96,12 @@ public class SearchApiWebscript extends AbstractWebScript implements RecognizedP
SearchParameters searchParams = searchMapper.toSearchParameters(params, searchQuery, searchRequestContext);
//Call searchService
apiStopWatch.start("nodes");
ResultSet results = searchService.query(searchParams);
apiStopWatch.stop();
//Turn solr results into JSON
apiStopWatch.start("props");
CollectionWithPagingInfo<Node> resultJson = resultMapper.toCollectionWithPagingInfo(params, searchRequestContext, searchQuery, results);
//Post-process the request and pass in params, eg. params.getFilter()
Object toRender = helper.processAdditionsToTheResponse(null, null, null, params, resultJson);
apiStopWatch.stop();
// store execution stats in a special header if enabled
if (statsEnabled)
{
// store execution time in a special header
StringBuilder sb = new StringBuilder();
sb.append("api={");
sb.append("tot=").append(apiStopWatch.getTotalTimeMillis()).append("ms,");
addStopWatchStats(sb, apiStopWatch);
sb.append("}; ");
sb.append("db={");
addStopWatchStats(sb, DBStats.queryStopWatch());
sb.append("}; ");
sb.append("query={");
addStopWatchStats(sb, DBStats.handlerStopWatch());
sb.append(",");
addStopWatchStats(sb, DBStats.aclReadStopWatch());
sb.append(",");
addStopWatchStats(sb, DBStats.aclOwnerStopWatch());
sb.append("}");
webScriptResponse.addHeader("X-Response-Stats", sb.toString());
}
//Write response
setResponse(webScriptResponse, DEFAULT_SUCCESS);
@@ -153,44 +112,6 @@ public class SearchApiWebscript extends AbstractWebScript implements RecognizedP
}
}
private void addStopWatchStats(StringBuilder sb, StopWatch watch)
{
boolean first = true;
for (TaskInfo task : watch.getTaskInfo())
{
if (first)
{
first = false;
}
else
{
sb.append(",");
}
sb.append(task.getTaskName())
.append("=")
.append(task.getTimeMillis())
.append("ms");
int pc = Math.round(100 * task.getTimeNanos() / watch.getTotalTimeNanos());
sb.append("(")
.append(pc).append("%")
.append(")");
}
}
private void addStopWatchStats(StringBuilder sb, SingleTaskRestartableWatch watch)
{
long decimillis = (watch.getTotalTimeMicros()+5)/100;
double millis = decimillis/10.0;
sb.append(watch.getName())
.append("=")
.append(millis)
.append("ms");
}
/**
* Gets the Params object, parameters come from the SearchQuery json not the request
* @param webScriptRequest
@@ -243,10 +164,4 @@ public class SearchApiWebscript extends AbstractWebScript implements RecognizedP
{
this.helper = helper;
}
// receiving as a string because of known issue: https://jira.spring.io/browse/SPR-9989
public void setStatsEnabled(String enabled) {
this.statsEnabled = Boolean.valueOf(enabled);
logger.info("API stats header: " + (this.statsEnabled ? "enabled" : "disabled"));
}
}


@@ -2,7 +2,7 @@
* #%L
* Alfresco Remote API
* %%
* Copyright (C) 2005 - 2016 Alfresco Software Limited
* Copyright (C) 2005 - 2021 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
@@ -26,6 +26,8 @@
package org.alfresco.rest.api.search.impl;
import static java.util.Optional.empty;
import static java.util.Optional.of;
import static org.alfresco.rest.api.search.impl.StoreMapper.DELETED;
import static org.alfresco.rest.api.search.impl.StoreMapper.HISTORY;
import static org.alfresco.rest.api.search.impl.StoreMapper.LIVE_NODES;
@@ -42,9 +44,10 @@ import java.util.Map;
import java.util.Map.Entry;
import java.util.Optional;
import java.util.Set;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;
import org.alfresco.repo.search.impl.solr.SolrJSONResultSet;
import org.alfresco.repo.search.SearchEngineResultSet;
import org.alfresco.repo.search.impl.solr.facet.facetsresponse.GenericBucket;
import org.alfresco.repo.search.impl.solr.facet.facetsresponse.GenericFacetResponse;
import org.alfresco.repo.search.impl.solr.facet.facetsresponse.GenericFacetResponse.FACET_TYPE;
@@ -153,12 +156,10 @@ public class ResultMapper
*/
public CollectionWithPagingInfo<Node> toCollectionWithPagingInfo(Params params, SearchRequestContext searchRequestContext, SearchQuery searchQuery, ResultSet results)
{
SearchContext context = null;
Integer total = null;
List<Node> noderesults = new ArrayList<Node>();
List<Node> noderesults = new ArrayList<>();
Map<String, UserInfo> mapUserInfo = new HashMap<>(10);
Map<NodeRef, List<Pair<String, List<String>>>> hightLighting = results.getHighlighting();
int notFound = 0;
Map<NodeRef, List<Pair<String, List<String>>>> highLighting = results.getHighlighting();
final AtomicInteger unknownNodeRefsCount = new AtomicInteger();
boolean isHistory = searchRequestContext.getStores().contains(StoreMapper.HISTORY);
for (ResultSetRow row:results)
@@ -169,7 +170,7 @@ public class ResultMapper
{
float f = row.getScore();
List<HighlightEntry> highlightEntries = null;
List<Pair<String, List<String>>> high = hightLighting.get(row.getNodeRef());
List<Pair<String, List<String>>> high = highLighting.get(row.getNodeRef());
if (high != null && !high.isEmpty())
{
@@ -185,26 +186,21 @@ public class ResultMapper
else
{
logger.debug("Unknown noderef returned from search results "+row.getNodeRef());
notFound++;
unknownNodeRefsCount.incrementAndGet();
}
}
SolrJSONResultSet solrResultSet = findSolrResultSet(results);
SearchContext context =
toSearchEngineResultSet(results)
.map(resultSet -> toSearchContext(resultSet, searchRequestContext, searchQuery))
.orElse(null);
if (solrResultSet != null)
{
//We used Solr for this query
context = toSearchContext(solrResultSet, searchRequestContext, searchQuery, notFound);
}
total = setTotal(results);
return CollectionWithPagingInfo.asPaged(params.getPaging(), noderesults, results.hasMore(), total, null, context);
return CollectionWithPagingInfo.asPaged(params.getPaging(), noderesults, results.hasMore(), setTotal(results), null, context);
}
/**
* Builds a node representation based on a ResultSetRow.
* @param searchRequestContext
*
* @param aRow
* @param params
* @param mapUserInfo
@@ -285,14 +281,14 @@ public class ResultMapper
/**
* Uses the results from Solr to set the Search Context
* @param SolrJSONResultSet
*
* @param searchQuery
* @return SearchContext
*/
public SearchContext toSearchContext(SolrJSONResultSet solrResultSet, SearchRequestContext searchRequestContext, SearchQuery searchQuery, int notFound)
public SearchContext toSearchContext(SearchEngineResultSet resultSet, SearchRequestContext searchRequestContext, SearchQuery searchQuery)
{
SearchContext context = null;
Map<String, Integer> facetQueries = solrResultSet.getFacetQueries();
Map<String, Integer> facetQueries = resultSet.getFacetQueries();
List<GenericFacetResponse> facets = new ArrayList<>();
List<FacetQueryContext> facetResults = null;
SpellCheckContext spellCheckContext = null;
@@ -330,7 +326,7 @@ public class ResultMapper
}
//Field Facets
Map<String, List<Pair<String, Integer>>> facetFields = solrResultSet.getFieldFacets();
Map<String, List<Pair<String, Integer>>> facetFields = resultSet.getFieldFacets();
if(FacetFormat.V2 == searchQuery.getFacetFormat())
{
facets.addAll(getFacetBucketsForFacetFieldsAsFacets(facetFields, searchQuery));
@@ -340,28 +336,29 @@ public class ResultMapper
ffcs.addAll(getFacetBucketsForFacetFields(facetFields, searchQuery));
}
Map<String, List<Pair<String, Integer>>> facetInterval = solrResultSet.getFacetIntervals();
Map<String, List<Pair<String, Integer>>> facetInterval = resultSet.getFacetIntervals();
facets.addAll(getGenericFacetsForIntervals(facetInterval, searchQuery));
Map<String,List<Map<String,String>>> facetRanges = solrResultSet.getFacetRanges();
Map<String,List<Map<String,String>>> facetRanges = resultSet.getFacetRanges();
facets.addAll(RangeResultMapper.getGenericFacetsForRanges(facetRanges, searchQuery.getFacetRanges()));
List<GenericFacetResponse> stats = getFieldStats(searchRequestContext, solrResultSet.getStats());
List<GenericFacetResponse> pimped = getPivots(searchRequestContext, solrResultSet.getPivotFacets(), stats);
List<GenericFacetResponse> stats = getFieldStats(searchRequestContext, resultSet.getStats());
List<GenericFacetResponse> pimped = getPivots(searchRequestContext, resultSet.getPivotFacets(), stats);
facets.addAll(pimped);
facets.addAll(stats);
//Spelling
SpellCheckResult spell = solrResultSet.getSpellCheckResult();
SpellCheckResult spell = resultSet.getSpellCheckResult();
if (spell != null && spell.getResultName() != null && !spell.getResults().isEmpty())
{
spellCheckContext = new SpellCheckContext(spell.getResultName(),spell.getResults());
}
//Put it all together
context = new SearchContext(solrResultSet.getLastIndexedTxId(), facets, facetResults, ffcs, spellCheckContext, searchRequestContext.includeRequest()?searchQuery:null);
context = new SearchContext(resultSet.getLastIndexedTxId(), facets, facetResults, ffcs, spellCheckContext, searchRequestContext.includeRequest()?searchQuery:null);
return isNullContext(context)?null:context;
}
public static boolean hasGroup(SearchQuery searchQuery)
{
if(searchQuery != null && searchQuery.getFacetQueries() != null)
@@ -618,26 +615,32 @@ public class ResultMapper
}
/**
* Gets SolrJSONResultSet class if there is one.
* @param results
* @return
* Checks whether the input {@link ResultSet}, or one of the {@link ResultSet}s it wraps,
* is an instance of {@link SearchEngineResultSet}.
* Since some concrete ResultSet implementations follow the decorator pattern, the code
* assumes (in those cases) a nested structure with a maximum of 3 levels.
* The code could probably be generalised to scan a decorator chain of unlimited depth,
* but that would require a change to the ResultSet interface.
*/
protected SolrJSONResultSet findSolrResultSet(ResultSet results)
protected Optional<SearchEngineResultSet> toSearchEngineResultSet(ResultSet results)
{
ResultSet theResultSet = results;
if (results instanceof FilteringResultSet)
{
theResultSet = ((FilteringResultSet) results).getUnFilteredResultSet();
// 1st level
results = ((FilteringResultSet) results).getUnFilteredResultSet();
// 2nd level
if (results instanceof FilteringResultSet)
{
results = ((FilteringResultSet) results).getUnFilteredResultSet();
}
}
if (theResultSet instanceof SolrJSONResultSet)
{
return (SolrJSONResultSet) theResultSet;
}
return null;
return results instanceof SearchEngineResultSet
? of(results).map(SearchEngineResultSet.class::cast)
: empty();
}
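The Javadoc above notes that the two-level unwrap could be generalised to a decorator chain of unlimited depth. A minimal sketch of that idea, using simplified stand-in interfaces (the real Alfresco `ResultSet` types have many more members; only the wrapping behaviour is modelled here):

```java
import java.util.Optional;

// Simplified stand-ins for the Alfresco result-set types (assumptions for this sketch).
interface ResultSet {}
interface SearchEngineResultSet extends ResultSet {}

class FilteringResultSet implements ResultSet {
    private final ResultSet wrapped;
    FilteringResultSet(ResultSet wrapped) { this.wrapped = wrapped; }
    ResultSet getUnFilteredResultSet() { return wrapped; }
}

public class UnwrapSketch {
    // Walks a FilteringResultSet decorator chain of any depth,
    // instead of the fixed two levels in the method above.
    static Optional<SearchEngineResultSet> toSearchEngineResultSet(ResultSet results) {
        while (results instanceof FilteringResultSet) {
            results = ((FilteringResultSet) results).getUnFilteredResultSet();
        }
        return results instanceof SearchEngineResultSet
                ? Optional.of((SearchEngineResultSet) results)
                : Optional.empty();
    }

    public static void main(String[] args) {
        ResultSet deeplyWrapped = new FilteringResultSet(
                new FilteringResultSet(new FilteringResultSet(new SearchEngineResultSet() {})));
        System.out.println(toSearchEngineResultSet(deeplyWrapped).isPresent()); // true
    }
}
```

The loop terminates because each `FilteringResultSet` strictly unwraps one layer; as the Javadoc says, doing this in the real code would require exposing the wrapped instance on the `ResultSet` interface itself.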
public CollectionWithPagingInfo<TupleList> toCollectionWithPagingInfo(JSONArray docs, SearchSQLQuery searchQuery) throws JSONException
{
if(docs == null )


@@ -31,6 +31,7 @@ import java.util.Properties;
import javax.servlet.ServletContext;
import org.alfresco.httpclient.HttpClientFactory.SecureCommsType;
import org.alfresco.web.scripts.servlet.X509ServletFilterBase;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
@@ -70,7 +71,9 @@ public class AlfrescoX509ServletFilter extends X509ServletFilterBase
* Return true or false based on the property. This will switch on/off X509 enforcement in the X509ServletFilterBase.
*/
if (prop == null || "none".equals(prop))
if (prop == null ||
SecureCommsType.getType(prop) == SecureCommsType.NONE ||
SecureCommsType.getType(prop) == SecureCommsType.SECRET)
{
return false;
}


@@ -23,7 +23,4 @@
# See issue REPO-2575 for details.
alfresco.restApi.basicAuthScheme=false
# REPO-4388 allow CORS headers in transaction response
webscripts.transaction.preserveHeadersPattern=Access-Control-.*
# REPO-5371 enable stats header in API response (only search atm)
webscripts.stats.enabled=false
webscripts.transaction.preserveHeadersPattern=Access-Control-.*


@@ -1024,7 +1024,6 @@
<property name="helper" ref="webscriptHelper" />
<property name="resultMapper" ref="searchapiResultMapper" />
<property name="searchMapper" ref="searchapiSearchMapper" />
<property name="statsEnabled" value="${webscripts.stats.enabled}" />
</bean>
<bean id="webscript.org.alfresco.api.SearchSQLApiWebscript.post"


@@ -3,7 +3,7 @@ communitysummary.system-information=Informaci\u00f3n del sistema
communitysummary.system-information.free-memory=Memoria libre (GB)
communitysummary.system-information.maximum-memory=Memoria m\u00e1xima (GB)
communitysummary.system-information.total-memory=Memoria total (GB)
communitysummary.system-information.cpus=UPCs
communitysummary.system-information.cpus=CPUs
communitysummary.system-information.java-home=Inicio de Java
communitysummary.system-information.java-version=Versi\u00f3n de Java


@@ -4,31 +4,71 @@
<#macro dateFormat date>${date?string("dd MMM yyyy HH:mm:ss 'GMT'Z '('zzz')'")}</#macro>
<#macro propValue p>
<#if p.value??>
<#if p.value?is_date>
<@dateFormat p.value />
<#elseif p.value?is_boolean>
${p.value?string}
<#elseif p.value?is_number>
${p.value?c}
<#elseif p.value?is_string>
${p.value?html}
<#elseif p.value?is_hash>
<#assign result = "{"/>
<#assign first = true />
<#list p.value?keys as key>
<#if first = false>
<#assign result = result + ", "/>
<#attempt>
<#if p.value??>
<#if p.value?is_date>
<@dateFormat p.value />
<#elseif p.value?is_boolean>
${p.value?string}
<#elseif p.value?is_number>
${p.value?c}
<#elseif p.value?is_string>
${p.value?html}
<#elseif p.value?is_hash || p.value?is_enumerable>
<@convertToJSON p.value />
</#if>
<#else>
${null}
</#if>
<#recover>
<span style="color:red">${.error}</span>
</#attempt>
</#macro>
<#macro convertToJSON v>
<#attempt>
<#if v??>
<#if v?is_date>
<@dateFormat v />
<#elseif v?is_boolean>
${v?string}
<#elseif v?is_number>
${v?c}
<#elseif v?is_string>
"${v?string}"
<#elseif v?is_hash>
<@compress single_line=true>
{
<#assign first = true />
<#list v?keys as key>
<#if first = false>,</#if>
"${key}":
<#if v[key]??>
<@convertToJSON v[key] />
<#else>
${null}
</#if>
<#assign first = false/>
</#list>
}
</@compress>
<#elseif v?is_enumerable>
<#assign first = true />
<@compress single_line=true>
[
<#list v as item>
<#if first = false>,</#if>
<@convertToJSON item />
<#assign first = false/>
</#list>
]
</@compress>
</#if>
<#assign result = result + "${key}=${p.value[key]?html}" />
<#assign first = false/>
</#list>
<#assign result = result + "}"/>
${result}
</#if>
<#else>
${null}
</#if>
<#else>
${null}
</#if>
<#recover>
<span style="color:red">${.error}</span>
</#attempt>
</#macro>
<#macro contentUrl nodeRef prop>
${url.serviceContext}/api/node/${nodeRef?replace("://","/")}/content;${prop?url}
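The recursive `convertToJSON` macro above follows the usual recursive-serialiser shape: hashes become JSON objects, sequences become arrays, strings are quoted, other scalars are emitted as-is. The same idea as a minimal Java sketch (not the template's actual implementation, and without the escaping a real serialiser would need):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.StringJoiner;

public class ToJsonSketch {
    // Recursively serialises maps as objects and iterables as arrays,
    // mirroring the hash/sequence branches of the FreeMarker macro.
    static String toJson(Object v) {
        if (v == null) return "null";
        if (v instanceof String) return "\"" + v + "\"";
        if (v instanceof Number || v instanceof Boolean) return String.valueOf(v);
        if (v instanceof Map<?, ?> map) {
            StringJoiner sj = new StringJoiner(",", "{", "}");
            map.forEach((k, val) -> sj.add("\"" + k + "\":" + toJson(val)));
            return sj.toString();
        }
        if (v instanceof Iterable<?> it) {
            StringJoiner sj = new StringJoiner(",", "[", "]");
            it.forEach(item -> sj.add(toJson(item)));
            return sj.toString();
        }
        return "\"" + v + "\""; // fallback: stringify like the template's ?string branch
    }

    public static void main(String[] args) {
        Map<String, Object> props = new LinkedHashMap<>();
        props.put("name", "doc1");
        props.put("versions", List.of(1, 2));
        System.out.println(toJson(props)); // {"name":"doc1","versions":[1,2]}
    }
}
```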


@@ -66,6 +66,8 @@
<bean id="SOLRAuthenticationFilter" class="org.alfresco.repo.web.scripts.solr.SOLRAuthenticationFilter">
<property name="secureComms" value="${solr.secureComms}"/>
<property name="sharedSecret" value="${solr.sharedSecret}"/>
<property name="sharedSecretHeader" value="${solr.sharedSecret.header}"/>
</bean>
<bean id="WebscriptAuthenticationFilter" class="org.alfresco.repo.management.subsystems.ChainingSubsystemProxyFactory">


@@ -39,6 +39,7 @@ import org.junit.runners.Suite;
org.alfresco.repo.web.scripts.workflow.WorkflowModelBuilderTest.class,
org.alfresco.repo.web.scripts.solr.StatsGetTest.class,
org.alfresco.repo.web.scripts.solr.SOLRSerializerTest.class,
org.alfresco.repo.web.scripts.solr.SOLRAuthenticationFilterTest.class,
org.alfresco.repo.web.util.PagingCursorTest.class,
org.alfresco.repo.web.util.paging.PagingTest.class,
org.alfresco.repo.webdav.GetMethodTest.class,


@@ -0,0 +1,176 @@
/*
* #%L
* Alfresco Remote API
* %%
* Copyright (C) 2005 - 2021 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.web.scripts.solr;
import org.alfresco.error.AlfrescoRuntimeException;
import org.junit.Test;
import org.mockito.Mockito;
import org.springframework.mock.web.MockHttpServletRequest;
import org.springframework.mock.web.MockHttpServletResponse;
import javax.servlet.FilterChain;
import javax.servlet.ServletContext;
import javax.servlet.ServletRequest;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import static org.junit.Assert.assertEquals;
public class SOLRAuthenticationFilterTest
{
@Test(expected = AlfrescoRuntimeException.class)
public void testSharedSecretNotConfigured() throws Exception
{
SOLRAuthenticationFilter filter = new SOLRAuthenticationFilter();
filter.setSecureComms(SOLRAuthenticationFilter.SecureCommsType.SECRET.name());
filter.afterPropertiesSet();
}
@Test(expected = AlfrescoRuntimeException.class)
public void testSharedHeaderNotConfigured() throws Exception
{
SOLRAuthenticationFilter filter = new SOLRAuthenticationFilter();
filter.setSecureComms(SOLRAuthenticationFilter.SecureCommsType.SECRET.name());
filter.setSharedSecret("shared-secret");
filter.setSharedSecretHeader("");
filter.afterPropertiesSet();
}
@Test
public void testHTTPSFilterAndSharedSecretSet() throws Exception
{
String headerKey = "test-header";
String sharedSecret = "shared-secret";
SOLRAuthenticationFilter filter = new SOLRAuthenticationFilter();
filter.setSecureComms(SOLRAuthenticationFilter.SecureCommsType.HTTPS.name());
filter.setSharedSecret(sharedSecret);
filter.setSharedSecretHeader(headerKey);
filter.afterPropertiesSet();
HttpServletRequest request = Mockito.mock(HttpServletRequest.class);
HttpServletResponse response = Mockito.mock(HttpServletResponse.class);
Mockito.when(request.getHeader(headerKey)).thenReturn(sharedSecret);
Mockito.when(request.isSecure()).thenReturn(true);
FilterChain chain = Mockito.mock(FilterChain.class);
filter.doFilter(Mockito.mock(ServletContext.class), request, response, chain);
Mockito.verify(chain, Mockito.times(1)).doFilter(request, response);
}
@Test(expected = AlfrescoRuntimeException.class)
public void testHTTPSFilterAndInsecureRequest() throws Exception
{
SOLRAuthenticationFilter filter = new SOLRAuthenticationFilter();
filter.setSecureComms(SOLRAuthenticationFilter.SecureCommsType.HTTPS.name());
filter.afterPropertiesSet();
HttpServletRequest request = Mockito.mock(HttpServletRequest.class);
HttpServletResponse response = Mockito.mock(HttpServletResponse.class);
Mockito.when(request.isSecure()).thenReturn(false);
FilterChain chain = Mockito.mock(FilterChain.class);
filter.doFilter(Mockito.mock(ServletContext.class), request, response, chain);
}
@Test
public void testNoAuthentication() throws Exception
{
SOLRAuthenticationFilter filter = new SOLRAuthenticationFilter();
filter.setSecureComms(SOLRAuthenticationFilter.SecureCommsType.NONE.name());
filter.afterPropertiesSet();
HttpServletRequest request = Mockito.mock(HttpServletRequest.class);
HttpServletResponse response = Mockito.mock(HttpServletResponse.class);
FilterChain chain = Mockito.mock(FilterChain.class);
filter.doFilter(Mockito.mock(ServletContext.class), request, response, chain);
Mockito.verify(chain, Mockito.times(1)).doFilter(request, response);
}
@Test
public void testSharedSecretFilter() throws Exception
{
String headerKey = "test-header";
String sharedSecret = "shared-secret";
SOLRAuthenticationFilter filter = new SOLRAuthenticationFilter();
filter.setSecureComms(SOLRAuthenticationFilter.SecureCommsType.SECRET.name());
filter.setSharedSecret(sharedSecret);
filter.setSharedSecretHeader(headerKey);
filter.afterPropertiesSet();
HttpServletRequest request = Mockito.mock(HttpServletRequest.class);
HttpServletResponse response = Mockito.mock(HttpServletResponse.class);
Mockito.when(request.getHeader(headerKey)).thenReturn(sharedSecret);
FilterChain chain = Mockito.mock(FilterChain.class);
filter.doFilter(Mockito.mock(ServletContext.class), request, response, chain);
Mockito.verify(chain, Mockito.times(1)).doFilter(request, response);
}
@Test
public void testSharedSecretDontMatch() throws Exception
{
String headerKey = "test-header";
String sharedSecret = "shared-secret";
SOLRAuthenticationFilter filter = new SOLRAuthenticationFilter();
filter.setSecureComms(SOLRAuthenticationFilter.SecureCommsType.SECRET.name());
filter.setSharedSecret(sharedSecret);
filter.setSharedSecretHeader(headerKey);
filter.afterPropertiesSet();
HttpServletRequest request = Mockito.mock(HttpServletRequest.class);
HttpServletResponse response = Mockito.mock(HttpServletResponse.class);
Mockito.when(request.getHeader(headerKey)).thenReturn("wrong-secret");
FilterChain chain = Mockito.mock(FilterChain.class);
filter.doFilter(Mockito.mock(ServletContext.class), request, response, chain);
Mockito.verify(chain, Mockito.times(0)).doFilter(request, response);
Mockito.verify(response).sendError(Mockito.eq(HttpServletResponse.SC_FORBIDDEN), Mockito.anyString());
}
@Test
public void testSharedHeaderNotPresent() throws Exception
{
String headerKey = "test-header";
String sharedSecret = "shared-secret";
SOLRAuthenticationFilter filter = new SOLRAuthenticationFilter();
filter.setSecureComms(SOLRAuthenticationFilter.SecureCommsType.SECRET.name());
filter.setSharedSecret(sharedSecret);
filter.setSharedSecretHeader(headerKey);
filter.afterPropertiesSet();
HttpServletRequest request = Mockito.mock(HttpServletRequest.class);
HttpServletResponse response = Mockito.mock(HttpServletResponse.class);
FilterChain chain = Mockito.mock(FilterChain.class);
filter.doFilter(Mockito.mock(ServletContext.class), request, response, chain);
Mockito.verify(chain, Mockito.times(0)).doFilter(request, response);
Mockito.verify(response).sendError(Mockito.eq(HttpServletResponse.SC_FORBIDDEN), Mockito.anyString());
}
}
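The shared-secret tests above exercise a simple contract: the request is authorised only when the configured header carries exactly the configured secret, and a mismatch or missing header yields a 403. A minimal sketch of that check (assumed behaviour inferred from the tests, not the filter's actual source):

```java
public class SharedSecretCheck {
    // Authorised only when the header value equals the configured shared secret;
    // a null (absent) header or a wrong value fails, as the tests expect.
    static boolean isAuthorized(String headerValue, String sharedSecret) {
        return sharedSecret != null && sharedSecret.equals(headerValue);
    }

    public static void main(String[] args) {
        System.out.println(SharedSecretCheck.isAuthorized("shared-secret", "shared-secret")); // true
        System.out.println(SharedSecretCheck.isAuthorized("wrong-secret", "shared-secret"));  // false
    }
}
```

Comparing `sharedSecret.equals(headerValue)` rather than the reverse keeps the check null-safe for an absent header, matching the `testSharedHeaderNotPresent` case.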


@@ -52,6 +52,7 @@ import java.util.stream.Collectors;
import java.util.stream.Stream;
import org.alfresco.repo.search.EmptyResultSet;
import org.alfresco.repo.search.SearchEngineResultSet;
import org.alfresco.repo.search.impl.solr.SolrJSONResultSet;
import org.alfresco.repo.search.impl.solr.facet.facetsresponse.GenericBucket;
import org.alfresco.repo.search.impl.solr.facet.facetsresponse.GenericFacetResponse;
@@ -303,7 +304,7 @@ public class ResultMapperTests
SearchQuery searchQuery = helper.searchQueryFromJson();
SearchRequestContext searchRequest = SearchRequestContext.from(searchQuery);
SearchParameters searchParams = searchMapper.toSearchParameters(EMPTY_PARAMS, searchQuery, searchRequest);
SearchContext searchContext = mapper.toSearchContext((SolrJSONResultSet) results, searchRequest, searchQuery, 0);
SearchContext searchContext = mapper.toSearchContext((SearchEngineResultSet) results, searchRequest, searchQuery);
assertEquals(34l, searchContext.getConsistency().getlastTxId());
assertEquals(6, searchContext.getFacetQueries().size());
assertEquals(0,searchContext.getFacetQueries().get(0).getCount());
@@ -437,7 +438,7 @@ public class ResultMapperTests
SearchQuery searchQuery = helper.searchQueryFromJson();
SearchRequestContext searchRequest = SearchRequestContext.from(searchQuery);
SearchParameters searchParams = searchMapper.toSearchParameters(EMPTY_PARAMS, searchQuery, searchRequest);
SearchContext searchContext = mapper.toSearchContext((SolrJSONResultSet) results, searchRequest, searchQuery, 0);
SearchContext searchContext = mapper.toSearchContext((SearchEngineResultSet) results, searchRequest, searchQuery);
//Facet intervals
List<GenericFacetResponse> intervalFacets = searchContext.getFacets().stream()
@@ -477,7 +478,7 @@ public class ResultMapperTests
SearchQuery searchQuery = helper.searchQueryFromJson();
SearchRequestContext searchRequest = SearchRequestContext.from(searchQuery);
SearchParameters searchParams = searchMapper.toSearchParameters(EMPTY_PARAMS, searchQuery, searchRequest);
SearchContext searchContext = mapper.toSearchContext((SolrJSONResultSet) results, searchRequest, searchQuery, 0);
SearchContext searchContext = mapper.toSearchContext((SearchEngineResultSet) results, searchRequest, searchQuery);
//Numeric facet range
List<GenericFacetResponse> rangeFacets = searchContext.getFacets().stream()
@@ -531,7 +532,7 @@ public class ResultMapperTests
SearchQuery searchQuery = helper.extractFromJson(updatedJSON);
SearchRequestContext searchRequest = SearchRequestContext.from(searchQuery);
SearchParameters searchParams = searchMapper.toSearchParameters(EMPTY_PARAMS, searchQuery, searchRequest);
SearchContext searchContext = mapper.toSearchContext((SolrJSONResultSet) results, searchRequest, searchQuery, 0);
SearchContext searchContext = mapper.toSearchContext((SearchEngineResultSet) results, searchRequest, searchQuery);
//Numeric facet range
List<GenericFacetResponse> rangeFacets = searchContext.getFacets().stream()
@@ -575,7 +576,7 @@ public class ResultMapperTests
ResultSet results = mockResultset(expectedResponse);
SearchQuery searchQuery = helper.extractFromJson(jsonQuery);
SearchRequestContext searchRequest = SearchRequestContext.from(searchQuery);
SearchContext searchContext = mapper.toSearchContext((SolrJSONResultSet) results, searchRequest, searchQuery, 0);
SearchContext searchContext = mapper.toSearchContext((SearchEngineResultSet) results, searchRequest, searchQuery);
assertEquals(34l, searchContext.getConsistency().getlastTxId());
assertEquals(null, searchContext.getFacetQueries());
assertEquals(1, searchContext.getFacets().size());
@@ -610,7 +611,7 @@ public class ResultMapperTests
ResultSet results = mockResultset(expectedResponse);
SearchQuery searchQuery = helper.extractFromJson(jsonQuery);
SearchRequestContext searchRequest = SearchRequestContext.from(searchQuery);
SearchContext searchContext = mapper.toSearchContext((SolrJSONResultSet) results, searchRequest, searchQuery, 0);
SearchContext searchContext = mapper.toSearchContext((SearchEngineResultSet) results, searchRequest, searchQuery);
assertEquals(34l, searchContext.getConsistency().getlastTxId());
assertEquals(null, searchContext.getFacetQueries());
assertEquals(2, searchContext.getFacets().size());
@@ -648,7 +649,7 @@ public class ResultMapperTests
ResultSet results = mockResultset(expectedResponse);
SearchQuery searchQuery = helper.extractFromJson(jsonQuery);
SearchRequestContext searchRequest = SearchRequestContext.from(searchQuery);
SearchContext searchContext = mapper.toSearchContext((SolrJSONResultSet) results, searchRequest, searchQuery, 0);
SearchContext searchContext = mapper.toSearchContext((SearchEngineResultSet) results, searchRequest, searchQuery);
assertEquals(34l, searchContext.getConsistency().getlastTxId());
assertTrue(searchContext.getFacets().isEmpty());
assertEquals(3,searchContext.getFacetQueries().size());
@@ -722,7 +723,7 @@ public class ResultMapperTests
ResultSet results = mockResultset(expectedResponse);
SearchQuery searchQuery = helper.extractFromJson(jsonQuery);
SearchRequestContext searchRequest = SearchRequestContext.from(searchQuery);
SearchContext searchContext = mapper.toSearchContext((SolrJSONResultSet) results, searchRequest, searchQuery, 0);
SearchContext searchContext = mapper.toSearchContext((SearchEngineResultSet) results, searchRequest, searchQuery);
assertEquals(34l, searchContext.getConsistency().getlastTxId());
assertEquals(null, searchContext.getFacetQueries());
assertEquals(1, searchContext.getFacets().size());
@@ -738,7 +739,7 @@ public class ResultMapperTests
searchQuery = helper.extractFromJson(jsonQuery);
results = mockResultset(expectedResponse);
searchRequest = SearchRequestContext.from(searchQuery);
searchContext = mapper.toSearchContext((SolrJSONResultSet) results, searchRequest, searchQuery, 0);
searchContext = mapper.toSearchContext((SearchEngineResultSet) results, searchRequest, searchQuery);
assertEquals(34l, searchContext.getConsistency().getlastTxId());
assertEquals(3,searchContext.getFacetQueries().size());
assertEquals("small",searchContext.getFacetQueries().get(0).getLabel());
@@ -759,7 +760,7 @@ public class ResultMapperTests
+ "\"processedDenies\":true, \"lastIndexedTx\":34}";
results = mockResultset(expectedResponse);
searchQuery = helper.extractFromJson(jsonQuery);
searchContext = mapper.toSearchContext((SolrJSONResultSet) results, searchRequest, searchQuery, 0);
searchContext = mapper.toSearchContext((SearchEngineResultSet) results, searchRequest, searchQuery);
assertFalse(searchContext.getFacetsFields().isEmpty());
assertTrue(searchContext.getFacets().isEmpty());
assertEquals("creator",searchContext.getFacetsFields().get(0).getLabel());
@@ -770,7 +771,7 @@ public class ResultMapperTests
assertEquals("modifier",searchContext.getFacetsFields().get(1).getLabel());
jsonQuery = jsonQuery.replace("V1", "V2");
searchQuery = helper.extractFromJson(jsonQuery);
searchContext = mapper.toSearchContext((SolrJSONResultSet) results, searchRequest, searchQuery, 0);
searchContext = mapper.toSearchContext((SearchEngineResultSet) results, searchRequest, searchQuery);
assertTrue(searchContext.getFacetsFields().isEmpty());
assertFalse(searchContext.getFacets().isEmpty());
assertEquals("creator",searchContext.getFacets().get(0).getLabel());
@@ -835,7 +836,7 @@ public class ResultMapperTests
ResultSet results = mockResultset(expectedResponse);
SearchQuery searchQuery = helper.extractFromJson(jsonQuery);
SearchRequestContext searchRequest = SearchRequestContext.from(searchQuery);
SearchContext searchContext = mapper.toSearchContext((SolrJSONResultSet) results, searchRequest, searchQuery, 0);
SearchContext searchContext = mapper.toSearchContext((SearchEngineResultSet) results, searchRequest, searchQuery);
assertEquals(34l, searchContext.getConsistency().getlastTxId());
assertEquals(null, searchContext.getFacetQueries());
assertEquals(3, searchContext.getFacets().size());


@@ -0,0 +1,289 @@
/*
* #%L
* Alfresco Remote API
* %%
* Copyright (C) 2005 - 2021 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.rest.api.tests;
import com.google.common.collect.ImmutableList;
import org.alfresco.rest.api.model.Association;
import org.alfresco.rest.api.model.AssociationSource;
import org.alfresco.rest.api.model.Model;
import org.alfresco.rest.api.tests.client.PublicApiClient;
import org.alfresco.rest.api.tests.client.data.Aspect;
import org.alfresco.rest.api.tests.client.data.Type;
import org.junit.Before;
import java.util.HashMap;
import java.util.Map;
import java.util.Collections;
import java.util.Arrays;
import java.util.List;
public class BaseModelApiTest extends AbstractBaseApiTest
{
PublicApiClient.ListResponse<Aspect> aspects = null;
Aspect aspect = null, childAspect = null, smartFilterAspect = null,
rescanAspect = null, testAspect = null, testAllAspect = null;
PublicApiClient.ListResponse<Type> types = null;
Type type = null, whitePaperType = null, docType = null, publishableType = null, apiBaseType = null,
apiFileType = null, apiFileDerivedType = null, apiForcedType = null, apiFileDerivedNoArchiveType = null,
apiFolderType = null, apiOverrideType = null, apiOverride2Type = null, apiOverride3Type = null, apiNamedPropConstraintType = null;
List<Type> allTypes = null;
PublicApiClient.Paging paging = getPaging(0, 10);
Map<String, String> otherParams = new HashMap<>();
@Before
public void setup() throws Exception
{
super.setup();
Model myCompanyModel = new Model();
myCompanyModel.setAuthor("Administrator");
myCompanyModel.setId("mycompany:model");
myCompanyModel.setNamespaceUri("http://www.mycompany.com/model/finance/1.0");
myCompanyModel.setNamespacePrefix("mycompany");
Model scanModel = new Model();
scanModel.setAuthor("Administrator");
scanModel.setId("test:scan");
scanModel.setNamespaceUri("http://www.test.com/model/account/1.0");
scanModel.setNamespacePrefix("test");
testAspect = new org.alfresco.rest.api.tests.client.data.Aspect();
testAspect.setId("mycompany:testAspect");
testAspect.setTitle("Test Aspect");
testAspect.setModel(myCompanyModel);
testAspect.setIsContainer(false);
testAspect.setIncludedInSupertypeQuery(true);
testAspect.setIsArchive(true);
childAspect = new org.alfresco.rest.api.tests.client.data.Aspect();
childAspect.setId("mycompany:childAspect");
childAspect.setTitle("Child Aspect");
childAspect.setDescription("Child Aspect Description");
childAspect.setParentId("smf:smartFolder");
childAspect.setModel(myCompanyModel);
childAspect.setIsContainer(false);
childAspect.setIncludedInSupertypeQuery(true);
rescanAspect = new org.alfresco.rest.api.tests.client.data.Aspect();
rescanAspect.setId("test:rescan");
rescanAspect.setTitle("rescan");
rescanAspect.setDescription("Doc that required to scan ");
rescanAspect.setModel(scanModel);
rescanAspect.setIsContainer(false);
rescanAspect.setIncludedInSupertypeQuery(true);
smartFilterAspect = new org.alfresco.rest.api.tests.client.data.Aspect();
smartFilterAspect.setId("test:smartFilter");
smartFilterAspect.setTitle("Smart filter");
smartFilterAspect.setDescription("Smart Filter");
smartFilterAspect.setParentId("mycompany:testAspect");
smartFilterAspect.setModel(scanModel);
smartFilterAspect.setIsContainer(false);
smartFilterAspect.setIsArchive(true);
smartFilterAspect.setIncludedInSupertypeQuery(true);
whitePaperType = new org.alfresco.rest.api.tests.client.data.Type();
whitePaperType.setId("mycompany:whitepaper");
whitePaperType.setTitle("whitepaper");
whitePaperType.setDescription("Whitepaper");
whitePaperType.setParentId("mycompany:doc");
whitePaperType.setModel(myCompanyModel);
whitePaperType.setIsContainer(false);
whitePaperType.setIsArchive(true);
whitePaperType.setIncludedInSupertypeQuery(true);
docType = new org.alfresco.rest.api.tests.client.data.Type();
docType.setId("mycompany:doc");
docType.setTitle("doc");
docType.setDescription("Doc");
docType.setParentId("cm:content");
docType.setModel(myCompanyModel);
docType.setIsContainer(false);
docType.setIsArchive(true);
docType.setIncludedInSupertypeQuery(true);
publishableType = new org.alfresco.rest.api.tests.client.data.Type();
publishableType.setId("test:publishable");
publishableType.setParentId("mycompany:doc");
publishableType.setIsContainer(false);
publishableType.setIsArchive(true);
publishableType.setIncludedInSupertypeQuery(true);
Model testModel = new Model();
testModel.setAuthor("Administrator");
testModel.setId("api:apiModel");
testModel.setNamespaceUri("http://www.api.t2/model/1.0");
testModel.setNamespacePrefix("test2");
Model apiModel = new Model();
apiModel.setAuthor("Administrator");
apiModel.setId("api:apiModel");
apiModel.setNamespaceUri("http://www.api.t1/model/1.0");
apiModel.setNamespacePrefix("api");
AssociationSource testAllAspectSource = new AssociationSource(null, "test2:aspect-all", true, true, null);
AssociationSource testAllAspectTarget = new AssociationSource(null, "api:referenceable", false, false, false);
Association testAllAspectAssociation = new Association("api:assoc-all", null, null, null, false, testAllAspectSource, testAllAspectTarget);
testAllAspect = new org.alfresco.rest.api.tests.client.data.Aspect();
testAllAspect.setId("test2:aspect-all");
testAllAspect.setTitle("Aspect derived from other namespace");
testAllAspect.setIsArchive(false);
testAllAspect.setIncludedInSupertypeQuery(false);
testAllAspect.setIsContainer(false);
testAllAspect.setModel(testModel);
testAllAspect.setAssociations(Collections.singletonList(testAllAspectAssociation));
testAllAspect.setMandatoryAspects(Arrays.asList("test2:aspect-three", "api:aspect-one", "api:aspect-two"));
AssociationSource apiBaseSource = new AssociationSource(null, "api:base", false, true, null);
AssociationSource apiBaseTarget = new AssociationSource(null, "api:base", true, false, false);
Association apiBaseAssociation = new Association("api:assoc1", null, null, false, false, apiBaseSource, apiBaseTarget);
AssociationSource apiChildSource = new AssociationSource(null, "api:base", true, true, null);
AssociationSource apiChildTarget = new AssociationSource(null, "api:referenceable", false, false, false);
Association apiChildAssociation = new Association("api:childassoc1", null, null, true, false, apiChildSource, apiChildTarget);
AssociationSource apiBaseSource2 = new AssociationSource(null, "api:base", true, true, null);
AssociationSource apiBaseTarget2 = new AssociationSource(null, "api:referenceable", false, false, false);
Association apiBaseAssociation2 = new Association("api:assoc2", null, null, false, false, apiBaseSource2, apiBaseTarget2);
AssociationSource apiChildPropagateSource = new AssociationSource(null, "api:base", true, true, null);
AssociationSource apiChildPropagateTarget = new AssociationSource(null, "api:referenceable", false, false, false);
Association apiChildPropagateAssociation = new Association("api:childassocPropagate", null, null, true, false, apiChildPropagateSource, apiChildPropagateTarget);
apiBaseType = new org.alfresco.rest.api.tests.client.data.Type();
apiBaseType.setId("api:base");
apiBaseType.setTitle("Base");
apiBaseType.setDescription("The Base Type");
apiBaseType.setIncludedInSupertypeQuery(true);
apiBaseType.setIsContainer(true);
apiBaseType.setModel(apiModel);
apiBaseType.setAssociations(Arrays.asList(apiBaseAssociation, apiChildAssociation, apiBaseAssociation2, apiChildPropagateAssociation));
apiBaseType.setMandatoryAspects(Collections.singletonList("api:referenceable"));
apiForcedType = new org.alfresco.rest.api.tests.client.data.Type();
apiForcedType.setId("api:enforced");
apiForcedType.setParentId("api:base");
apiForcedType.setIncludedInSupertypeQuery(true);
apiForcedType.setIsContainer(true);
apiForcedType.setModel(apiModel);
apiForcedType.setAssociations(Arrays.asList(apiBaseAssociation2, apiChildPropagateAssociation, apiBaseAssociation, apiChildAssociation));
apiForcedType.setMandatoryAspects(Collections.singletonList("api:referenceable"));
AssociationSource apiChildSource2 = new AssociationSource(null, "api:file", false, true, null);
AssociationSource apiChildTarget2 = new AssociationSource(null, "api:referenceable", true, false, false);
Association apiChildAssociation2 = new Association("api:childassoc2", null, null, true, false, apiChildSource2, apiChildTarget2);
apiFileType = new org.alfresco.rest.api.tests.client.data.Type();
apiFileType.setId("api:file");
apiFileType.setParentId("api:base");
apiFileType.setIsArchive(true);
apiFileType.setIncludedInSupertypeQuery(true);
apiFileType.setIsContainer(true);
apiFileType.setModel(apiModel);
apiFileType.setAssociations(Arrays.asList(apiBaseAssociation2, apiChildAssociation2, apiChildPropagateAssociation, apiBaseAssociation, apiChildAssociation));
apiFileType.setMandatoryAspects(Collections.singletonList("api:referenceable"));
apiFileDerivedType = new org.alfresco.rest.api.tests.client.data.Type();
apiFileDerivedType.setId("api:file-derived");
apiFileDerivedType.setParentId("api:file");
apiFileDerivedType.setIsArchive(true);
apiFileDerivedType.setIncludedInSupertypeQuery(true);
apiFileDerivedType.setIsContainer(true);
apiFileDerivedType.setModel(apiModel);
apiFileDerivedType.setAssociations(Arrays.asList(apiBaseAssociation2, apiChildAssociation2, apiChildPropagateAssociation, apiBaseAssociation, apiChildAssociation));
apiFileDerivedType.setMandatoryAspects(Collections.singletonList("api:referenceable"));
apiFileDerivedNoArchiveType = new org.alfresco.rest.api.tests.client.data.Type();
apiFileDerivedNoArchiveType.setId("api:file-derived-no-archive");
apiFileDerivedNoArchiveType.setParentId("api:file");
apiFileDerivedNoArchiveType.setIsArchive(false);
apiFileDerivedNoArchiveType.setIncludedInSupertypeQuery(true);
apiFileDerivedNoArchiveType.setIsContainer(true);
apiFileDerivedNoArchiveType.setModel(apiModel);
apiFileDerivedNoArchiveType.setAssociations(Arrays.asList(apiBaseAssociation2, apiChildAssociation2, apiChildPropagateAssociation, apiBaseAssociation, apiChildAssociation));
apiFileDerivedNoArchiveType.setMandatoryAspects(Collections.singletonList("api:referenceable"));
apiFolderType = new org.alfresco.rest.api.tests.client.data.Type();
apiFolderType.setId("api:folder");
apiFolderType.setParentId("api:base");
apiFolderType.setIncludedInSupertypeQuery(true);
apiFolderType.setIsContainer(true);
apiFolderType.setModel(apiModel);
apiFolderType.setAssociations(Arrays.asList(apiBaseAssociation2, apiChildPropagateAssociation, apiBaseAssociation, apiChildAssociation));
apiFolderType.setMandatoryAspects(Collections.singletonList("api:referenceable"));
apiOverrideType = new org.alfresco.rest.api.tests.client.data.Type();
apiOverrideType.setId("api:overridetype1");
apiOverrideType.setParentId("api:base");
apiOverrideType.setIncludedInSupertypeQuery(true);
apiOverrideType.setIsContainer(false);
apiOverrideType.setModel(apiModel);
apiOverrideType.setAssociations(Collections.emptyList());
apiOverrideType.setMandatoryAspects(Collections.emptyList());
apiOverride2Type = new org.alfresco.rest.api.tests.client.data.Type();
apiOverride2Type.setId("api:overridetype2");
apiOverride2Type.setParentId("api:overridetype1");
apiOverride2Type.setIncludedInSupertypeQuery(true);
apiOverride2Type.setIsContainer(false);
apiOverride2Type.setModel(apiModel);
apiOverride2Type.setAssociations(Collections.emptyList());
apiOverride2Type.setMandatoryAspects(Collections.emptyList());
apiOverride3Type = new org.alfresco.rest.api.tests.client.data.Type();
apiOverride3Type.setId("api:overridetype3");
apiOverride3Type.setParentId("api:overridetype2");
apiOverride3Type.setIncludedInSupertypeQuery(true);
apiOverride3Type.setIsContainer(false);
apiOverride3Type.setModel(apiModel);
apiOverride3Type.setAssociations(Collections.emptyList());
apiOverride3Type.setMandatoryAspects(Collections.emptyList());
apiNamedPropConstraintType = new org.alfresco.rest.api.tests.client.data.Type();
apiNamedPropConstraintType.setId("api:typeWithNamedPropConstraint");
apiNamedPropConstraintType.setTitle("Type with named property-defined constraint.");
apiNamedPropConstraintType.setDescription("A type with a named constraint defined within one of its properties.");
apiNamedPropConstraintType.setParentId("api:overridetype2");
apiNamedPropConstraintType.setIncludedInSupertypeQuery(true);
apiNamedPropConstraintType.setIsContainer(false);
apiNamedPropConstraintType.setModel(apiModel);
apiNamedPropConstraintType.setAssociations(Collections.emptyList());
apiNamedPropConstraintType.setMandatoryAspects(Collections.emptyList());
allTypes = ImmutableList.of(apiBaseType, apiForcedType, apiFileType, apiFileDerivedType,
apiFileDerivedNoArchiveType, apiFolderType, apiOverrideType, apiOverride2Type,
apiOverride3Type, apiNamedPropConstraintType);
}
@Override
public String getScope()
{
return "public";
}
}


@@ -27,53 +27,22 @@
package org.alfresco.rest.api.tests;
import org.alfresco.repo.security.authentication.AuthenticationUtil;
import org.alfresco.rest.api.tests.client.PublicApiClient;
import org.alfresco.rest.api.tests.client.PublicApiException;
import org.alfresco.rest.api.tests.client.RequestContext;
import org.alfresco.rest.api.tests.client.data.Aspect;
import org.apache.commons.httpclient.HttpStatus;
import org.junit.Before;
import org.junit.Test;
import java.util.HashMap;
import java.util.Map;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertNull;
import static org.junit.Assert.fail;
public class TestAspects extends AbstractBaseApiTest
public class TestAspects extends BaseModelApiTest
{
private PublicApiClient.Paging paging = getPaging(0, 10);
PublicApiClient.ListResponse<org.alfresco.rest.api.tests.client.data.Aspect> aspects = null;
org.alfresco.rest.api.tests.client.data.Aspect aspect, childAspect = null, smartFilter = null, rescanAspect = null;
Map<String, String> otherParams = new HashMap<>();
@Before
public void setup() throws Exception
{
super.setup();
childAspect = new org.alfresco.rest.api.tests.client.data.Aspect();
childAspect.setId("mycompany:childAspect");
childAspect.setTitle("Child Aspect");
childAspect.setDescription("Child Aspect Description");
childAspect.setParentId("smf:smartFolder");
rescanAspect = new org.alfresco.rest.api.tests.client.data.Aspect();
rescanAspect.setId("test:rescan");
rescanAspect.setTitle("rescan");
rescanAspect.setDescription("Doc that required to scan ");
smartFilter = new org.alfresco.rest.api.tests.client.data.Aspect();
smartFilter.setId("test:smartFilter");
smartFilter.setTitle("Smart filter");
smartFilter.setDescription("Smart Filter");
smartFilter.setParentId("cm:auditable");
}
@Test
public void testAllAspects() throws PublicApiException
{
@@ -113,29 +82,32 @@ public class TestAspects extends AbstractBaseApiTest
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(parentIds in ('smf:smartFolder','cm:auditable'))");
otherParams.put("where", "(parentId in ('smf:smartFolder','mycompany:testAspect'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
aspects.getList().get(1).expected(childAspect);
aspects.getList().get(0).expected(childAspect);
aspects.getList().get(1).expected(testAspect);
aspects.getList().get(3).expected(smartFilterAspect);
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(4));
assertFalse(aspects.getPaging().getHasMoreItems());
otherParams.put("where", "(parentIds in ('smf:smartFolder','cm:auditable') AND namespaceUri matches('http://www.test.*'))");
otherParams.put("where", "(parentId in ('smf:smartFolder','mycompany:testAspect') AND namespaceUri matches('http://www.test.*'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
aspects.getList().get(0).expected(smartFilter);
aspects.getList().get(0).expected(smartFilterAspect);
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(1));
otherParams.put("where", "(parentIds in ('smf:smartFolder','cm:auditable') AND not namespaceUri matches('http://www.test.*'))");
otherParams.put("where", "(parentId in ('smf:smartFolder', 'mycompany:testAspect') AND not namespaceUri matches('http://www.test.*'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
aspects.getList().get(1).expected(childAspect);
aspects.getList().get(0).expected(childAspect);
aspects.getList().get(1).expected(testAspect);
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(3));
// match everything
otherParams.put("where", "(parentIds in ('smf:smartFolder','cm:auditable') AND namespaceUri matches('.*'))");
otherParams.put("where", "(parentId in ('smf:smartFolder','mycompany:testAspect') AND namespaceUri matches('.*'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(4));
// match nothing
otherParams.put("where", "(parentIds in ('smf:smartFolder,cm:auditable') AND not namespaceUri matches('.*'))");
otherParams.put("where", "(parentId in ('smf:smartFolder', 'mycompany:testAspect') AND not namespaceUri matches('.*'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(0));
}
@@ -146,31 +118,148 @@ public class TestAspects extends AbstractBaseApiTest
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(modelIds in ('mycompany:model','test:scan'))");
otherParams.put("where", "(modelId in ('mycompany:model','test:scan'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(6));
assertFalse(aspects.getPaging().getHasMoreItems());
otherParams.put("where", "(modelIds in ('mycompany:model','test:scan') AND namespaceUri matches('http://www.test.*'))");
otherParams.put("where", "(modelId in ('mycompany:model','test:scan') AND namespaceUri matches('http://www.test.*'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
aspects.getList().get(0).expected(rescanAspect);
aspects.getList().get(1).expected(smartFilter);
aspects.getList().get(1).expected(smartFilterAspect);
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(2));
otherParams.put("where", "(modelIds in ('mycompany:model','test:scan') AND not namespaceUri matches('http://www.test.*'))");
otherParams.put("where", "(modelId in ('mycompany:model','test:scan') AND not namespaceUri matches('http://www.test.*'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(4));
otherParams.put("where", "(modelIds in ('mycompany:model','test:scan') AND namespaceUri matches('.*'))");
otherParams.put("where", "(modelId in ('mycompany:model','test:scan') AND namespaceUri matches('.*'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(6));
otherParams.put("where", "(modelIds in ('mycompany:model','test:scan') AND not namespaceUri matches('.*'))");
otherParams.put("where", "(modelId in ('mycompany:model','test:scan') AND not namespaceUri matches('.*'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(0));
}
@Test
public void testIncludeProperty() throws PublicApiException
{
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(modelId in ('mycompany:model','test:scan') AND namespaceUri matches('http://www.test.*'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
aspects.getList().get(0).expected(rescanAspect);
assertNull(aspects.getList().get(0).getProperties());
aspects.getList().get(1).expected(smartFilterAspect);
assertNull(aspects.getList().get(1).getProperties());
otherParams.put("where", "(modelId in ('mycompany:model','test:scan') AND namespaceUri matches('http://www.test.*'))");
otherParams.put("include", "properties");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
aspects.getList().get(0).expected(rescanAspect);
assertNotNull(aspects.getList().get(0).getProperties());
aspects.getList().get(1).expected(smartFilterAspect);
assertNotNull(aspects.getList().get(1).getProperties());
}
@Test
public void testIncludeAssociation() throws PublicApiException
{
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(modelId in ('api:apiModel'))");
otherParams.put("include", "associations");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(6));
for (Aspect aspect : aspects.getList())
{
assertNotNull(aspect.getAssociations());
assertNull(aspect.getProperties());
assertNull(aspect.getMandatoryAspects());
}
assertTrue(aspects.getList().get(0).getAssociations().isEmpty());
assertTrue(aspects.getList().get(1).getAssociations().isEmpty());
assertTrue(aspects.getList().get(2).getAssociations().isEmpty());
assertTrue(aspects.getList().get(3).getAssociations().isEmpty());
assertEquals(aspects.getList().get(4).getAssociations(), testAllAspect.getAssociations());
assertTrue(aspects.getList().get(5).getAssociations().isEmpty());
}
@Test
public void testIncludeMandatoryAspect() throws PublicApiException
{
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(modelId in ('api:apiModel'))");
otherParams.put("include", "mandatoryAspects");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(6));
for (Aspect aspect : aspects.getList())
{
assertNotNull(aspect.getMandatoryAspects());
assertNull(aspect.getProperties());
assertNull(aspect.getAssociations());
}
assertTrue(aspects.getList().get(0).getMandatoryAspects().isEmpty());
assertTrue(aspects.getList().get(1).getMandatoryAspects().isEmpty());
assertTrue(aspects.getList().get(2).getMandatoryAspects().isEmpty());
assertTrue(aspects.getList().get(3).getMandatoryAspects().isEmpty());
assertEquals(aspects.getList().get(4).getMandatoryAspects(), testAllAspect.getMandatoryAspects());
assertTrue(aspects.getList().get(5).getMandatoryAspects().isEmpty());
}
@Test
public void testIncludes() throws PublicApiException
{
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(modelId in ('api:apiModel'))");
otherParams.put("include", "associations,mandatoryAspects");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(6));
for (Aspect aspect : aspects.getList())
{
assertNotNull(aspect.getAssociations());
assertNotNull(aspect.getMandatoryAspects());
assertNull(aspect.getProperties());
}
assertTrue(aspects.getList().get(0).getAssociations().isEmpty());
assertTrue(aspects.getList().get(1).getAssociations().isEmpty());
assertTrue(aspects.getList().get(2).getAssociations().isEmpty());
assertTrue(aspects.getList().get(3).getAssociations().isEmpty());
assertEquals(aspects.getList().get(4).getAssociations(), testAllAspect.getAssociations());
assertEquals(aspects.getList().get(4).getMandatoryAspects(), testAllAspect.getMandatoryAspects());
assertTrue(aspects.getList().get(5).getAssociations().isEmpty());
}
@Test
public void testSubAspects() throws PublicApiException
{
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(modelId in ('mycompany:model'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(4));
otherParams.put("where", "(modelId in ('mycompany:model INCLUDESUBASPECTS'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(5));
otherParams.put("where", "(modelId in ('mycompany:model INCLUDESUBASPECTS') AND namespaceUri matches('http://www.test.*'))");
aspects = publicApiClient.aspects().getAspects(createParams(paging, otherParams));
aspects.getList().get(0).expected(smartFilterAspect);
assertEquals(aspects.getPaging().getTotalItems(), Integer.valueOf(1));
}
@Test
public void testAspectsById() throws PublicApiException
{
@@ -179,6 +268,11 @@ public class TestAspects extends AbstractBaseApiTest
aspect = publicApiClient.aspects().getAspect("mycompany:childAspect");
aspect.expected(childAspect);
aspect = publicApiClient.aspects().getAspect("test2:aspect-all");
assertEquals("mandatoryAspects not matched", aspect.getMandatoryAspects(), testAllAspect.getMandatoryAspects());
assertEquals("association not matched", aspect.getAssociations(), testAllAspect.getAssociations());
aspect.expected(testAllAspect);
}
@Test
@@ -187,12 +281,12 @@ public class TestAspects extends AbstractBaseApiTest
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
testListAspectException("(modelIds in ('mycompany:model','unknown:model','known:model'))");
testListAspectException("(modelIds in ('unknown:model','mycompany:model'))");
testListAspectException("(modelIds in (' ',' ',' ')");
testListAspectException("(parentIds in ('smf:smartFolder','unknown:aspect'))");
testListAspectException("(parentIds in ('unknown:aspect','smf:smartFolder'))");
testListAspectException("(parentIds in (' ',' ',' ')");
testListAspectException("(modelId in ('mycompany:model','unknown:model','known:model'))");
testListAspectException("(modelId in ('unknown:model','mycompany:model'))");
testListAspectException("(modelId in (' ',' ',' ')");
testListAspectException("(parentId in ('smf:smartFolder','unknown:aspect'))");
testListAspectException("(parentId in ('unknown:aspect','smf:smartFolder'))");
testListAspectException("(parentId in (' ',' ',' ')");
testListAspectException("(namespaceUri matches('*'))"); // wrong pattern
}
@@ -234,11 +328,4 @@ public class TestAspects extends AbstractBaseApiTest
assertEquals(HttpStatus.SC_BAD_REQUEST, e.getHttpResponse().getStatusCode());
}
}
@Override
public String getScope()
{
return "public";
}
}


@@ -27,47 +27,21 @@
package org.alfresco.rest.api.tests;
import org.alfresco.repo.security.authentication.AuthenticationUtil;
import org.alfresco.rest.api.tests.client.PublicApiClient;
import org.alfresco.rest.api.tests.client.PublicApiException;
import org.alfresco.rest.api.tests.client.RequestContext;
import org.alfresco.rest.api.tests.client.data.Type;
import org.apache.commons.httpclient.HttpStatus;
import org.junit.Before;
import org.junit.Test;
import java.util.HashMap;
import java.util.Map;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertNull;
import static org.junit.Assert.fail;
public class TestTypes extends AbstractBaseApiTest
public class TestTypes extends BaseModelApiTest
{
private PublicApiClient.Paging paging = getPaging(0, 10);
PublicApiClient.ListResponse<org.alfresco.rest.api.tests.client.data.Type> types = null;
org.alfresco.rest.api.tests.client.data.Type type = null, whitePaperType = null, docType = null;
Map<String, String> otherParams = new HashMap<>();
@Before
public void setup() throws Exception
{
super.setup();
whitePaperType = new org.alfresco.rest.api.tests.client.data.Type();
whitePaperType.setId("mycompany:whitepaper");
whitePaperType.setTitle("whitepaper");
whitePaperType.setDescription("Whitepaper");
whitePaperType.setParentId("mycompany:doc");
docType = new org.alfresco.rest.api.tests.client.data.Type();
docType.setId("mycompany:doc");
docType.setTitle("doc");
docType.setDescription("Doc");
docType.setParentId("cm:content");
}
@Test
public void testAllTypes() throws PublicApiException
{
@@ -107,27 +81,27 @@ public class TestTypes extends AbstractBaseApiTest
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(parentIds in ('cm:content'))");
otherParams.put("where", "(parentId in ('cm:content'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
int total = types.getPaging().getTotalItems();
otherParams.put("where", "(parentIds in ('cm:content') AND namespaceUri matches('http://www.mycompany.com/model.*'))");
otherParams.put("where", "(parentId in ('cm:content') AND namespaceUri matches('http://www.mycompany.com/model.*'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
types.getList().get(0).expected(docType);
types.getList().get(1).expected(whitePaperType);
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(2));
otherParams.put("where", "(parentIds in ('cm:content') AND not namespaceUri matches('http://www.mycompany.com/model.*'))");
otherParams.put("where", "(parentId in ('cm:content') AND not namespaceUri matches('http://www.mycompany.com/model.*'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(total - 2));
// match everything
otherParams.put("where", "(parentIds in ('cm:content') AND namespaceUri matches('.*'))");
otherParams.put("where", "(parentId in ('cm:content') AND namespaceUri matches('.*'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(total));
// match nothing
otherParams.put("where", "(parentIds in ('cm:content') AND not namespaceUri matches('.*'))");
otherParams.put("where", "(parentId in ('cm:content') AND not namespaceUri matches('.*'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(0));
}
@@ -138,31 +112,156 @@ public class TestTypes extends AbstractBaseApiTest
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(modelIds in ('mycompany:model','test:scan'))");
otherParams.put("where", "(modelId in ('mycompany:model','test:scan'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(3));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(4));
otherParams.put("where", "(modelIds in ('mycompany:model','test:scan') AND namespaceUri matches('http://www.mycompany.com/model.*'))");
otherParams.put("where", "(modelId in ('mycompany:model','test:scan') AND namespaceUri matches('http://www.mycompany.com/model.*'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
types.getList().get(0).expected(docType);
types.getList().get(1).expected(whitePaperType);
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(2));
otherParams.put("where", "(modelIds in ('mycompany:model','test:scan') AND not namespaceUri matches('http://www.mycompany.com/model.*'))");
otherParams.put("where", "(modelId in ('mycompany:model','test:scan') AND not namespaceUri matches('http://www.mycompany.com/model.*'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(1));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(2));
// match everything
otherParams.put("where", "(modelIds in ('mycompany:model','test:scan') AND namespaceUri matches('.*'))");
otherParams.put("where", "(modelId in ('mycompany:model','test:scan') AND namespaceUri matches('.*'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(3));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(4));
// match nothing
otherParams.put("where", "(modelIds in ('mycompany:model','test:scan') AND not namespaceUri matches('.*'))");
otherParams.put("where", "(modelId in ('mycompany:model','test:scan') AND not namespaceUri matches('.*'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(0));
}
@Test
public void testIncludeProperty() throws PublicApiException
{
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(modelId in ('mycompany:model','test:scan'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(4));
assertNull(types.getList().get(0).getProperties());
assertNull(types.getList().get(1).getProperties());
assertNull(types.getList().get(2).getProperties());
otherParams.put("where", "(modelId in ('mycompany:model','test:scan') AND namespaceUri matches('http://www.mycompany.com/model.*'))");
otherParams.put("include", "properties");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
types.getList().get(0).expected(docType);
types.getList().get(1).expected(whitePaperType);
assertNotNull(types.getList().get(0).getProperties());
assertNotNull(types.getList().get(1).getProperties());
}
@Test
public void testIncludeAssociation() throws PublicApiException
{
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(modelId in ('api:apiModel'))");
otherParams.put("include", "associations");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(10));
for (int i = 0; i < types.getList().size(); i++)
{
Type type = types.getList().get(i);
assertNotNull(type.getAssociations());
assertNull(type.getProperties());
assertNull(type.getMandatoryAspects());
type.expected(allTypes.get(i));
assertEquals(type.getAssociations(), allTypes.get(i).getAssociations());
}
}
@Test
public void testIncludeMandatoryAspect() throws PublicApiException
{
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(modelId in ('api:apiModel'))");
otherParams.put("include", "mandatoryAspects");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
for (int i = 0; i < types.getList().size(); i++)
{
Type type = types.getList().get(i);
assertNotNull(type.getMandatoryAspects());
assertNull(type.getProperties());
assertNull(type.getAssociations());
type.expected(allTypes.get(i));
assertEquals(type.getMandatoryAspects(), allTypes.get(i).getMandatoryAspects());
}
}
@Test
public void testIncludes() throws PublicApiException
{
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(modelId in ('api:apiModel'))");
otherParams.put("include", "associations,mandatoryAspects");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(10));
for (int i = 0; i < types.getList().size(); i++)
{
Type type = types.getList().get(i);
assertNotNull(type.getAssociations());
assertNull(type.getProperties());
assertNotNull(type.getMandatoryAspects());
type.expected(allTypes.get(i));
assertEquals(type.getMandatoryAspects(), allTypes.get(i).getMandatoryAspects());
assertEquals(type.getAssociations(), allTypes.get(i).getAssociations());
}
}
@Test
public void testSubTypes() throws PublicApiException
{
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
otherParams.put("where", "(modelId in ('mycompany:model'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(2));
types.getList().get(0).expected(docType);
types.getList().get(1).expected(whitePaperType);
otherParams.put("where", "(modelId in ('mycompany:model INCLUDESUBTYPES'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(3));
types.getList().get(0).expected(docType);
types.getList().get(1).expected(whitePaperType);
types.getList().get(2).expected(publishableType);
otherParams.put("where", "(modelId in ('mycompany:model INCLUDESUBTYPES') AND namespaceUri matches('http://www.test.*'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
assertEquals(types.getPaging().getTotalItems(), Integer.valueOf(1));
types.getList().get(0).expected(publishableType);
otherParams.put("where", "(modelId in ('mycompany:model INCLUDESUBTYPES') AND not namespaceUri matches('http://www.test.*'))");
types = publicApiClient.types().getTypes(createParams(paging, otherParams));
types.getList().get(0).expected(docType);
types.getList().get(1).expected(whitePaperType);
}
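The where clauses exercised above follow a small grammar: a `modelId in (…)` filter, an optional `INCLUDESUBTYPES` suffix on the model id, and an optional `namespaceUri matches(…)` predicate combined with `AND` / `AND not`. A minimal sketch of a builder for these strings (the helper names are hypothetical, not part of the API under test):

```java
// Hypothetical helpers illustrating the where-clause grammar exercised by the
// tests above (these builders are NOT part of the public API under test):
// a modelId filter, an optional INCLUDESUBTYPES suffix, and an optional
// namespaceUri matches() predicate combined with AND / AND not.
public class WhereClauseSketch
{
    static String modelFilter(String modelId, boolean includeSubTypes)
    {
        String id = includeSubTypes ? modelId + " INCLUDESUBTYPES" : modelId;
        return "(modelId in ('" + id + "'))";
    }

    static String withNamespaceMatch(String modelClause, String pattern, boolean negated)
    {
        // strip the outer parentheses, then AND the namespace predicate inside them
        String body = modelClause.substring(1, modelClause.length() - 1);
        String predicate = (negated ? "not " : "") + "namespaceUri matches('" + pattern + "')";
        return "(" + body + " AND " + predicate + ")";
    }

    public static void main(String[] args)
    {
        String base = modelFilter("mycompany:model", true);
        System.out.println(base);
        System.out.println(withNamespaceMatch(base, "http://www.test.*", true));
    }
}
```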
@Test
public void testTypesById() throws PublicApiException
{
@@ -171,6 +270,12 @@ public class TestTypes extends AbstractBaseApiTest
type = publicApiClient.types().getType("mycompany:whitepaper");
type.expected(whitePaperType);
type = publicApiClient.types().getType(apiBaseType.getId());
type.expected(apiBaseType);
assertNotNull(type.getProperties());
assertEquals(type.getMandatoryAspects(), apiBaseType.getMandatoryAspects());
assertEquals(type.getAssociations(), apiBaseType.getAssociations());
}
@Test
@@ -179,13 +284,13 @@ public class TestTypes extends AbstractBaseApiTest
AuthenticationUtil.setRunAsUser(user1);
publicApiClient.setRequestContext(new RequestContext(networkOne.getId(), user1));
testListTypeException("(modelId in ('mycompany:model','unknown:model'))");
testListTypeException("(modelId in ('unknown:model','unknown1:another'))");
testListTypeException("(modelId in (' ', '')");
testListTypeException("(parentId in ('cm:content','unknown:type')");
testListTypeException("(parentId in ('unknown:type','cm:content'))");
testListTypeException("(parentId in ('unknown:type','unknown:types'))");
testListTypeException("(parentId in (' ',' ',' '))");
testListTypeException("");
testListTypeException("(namespaceUri matches('*'))"); // wrong pattern
}
@@ -228,11 +333,4 @@ public class TestTypes extends AbstractBaseApiTest
assertEquals(HttpStatus.SC_BAD_REQUEST, e.getHttpResponse().getStatusCode());
}
}
@Override
public String getScope()
{
return "public";
}
}

View File

@@ -25,6 +25,9 @@
*/
package org.alfresco.rest.api.tests.client.data;
import org.alfresco.rest.api.model.Association;
import org.alfresco.rest.api.model.AssociationSource;
import org.alfresco.rest.api.model.Model;
import org.alfresco.rest.api.model.PropertyDefinition;
import org.alfresco.rest.api.tests.client.PublicApiClient;
import org.json.simple.JSONArray;
@@ -51,6 +54,17 @@ public class Aspect extends org.alfresco.rest.api.model.Aspect implements Serial
AssertUtil.assertEquals("title", getTitle(), other.getTitle());
AssertUtil.assertEquals("description", getDescription(), other.getDescription());
AssertUtil.assertEquals("parentId", getParentId(), other.getParentId());
AssertUtil.assertEquals("isArchive", getIsArchive(), other.getIsArchive());
AssertUtil.assertEquals("isContainer", getIsContainer(), other.getIsContainer());
AssertUtil.assertEquals("includedInSupertypeQuery", getIncludedInSupertypeQuery(), other.getIncludedInSupertypeQuery());
if (getModel() != null && other.getModel() != null)
{
AssertUtil.assertEquals("modelId", getModel().getId(), other.getModel().getId());
AssertUtil.assertEquals("author", getModel().getAuthor(), other.getModel().getAuthor());
AssertUtil.assertEquals("namespaceUri", getModel().getNamespaceUri(), other.getModel().getNamespaceUri());
AssertUtil.assertEquals("namespacePrefix", getModel().getNamespacePrefix(), other.getModel().getNamespacePrefix());
}
}
@SuppressWarnings("unchecked")
@@ -79,6 +93,36 @@ public class Aspect extends org.alfresco.rest.api.model.Aspect implements Serial
jsonObject.put("properties", getProperties());
}
if (getModel() != null)
{
jsonObject.put("model", getModel());
}
if (getMandatoryAspects() != null)
{
jsonObject.put("mandatoryAspects", getMandatoryAspects());
}
if (getIsContainer() != null)
{
jsonObject.put("isContainer", getIsContainer());
}
if (getIsArchive() != null)
{
jsonObject.put("isArchive", getIsArchive());
}
if (getIncludedInSupertypeQuery() != null)
{
jsonObject.put("includedInSupertypeQuery", getIncludedInSupertypeQuery());
}
if (getAssociations() != null)
{
jsonObject.put("associations", getAssociations());
}
return jsonObject;
}
@@ -90,15 +134,75 @@ public class Aspect extends org.alfresco.rest.api.model.Aspect implements Serial
String description = (String) jsonObject.get("description");
String parentId = (String) jsonObject.get("parentId");
List<PropertyDefinition> properties = (List<PropertyDefinition>) jsonObject.get("properties");
List<String> mandatoryAspects = jsonObject.get("mandatoryAspects") != null ? new ArrayList<>((List<String>) jsonObject.get("mandatoryAspects")) : null;
Boolean isContainer = (Boolean) jsonObject.get("isContainer");
Boolean isArchive = (Boolean) jsonObject.get("isArchive");
Boolean includedInSupertypeQuery = (Boolean) jsonObject.get("includedInSupertypeQuery");
List<Association> associations = null;
if (jsonObject.get("associations") != null)
{
associations = new ArrayList<>();
JSONArray jsonArray = (JSONArray) jsonObject.get("associations");
for(int i = 0; i < jsonArray.size(); i++)
{
Association association = new Association();
JSONObject object = (JSONObject) jsonArray.get(i);
association.setId((String) object.get("id"));
association.setTitle((String) object.get("title"));
association.setDescription((String) object.get("description"));
association.setIsChild((Boolean) object.get("isChild"));
association.setIsProtected((Boolean) object.get("isProtected"));
JSONObject sourceModel = (JSONObject) object.get("source");
if (sourceModel != null)
{
AssociationSource source = new AssociationSource();
source.setCls((String) sourceModel.get("cls"));
source.setRole((String) sourceModel.get("role"));
source.setIsMandatory((Boolean) sourceModel.get("isMandatory"));
source.setIsMany((Boolean) sourceModel.get("isMany"));
source.setIsMandatoryEnforced((Boolean) sourceModel.get("isMandatoryEnforced"));
association.setSource(source);
}
JSONObject targetModel = (JSONObject) object.get("target");
if (targetModel != null)
{
AssociationSource target = new AssociationSource();
target.setCls((String) targetModel.get("cls"));
target.setRole((String) targetModel.get("role"));
target.setIsMandatory((Boolean) targetModel.get("isMandatory"));
target.setIsMany((Boolean) targetModel.get("isMany"));
target.setIsMandatoryEnforced((Boolean) targetModel.get("isMandatoryEnforced"));
association.setTarget(target);
}
associations.add(association);
}
}
JSONObject jsonModel = (JSONObject) jsonObject.get("model");
Model model = new Model();
model.setId((String) jsonModel.get("id"));
model.setDescription((String) jsonModel.get("description"));
model.setNamespacePrefix((String) jsonModel.get("namespacePrefix"));
model.setNamespaceUri((String) jsonModel.get("namespaceUri"));
model.setAuthor((String) jsonModel.get("author"));
Aspect aspect = new Aspect();
aspect.setId(id);
aspect.setTitle(title);
aspect.setDescription(description);
aspect.setParentId(parentId);
aspect.setProperties(properties);
aspect.setMandatoryAspects(mandatoryAspects);
aspect.setIsContainer(isContainer);
aspect.setIsArchive(isArchive);
aspect.setIncludedInSupertypeQuery(includedInSupertypeQuery);
aspect.setAssociations(associations);
aspect.setModel(model);
return aspect;
}
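The parser above reads a series of optional keys, guarding each with a null check before dereferencing it. A stdlib-only sketch of the same defensive pattern, with plain Maps standing in for json-simple's `JSONObject` so the example stays self-contained (the method name is illustrative):

```java
import java.util.Map;

// A minimal sketch of the defensive parsing pattern used in the parse method
// above: every optional JSON key is null-checked before it is dereferenced.
// Plain Maps stand in for json-simple's JSONObject; modelId is illustrative.
public class ParseSketch
{
    static String modelId(Map<String, Object> json)
    {
        Object model = json.get("model");
        if (!(model instanceof Map))
        {
            return null; // "model" absent or malformed: degrade instead of throwing NPE
        }
        return (String) ((Map<?, ?>) model).get("id");
    }

    public static void main(String[] args)
    {
        System.out.println(modelId(Map.of("model", Map.of("id", "api:apiModel"))));
        System.out.println(modelId(Map.of()));
    }
}
```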
@SuppressWarnings("unchecked")

View File

@@ -25,6 +25,9 @@
*/
package org.alfresco.rest.api.tests.client.data;
import org.alfresco.rest.api.model.Association;
import org.alfresco.rest.api.model.AssociationSource;
import org.alfresco.rest.api.model.Model;
import org.alfresco.rest.api.model.PropertyDefinition;
import org.alfresco.rest.api.tests.client.PublicApiClient;
import org.json.simple.JSONArray;
@@ -51,6 +54,17 @@ public class Type extends org.alfresco.rest.api.model.Type implements Serializab
AssertUtil.assertEquals("title", getTitle(), other.getTitle());
AssertUtil.assertEquals("description", getDescription(), other.getDescription());
AssertUtil.assertEquals("parentId", getParentId(), other.getParentId());
AssertUtil.assertEquals("isArchive", getIsArchive(), other.getIsArchive());
AssertUtil.assertEquals("isContainer", getIsContainer(), other.getIsContainer());
AssertUtil.assertEquals("includedInSupertypeQuery", getIncludedInSupertypeQuery(), other.getIncludedInSupertypeQuery());
if (getModel() != null && other.getModel() != null)
{
AssertUtil.assertEquals("modelId", getModel().getId(), other.getModel().getId());
AssertUtil.assertEquals("author", getModel().getAuthor(), other.getModel().getAuthor());
AssertUtil.assertEquals("namespaceUri", getModel().getNamespaceUri(), other.getModel().getNamespaceUri());
AssertUtil.assertEquals("namespacePrefix", getModel().getNamespacePrefix(), other.getModel().getNamespacePrefix());
}
}
@SuppressWarnings("unchecked")
@@ -79,6 +93,36 @@ public class Type extends org.alfresco.rest.api.model.Type implements Serializab
jsonObject.put("properties", getProperties());
}
if (getModel() != null)
{
jsonObject.put("model", getModel());
}
if (getMandatoryAspects() != null)
{
jsonObject.put("mandatoryAspects", getMandatoryAspects());
}
if (getIsContainer() != null)
{
jsonObject.put("isContainer", getIsContainer());
}
if (getIsArchive() != null)
{
jsonObject.put("isArchive", getIsArchive());
}
if (getIncludedInSupertypeQuery() != null)
{
jsonObject.put("includedInSupertypeQuery", getIncludedInSupertypeQuery());
}
if (getAssociations() != null)
{
jsonObject.put("associations", getAssociations());
}
return jsonObject;
}
@@ -90,15 +134,75 @@ public class Type extends org.alfresco.rest.api.model.Type implements Serializab
String description = (String) jsonObject.get("description");
String parentId = (String) jsonObject.get("parentId");
List<PropertyDefinition> properties = (List<PropertyDefinition>) jsonObject.get("properties");
List<String> mandatoryAspects = jsonObject.get("mandatoryAspects") != null ? new ArrayList<>((List<String>) jsonObject.get("mandatoryAspects")) : null;
Boolean isContainer = (Boolean) jsonObject.get("isContainer");
Boolean isArchive = (Boolean) jsonObject.get("isArchive");
Boolean includedInSupertypeQuery = (Boolean) jsonObject.get("includedInSupertypeQuery");
List<org.alfresco.rest.api.model.Association> associations = null;
if (jsonObject.get("associations") != null)
{
associations = new ArrayList<>();
JSONArray jsonArray = (JSONArray) jsonObject.get("associations");
for(int i = 0; i < jsonArray.size(); i++)
{
org.alfresco.rest.api.model.Association association = new Association();
JSONObject object = (JSONObject) jsonArray.get(i);
association.setId((String) object.get("id"));
association.setTitle((String) object.get("title"));
association.setDescription((String) object.get("description"));
association.setIsChild((Boolean) object.get("isChild"));
association.setIsProtected((Boolean) object.get("isProtected"));
JSONObject sourceModel = (JSONObject) object.get("source");
if (sourceModel != null)
{
AssociationSource source = new AssociationSource();
source.setCls((String) sourceModel.get("cls"));
source.setRole((String) sourceModel.get("role"));
source.setIsMandatory((Boolean) sourceModel.get("isMandatory"));
source.setIsMany((Boolean) sourceModel.get("isMany"));
source.setIsMandatoryEnforced((Boolean) sourceModel.get("isMandatoryEnforced"));
association.setSource(source);
}
JSONObject targetModel = (JSONObject) object.get("target");
if (targetModel != null)
{
AssociationSource target = new AssociationSource();
target.setCls((String) targetModel.get("cls"));
target.setRole((String) targetModel.get("role"));
target.setIsMandatory((Boolean) targetModel.get("isMandatory"));
target.setIsMany((Boolean) targetModel.get("isMany"));
target.setIsMandatoryEnforced((Boolean) targetModel.get("isMandatoryEnforced"));
association.setTarget(target);
}
associations.add(association);
}
}
JSONObject jsonModel = (JSONObject) jsonObject.get("model");
Model model = new Model();
model.setId((String) jsonModel.get("id"));
model.setDescription((String) jsonModel.get("description"));
model.setNamespacePrefix((String) jsonModel.get("namespacePrefix"));
model.setNamespaceUri((String) jsonModel.get("namespaceUri"));
model.setAuthor((String) jsonModel.get("author"));
Type type = new Type();
type.setId(id);
type.setTitle(title);
type.setDescription(description);
type.setParentId(parentId);
type.setProperties(properties);
type.setMandatoryAspects(mandatoryAspects);
type.setIsContainer(isContainer);
type.setIsArchive(isArchive);
type.setIncludedInSupertypeQuery(includedInSupertypeQuery);
type.setAssociations(associations);
type.setModel(model);
return type;
}
@SuppressWarnings("unchecked")

View File

@@ -97,6 +97,8 @@
<mandatory-aspects/>
</aspect>
<aspect name="mycompany:testAspect">
<title>Test Aspect</title>
<archive>true</archive>
<properties>
<property name="mycompany:testProperty">
<title>Test Property</title>

View File

@@ -0,0 +1,358 @@
<model name="api:apiModel" xmlns="http://www.alfresco.org/model/dictionary/1.0">
<author>Administrator</author>
<imports>
<import uri="http://www.alfresco.org/model/dictionary/1.0" prefix="d"/>
</imports>
<namespaces>
<namespace uri="http://www.api.t1/model/1.0" prefix="api"/>
<namespace uri="http://www.api.t2/model/1.0" prefix="test2"/>
</namespaces>
<constraints>
<constraint name="api:regex1" type="REGEX">
<title>Regex1 title</title>
<description>Regex1 description</description>
<parameter name="expression"><value>[A-Z]*</value></parameter>
<parameter name="requiresMatch"><value>false</value></parameter>
</constraint>
<constraint name="api:regex2" type="REGEX">
<parameter name="expression"><value>[a-z]*</value></parameter>
<parameter name="requiresMatch"><value>false</value></parameter>
</constraint>
<constraint name="api:stringLength1" type="LENGTH">
<parameter name="minLength"><value>0</value></parameter>
<parameter name="maxLength"><value>256</value></parameter>
</constraint>
<constraint name="api:stringLength2" type="LENGTH">
<parameter name="minLength"><value>0</value></parameter>
<parameter name="maxLength"><value>128</value></parameter>
</constraint>
<constraint name="api:minMax1" type="MINMAX">
<parameter name="minValue"><value>0</value></parameter>
<parameter name="maxValue"><value>256</value></parameter>
</constraint>
<constraint name="api:list1" type="LIST">
<title>List1 title</title>
<description>List1 description</description>
<parameter name="allowedValues">
<list>
<value>ABC</value>
<value>DEF</value>
<value>VALUE WITH SPACES</value>
<value>VALUE WITH TRAILING SPACE </value>
</list>
</parameter>
<parameter name="caseSensitive"><value>true</value></parameter>
</constraint>
<constraint name="api:list2" type="LIST">
<parameter name="allowedValues">
<list>
<value>HIJ</value>
</list>
</parameter>
<parameter name="caseSensitive"><value>true</value></parameter>
</constraint>
<constraint name="test2:list3" type="LIST">
<parameter name="allowedValues">
<list>
<value>XYZ</value>
</list>
</parameter>
<parameter name="caseSensitive"><value>true</value></parameter>
</constraint>
</constraints>
<types>
<type name="api:base">
<title>Base</title>
<description>The Base Type</description>
<properties>
<property name="api:prop1">
<type>d:text</type>
<protected>true</protected>
<default/>
<constraints>
<constraint ref="api:regex1"/>
<constraint ref="api:stringLength1">
<title>Prop1 Strlen1 title</title>
<description>Prop1 Strlen1 description</description>
</constraint>
</constraints>
</property>
</properties>
<associations>
<association name="api:assoc1">
<source>
<mandatory>true</mandatory>
<many>false</many>
</source>
<target>
<class>api:base</class>
<mandatory>false</mandatory>
<many>true</many>
</target>
</association>
<association name="api:assoc2">
<source>
<mandatory>true</mandatory>
<many>true</many>
</source>
<target>
<class>api:referenceable</class>
<mandatory>false</mandatory>
<many>false</many>
</target>
</association>
<child-association name="api:childassoc1">
<source>
<mandatory>true</mandatory>
<many>true</many>
</source>
<target>
<class>api:referenceable</class>
<mandatory>false</mandatory>
<many>false</many>
</target>
<child-name>fred</child-name>
<duplicate>true</duplicate>
</child-association>
<child-association name="api:childassocPropagate">
<source>
<mandatory>true</mandatory>
<many>true</many>
</source>
<target>
<class>api:referenceable</class>
<mandatory>false</mandatory>
<many>false</many>
</target>
<child-name>fred</child-name>
<duplicate>true</duplicate>
<propagateTimestamps>true</propagateTimestamps>
</child-association>
</associations>
<mandatory-aspects>
<aspect>api:referenceable</aspect>
</mandatory-aspects>
</type>
<type name="api:file">
<parent>api:base</parent>
<archive>true</archive>
<properties>
<property name="api:fileprop">
<type>d:text</type>
<protected>true</protected>
<default></default>
</property>
</properties>
<associations>
<child-association name="api:childassoc2">
<target>
<class>api:referenceable</class>
</target>
<child-name>fred</child-name>
<duplicate>true</duplicate>
</child-association>
</associations>
<overrides>
<property name="api:prop1">
<default>an overriden default value</default>
<constraints>
<constraint ref="api:stringLength2"/>
<constraint ref="api:regex2"/>
</constraints>
</property>
</overrides>
</type>
<type name="api:file-derived">
<parent>api:file</parent>
</type>
<type name="api:file-derived-no-archive">
<parent>api:file</parent>
<archive>false</archive>
</type>
<type name="api:folder">
<parent>api:base</parent>
<properties>
<property name="api:folderprop">
<type>d:text</type>
<protected>true</protected>
<default></default>
</property>
</properties>
</type>
<type name="api:enforced">
<parent>api:base</parent>
<properties>
<property name="api:mandatory-enforced">
<type>d:text</type>
<mandatory enforced="true">true</mandatory>
</property>
<property name="api:mandatory-not-enforced">
<type>d:text</type>
<mandatory enforced="false">true</mandatory>
</property>
<property name="api:mandatory-default-enforced">
<type>d:text</type>
<mandatory>true</mandatory>
</property>
</properties>
</type>
<type name="api:overridetype1">
<properties>
<property name="api:propoverride">
<type>d:text</type>
<default>one</default>
</property>
</properties>
</type>
<type name="api:overridetype2">
<parent>api:overridetype1</parent>
<overrides>
<property name="api:propoverride">
<default>two</default>
</property>
</overrides>
</type>
<type name="api:overridetype3">
<parent>api:overridetype2</parent>
<overrides>
<property name="api:propoverride">
<default>three</default>
</property>
</overrides>
</type>
<type name="api:typeWithNamedPropConstraint">
<title>Type with named property-defined constraint.</title>
<description>A type with a named constraint defined within one of its properties.</description>
<parent></parent>
<properties>
<property name="api:constrainedProp">
<type>d:text</type>
<protected>true</protected>
<default></default>
<constraints>
<constraint name="api:inlineConstraint" type="LIST">
<title>Inline constraint</title>
<description>An inline constraint</description>
<parameter name="allowedValues">
<list>
<value>ALPHA</value>
<value>BETA</value>
<value>GAMMA, DELTA</value>
<value>OMEGA</value>
</list>
</parameter>
<parameter name="caseSensitive"><value>true</value></parameter>
</constraint>
</constraints>
</property>
</properties>
</type>
</types>
<aspects>
<aspect name="api:referenceable">
<title>Referenceable</title>
<description>The referenceable aspect</description>
<parent></parent>
<properties>
<property name="api:id">
<type>d:int</type>
<protected>true</protected>
<mandatory>true</mandatory>
<constraints>
<constraint ref="api:minMax1"/>
</constraints>
</property>
</properties>
</aspect>
<aspect name="api:aspect-base">
<title>Aspect Base</title>
<parent></parent>
<properties>
<property name="api:aspect-base-p1">
<type>d:text</type>
<constraints>
<constraint ref="api:list1"/>
</constraints>
</property>
</properties>
</aspect>
<aspect name="api:aspect-one">
<title>Aspect One</title>
<parent>api:aspect-base</parent>
<overrides>
<property name="api:aspect-base-p1">
<constraints>
<constraint ref="api:list2"/>
</constraints>
</property>
</overrides>
</aspect>
<aspect name="api:aspect-two">
<title>Aspect Two</title>
<parent>api:aspect-base</parent>
<overrides>
<property name="api:aspect-base-p1">
<constraints>
<constraint ref="api:list1"/>
<constraint ref="api:list2"/>
</constraints>
</property>
</overrides>
</aspect>
<aspect name="test2:aspect-three">
<title>Aspect derived from other namespace</title>
<parent>api:aspect-base</parent>
<overrides>
<property name="api:aspect-base-p1">
<constraints>
<constraint ref="test2:list3"/>
</constraints>
</property>
</overrides>
</aspect>
<aspect name="test2:aspect-all">
<title>Aspect derived from other namespace</title>
<archive>false</archive>
<includedInSuperTypeQuery>false</includedInSuperTypeQuery>
<associations>
<association name="api:assoc-all">
<source>
<mandatory>true</mandatory>
<many>true</many>
</source>
<target>
<class>api:referenceable</class>
<mandatory>false</mandatory>
<many>false</many>
</target>
</association>
</associations>
<mandatory-aspects>
<aspect>test2:aspect-three</aspect>
<aspect>api:aspect-two</aspect>
<aspect>api:aspect-one</aspect>
</mandatory-aspects>
</aspect>
</aspects>
</model>

View File

@@ -5,6 +5,7 @@
<import uri="http://www.alfresco.org/model/content/1.0" prefix="cm"/>
<import uri="http://www.alfresco.org/model/dictionary/1.0" prefix="d"/>
<import uri="http://www.alfresco.org/model/content/smartfolder/1.0" prefix="smf"/>
<import uri="http://www.mycompany.com/model/finance/1.0" prefix="mycompany"/>
</imports>
<namespaces>
<namespace uri="http://www.test.com/model/account/1.0" prefix="test"/>
@@ -49,6 +50,9 @@
<overrides/>
<mandatory-aspects/>
</type>
<type name="test:publishable">
<parent>mycompany:doc</parent>
</type>
</types>
<aspects>
<aspect name="test:rescan">
@@ -74,7 +78,7 @@
<aspect name="test:smartFilter">
<title>Smart filter</title>
<description>Smart Filter</description>
<parent>cm:auditable</parent>
<parent>mycompany:testAspect</parent>
<properties/>
<associations/>
<overrides/>

View File

@@ -18,6 +18,7 @@
<value>models/people-api.xml</value>
<value>models/mycompany-model.xml</value>
<value>models/test-scan.xml</value>
<value>models/test-api-model.xml</value>
</list>
</property>
</bean>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo</artifactId>
<version>repo-5439v2-c1</version>
<version>11.13</version>
</parent>
<dependencies>
@@ -82,7 +82,7 @@
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
<version>3.11</version>
<version>3.12.0</version>
</dependency>
<dependency>
<groupId>commons-codec</groupId>
@@ -191,7 +191,7 @@
<dependency>
<groupId>org.apache.maven</groupId>
<artifactId>maven-artifact</artifactId>
<version>3.6.3</version>
<version>3.8.1</version>
</dependency>
<dependency>
<groupId>de.schlichtherle.truezip</groupId>
@@ -383,7 +383,7 @@
<dependency>
<groupId>com.fasterxml.woodstox</groupId>
<artifactId>woodstox-core</artifactId>
<version>6.2.4</version>
<version>6.2.6</version>
</dependency>
<!-- GData -->
@@ -687,6 +687,10 @@
<groupId>org.apache.camel</groupId>
<artifactId>camel-direct</artifactId>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-management</artifactId>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-mock</artifactId>
@@ -806,7 +810,7 @@
<dependency>
<groupId>commons-net</groupId>
<artifactId>commons-net</artifactId>
<version>3.7.2</version>
<version>3.8.0</version>
<scope>test</scope>
</dependency>
<dependency>
@@ -861,6 +865,12 @@
<version>${dependency.awaitility.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.reflections</groupId>
<artifactId>reflections</artifactId>
<version>0.9.12</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>

View File

@@ -40,6 +40,7 @@ public class IdsEntity
private Long idThree;
private Long idFour;
private List<Long> ids;
private boolean ordered;
public Long getIdOne()
{
return idOne;
@@ -80,4 +81,12 @@ public class IdsEntity
{
this.ids = ids;
}
public boolean isOrdered()
{
return ordered;
}
public void setOrdered(boolean ordered)
{
this.ordered = ordered;
}
}

View File

@@ -563,6 +563,7 @@ public abstract class AbstractNodeDAOImpl implements NodeDAO, BatchingDAO
Long txnId = txn.getId();
// Update it
Long now = System.currentTimeMillis();
txn.setCommitTimeMs(now);
updateTransaction(txnId, now);
}
}
@@ -604,6 +605,17 @@ public abstract class AbstractNodeDAOImpl implements NodeDAO, BatchingDAO
return txn;
}
public Long getCurrentTransactionCommitTime()
{
Long commitTime = null;
TransactionEntity resource = AlfrescoTransactionSupport.getResource(KEY_TRANSACTION);
if(resource != null)
{
commitTime = resource.getCommitTimeMs();
}
return commitTime;
}
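`AlfrescoTransactionSupport` binds named resources (here the current `TransactionEntity`) to the running transaction, so the lookup above returns null until a node modification registers one. A rough, purely illustrative stand-in using a `ThreadLocal` registry:

```java
// Purely illustrative stand-in for the transaction-scoped resource lookup
// above: AlfrescoTransactionSupport binds resources to the current
// transaction, modelled here with a ThreadLocal. Names are assumptions.
public class TxnResourceSketch
{
    private static final ThreadLocal<Long> COMMIT_TIME = new ThreadLocal<>();

    // analogous to registering the TransactionEntity under KEY_TRANSACTION
    static void bind(Long commitTimeMs)
    {
        COMMIT_TIME.set(commitTimeMs);
    }

    static Long getCurrentTransactionCommitTime()
    {
        return COMMIT_TIME.get(); // null when nothing was registered in this "transaction"
    }

    public static void main(String[] args)
    {
        System.out.println(getCurrentTransactionCommitTime());
        bind(1_619_000_000_000L);
        System.out.println(getCurrentTransactionCommitTime());
    }
}
```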
public Long getCurrentTransactionId(boolean ensureNew)
{
TransactionEntity txn;
@@ -1483,7 +1495,17 @@ public abstract class AbstractNodeDAOImpl implements NodeDAO, BatchingDAO
// Update ACLs for moved tree
Long newParentAclId = newParentNode.getAclId();
// Verify if parent has aspect applied and ACL's are pending
if (hasNodeAspect(oldParentNodeId, ContentModel.ASPECT_PENDING_FIX_ACL))
{
Long oldParentSharedAclId = (Long) this.getNodeProperty(oldParentNodeId, ContentModel.PROP_SHARED_ACL_TO_REPLACE);
accessControlListDAO.updateInheritance(newChildNodeId, oldParentSharedAclId, newParentAclId);
}
else
{
accessControlListDAO.updateInheritance(newChildNodeId, oldParentAclId, newParentAclId);
}
}
// Done
@@ -2746,6 +2768,22 @@ public abstract class AbstractNodeDAOImpl implements NodeDAO, BatchingDAO
selectNodesWithAspects(qnameIds, minNodeId, maxNodeId, resultsCallback);
}
@Override
public void getNodesWithAspects(
Set<QName> aspectQNames,
Long minNodeId, Long maxNodeId, boolean ordered,
NodeRefQueryCallback resultsCallback)
{
Set<Long> qnameIdsSet = qnameDAO.convertQNamesToIds(aspectQNames, false);
if (qnameIdsSet.size() == 0)
{
// No point running a query
return;
}
List<Long> qnameIds = new ArrayList<Long>(qnameIdsSet);
selectNodesWithAspects(qnameIds, minNodeId, maxNodeId, ordered, resultsCallback);
}
/**
* @return Returns a writable copy of the cached aspects set
*/
@@ -4917,6 +4955,10 @@ public abstract class AbstractNodeDAOImpl implements NodeDAO, BatchingDAO
List<Long> qnameIds,
Long minNodeId, Long maxNodeId,
NodeRefQueryCallback resultsCallback);
protected abstract void selectNodesWithAspects(
List<Long> qnameIds,
Long minNodeId, Long maxNodeId, boolean ordered,
NodeRefQueryCallback resultsCallback);
protected abstract Long insertNodeAssoc(Long sourceNodeId, Long targetNodeId, Long assocTypeQNameId, int assocIndex);
protected abstract int updateNodeAssoc(Long id, int assocIndex);
protected abstract int deleteNodeAssoc(Long sourceNodeId, Long targetNodeId, Long assocTypeQNameId);

View File

@@ -75,6 +75,13 @@ public interface NodeDAO extends NodeBulkLoader
/*
* Transaction
*/
/**
* @return the commit time of the current transaction entry or <tt>null</tt> if
* there have not been any modifications to nodes registered in the
* transaction.
*/
Long getCurrentTransactionCommitTime();
/**
* @param ensureNew <tt>true</tt> to ensure that a new transaction entry is created
@@ -405,6 +412,20 @@ public interface NodeDAO extends NodeBulkLoader
Long minNodeId, Long maxNodeId,
NodeRefQueryCallback resultsCallback);
/**
* Get nodes with aspects between the given ranges, ordering the results optionally
*
* @param aspectQNames the aspects that must be on the nodes
* @param minNodeId the minimum node ID (inclusive)
* @param maxNodeId the maximum node ID (exclusive)
* @param ordered if the results are to be ordered by nodeID
* @param resultsCallback callback to process results
*/
public void getNodesWithAspects(
Set<QName> aspectQNames,
Long minNodeId, Long maxNodeId, boolean ordered,
NodeRefQueryCallback resultsCallback);
/*
* Node Assocs
*/
@@ -913,14 +934,14 @@ public interface NodeDAO extends NodeBulkLoader
* @param toNodeId Final node id
* @return maximum commit time
*/
public Long getMaxTxInNodeIdRange(Long fromNodeId, Long toNodeId);
/**
* Gets the next commit time from [fromCommitTime]
*
* @param fromCommitTime Initial commit time
* @return next commit time
*/
public Long getNextTxCommitTime(Long fromCommitTime);
}

View File

@@ -427,10 +427,6 @@ public class NodeDAOImpl extends AbstractNodeDAOImpl
NodeEntity node = new NodeEntity();
node.setId(id);
return template.selectOne(SELECT_NODE_BY_ID, node);
}
@@ -454,10 +450,6 @@ public class NodeDAOImpl extends AbstractNodeDAOImpl
}
node.setUuid(uuid);
return template.selectOne(SELECT_NODE_BY_NODEREF, node);
}
@@ -772,6 +764,31 @@ public class NodeDAOImpl extends AbstractNodeDAOImpl
template.select(SELECT_NODES_WITH_ASPECT_IDS, parameters, resultHandler);
}
@Override
protected void selectNodesWithAspects(
List<Long> qnameIds,
Long minNodeId, Long maxNodeId, boolean ordered,
final NodeRefQueryCallback resultsCallback)
{
@SuppressWarnings("rawtypes")
ResultHandler resultHandler = new ResultHandler()
{
public void handleResult(ResultContext context)
{
NodeEntity entity = (NodeEntity) context.getResultObject();
Pair<Long, NodeRef> nodePair = new Pair<Long, NodeRef>(entity.getId(), entity.getNodeRef());
resultsCallback.handle(nodePair);
}
};
IdsEntity parameters = new IdsEntity();
parameters.setIdOne(minNodeId);
parameters.setIdTwo(maxNodeId);
parameters.setIds(qnameIds);
parameters.setOrdered(ordered);
template.select(SELECT_NODES_WITH_ASPECT_IDS, parameters, resultHandler);
}
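The method above shows the row-callback pattern: rather than materialising a full result list, each row is converted to an (id, NodeRef) pair and pushed to the caller's callback, with ordering applied only when requested. A self-contained sketch of that pattern with simplified stand-in types:

```java
import java.util.List;
import java.util.function.Consumer;

// Sketch of the row-callback pattern behind selectNodesWithAspects: each
// (id, nodeRef) row in the half-open [minId, maxId) range is handed to the
// caller's callback, optionally in id order, instead of building a list.
// NodePair is a simplified stand-in for the NodeEntity / NodeRef pair.
public class CallbackSketch
{
    record NodePair(long id, String nodeRef) {}

    static void selectNodes(List<NodePair> rows, long minId, long maxId,
                            boolean ordered, Consumer<NodePair> callback)
    {
        var matching = rows.stream().filter(n -> n.id() >= minId && n.id() < maxId);
        if (ordered)
        {
            matching = matching.sorted((a, b) -> Long.compare(a.id(), b.id()));
        }
        matching.forEach(callback);
    }

    public static void main(String[] args)
    {
        List<NodePair> rows = List.of(new NodePair(3, "c"), new NodePair(1, "a"), new NodePair(2, "b"));
        StringBuilder visited = new StringBuilder();
        selectNodes(rows, 1, 3, true, n -> visited.append(n.nodeRef()));
        System.out.println(visited);
    }
}
```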
@Override
protected Long insertNodeAssoc(Long sourceNodeId, Long targetNodeId, Long assocTypeQNameId, int assocIndex)
{

View File

@@ -337,6 +337,13 @@ public class ADMAccessControlListDAO implements AccessControlListDAO
setFixedAcls(getNodeIdNotNull(parent), inheritFrom, null, sharedAclToReplace, changes, false, asyncCall, true);
return changes;
}
public List<AclChange> setInheritanceForChildren(NodeRef parent, Long inheritFrom, Long sharedAclToReplace, boolean asyncCall, boolean forceSharedACL)
{
List<AclChange> changes = new ArrayList<AclChange>();
setFixedAcls(getNodeIdNotNull(parent), inheritFrom, null, sharedAclToReplace, changes, false, asyncCall, true, forceSharedACL);
return changes;
}
public void updateChangedAcls(NodeRef startingPoint, List<AclChange> changes)
{
@@ -362,6 +369,29 @@ public class ADMAccessControlListDAO implements AccessControlListDAO
setFixedAcls(nodeId, inheritFrom, mergeFrom, sharedAclToReplace, changes, set, false, true);
}
/**
* Support to set a shared ACL on a node and all of its children
*
* @param nodeId
* the parent node
* @param inheritFrom
* the parent node's ACL
* @param mergeFrom
* the shared ACL, if already known. If <code>null</code>, will be retrieved / created lazily
* @param changes
* the list in which to record changes
* @param set
* set the shared ACL on the parent?
* @param asyncCall
* the call may need to be asynchronous depending on execution time; if the time exceeds the configured <code>fixedAclMaxTransactionTime</code> value,
* recursion is stopped via the propagateOnChildren parameter (set to false), and nodes whose processing did not complete
* normally will carry ASPECT_PENDING_FIX_ACL, which is used by {@link FixedAclUpdater} for later processing
*/
public void setFixedAcls(Long nodeId, Long inheritFrom, Long mergeFrom, Long sharedAclToReplace, List<AclChange> changes, boolean set, boolean asyncCall, boolean propagateOnChildren)
{
setFixedAcls(nodeId, inheritFrom, mergeFrom, sharedAclToReplace, changes, set, asyncCall, propagateOnChildren, false);
}
/**
* Support to set a shared ACL on a node and all of its children
*
@@ -379,8 +409,10 @@ public class ADMAccessControlListDAO implements AccessControlListDAO
* the call may need to be asynchronous depending on execution time; if the time exceeds the configured <code>fixedAclMaxTransactionTime</code> value,
* recursion is stopped via the propagateOnChildren parameter (set to false), and nodes whose processing did not complete
* normally will carry ASPECT_PENDING_FIX_ACL, which is used by {@link FixedAclUpdater} for later processing
* @param forceSharedACL
* When a child node has an unexpected ACL, force it to assume the new shared ACL instead of throwing a concurrency exception.
*/
public void setFixedAcls(Long nodeId, Long inheritFrom, Long mergeFrom, Long sharedAclToReplace, List<AclChange> changes, boolean set, boolean asyncCall, boolean propagateOnChildren)
public void setFixedAcls(Long nodeId, Long inheritFrom, Long mergeFrom, Long sharedAclToReplace, List<AclChange> changes, boolean set, boolean asyncCall, boolean propagateOnChildren, boolean forceSharedACL)
{
if (log.isDebugEnabled())
{
@@ -431,14 +463,14 @@ public class ADMAccessControlListDAO implements AccessControlListDAO
if (acl == null)
{
propagateOnChildren = setFixAclPending(child.getId(), inheritFrom, mergeFrom, sharedAclToReplace, changes, false, asyncCall, propagateOnChildren);
propagateOnChildren = setFixAclPending(child.getId(), inheritFrom, mergeFrom, sharedAclToReplace, changes, false, asyncCall, propagateOnChildren, forceSharedACL);
}
else
{
// Still has old shared ACL or already replaced
if(acl.equals(sharedAclToReplace) || acl.equals(mergeFrom) || acl.equals(currentAcl))
{
propagateOnChildren = setFixAclPending(child.getId(), inheritFrom, mergeFrom, sharedAclToReplace, changes, false, asyncCall, propagateOnChildren);
propagateOnChildren = setFixAclPending(child.getId(), inheritFrom, mergeFrom, sharedAclToReplace, changes, false, asyncCall, propagateOnChildren, forceSharedACL);
}
else
{
@@ -457,7 +489,20 @@ public class ADMAccessControlListDAO implements AccessControlListDAO
}
else if (dbAcl.getAclType() == ACLType.SHARED)
{
throw new ConcurrencyFailureException("setFixedAcls: unexpected shared acl: "+dbAcl);
if (forceSharedACL)
{
log.warn("Forcing shared ACL on node: " + child.getId() + " ( "
+ nodeDAO.getNodePair(child.getId()).getSecond() + ") - " + dbAcl);
sharedAclToReplace = acl;
propagateOnChildren = setFixAclPending(child.getId(), inheritFrom, mergeFrom, sharedAclToReplace,
changes, false, asyncCall, propagateOnChildren, forceSharedACL);
}
else
{
throw new ConcurrencyFailureException(
"setFixedAcls: unexpected shared acl: " + dbAcl + " on node " + child.getId() + " ( "
+ nodeDAO.getNodePair(child.getId()).getSecond() + ")");
}
}
}
}
@@ -506,7 +551,7 @@ public class ADMAccessControlListDAO implements AccessControlListDAO
*
*/
private boolean setFixAclPending(Long nodeId, Long inheritFrom, Long mergeFrom, Long sharedAclToReplace,
List<AclChange> changes, boolean set, boolean asyncCall, boolean propagateOnChildren)
List<AclChange> changes, boolean set, boolean asyncCall, boolean propagateOnChildren, boolean forceSharedACL)
{
// check transaction time
long transactionStartTime = AlfrescoTransactionSupport.getTransactionStartTime();
@@ -514,7 +559,7 @@ public class ADMAccessControlListDAO implements AccessControlListDAO
if (transactionTime < fixedAclMaxTransactionTime)
{
// make regular method call if time is under max transaction configured time
setFixedAcls(nodeId, inheritFrom, mergeFrom, sharedAclToReplace, changes, set, asyncCall, propagateOnChildren);
setFixedAcls(nodeId, inheritFrom, mergeFrom, sharedAclToReplace, changes, set, asyncCall, propagateOnChildren, forceSharedACL);
return true;
}

View File

@@ -91,6 +91,11 @@ public interface AccessControlListDAO
*/
public List<AclChange> setInheritanceForChildren(NodeRef parent, Long inheritFrom, Long sharedAclToReplace, boolean asyncCall);
/**
* Set the inheritance on a given node and its children. If an unexpected ACL is found on a child, it can be overridden by setting forceSharedACL
*/
public List<AclChange> setInheritanceForChildren(NodeRef parent, Long inheritFrom, Long sharedAclToReplace, boolean asyncCall, boolean forceSharedACL);
public Long getIndirectAcl(NodeRef nodeRef);
public Long getInheritedAcl(NodeRef nodeRef);

View File

@@ -1,28 +1,28 @@
/*
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2016 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.domain.permissions;
import java.util.List;
@@ -182,4 +182,11 @@ public interface AclDAO
* @return Long
*/
public Long getMaxChangeSetIdByCommitTime(long maxCommitTime);
/**
* @return the commit time of the current ACL change set entry or <tt>null</tt> if
* there have not been any modifications.
*/
public Long getCurrentChangeSetCommitTime();
}

View File

@@ -1,28 +1,28 @@
/*
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2016 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.domain.permissions;
import java.io.Serializable;
@@ -1637,6 +1637,7 @@ public class AclDAOImpl implements AclDAO
}
private static final String RESOURCE_KEY_ACL_CHANGE_SET_ID = "acl.change.set.id";
private static final String RESOURCE_KEY_ACL_CHANGE_SET_COMMIT_TIME_MS = "acl.change.commit.set.time.ms";
private UpdateChangeSetListener updateChangeSetListener = new UpdateChangeSetListener();
/**
@@ -1662,9 +1663,17 @@ public class AclDAOImpl implements AclDAO
}
// Update it
long commitTimeMs = System.currentTimeMillis();
AlfrescoTransactionSupport.bindResource(RESOURCE_KEY_ACL_CHANGE_SET_COMMIT_TIME_MS, commitTimeMs);
aclCrudDAO.updateAclChangeSet(changeSetId, commitTimeMs);
}
}
@Override
public Long getCurrentChangeSetCommitTime()
{
return AlfrescoTransactionSupport.getResource(RESOURCE_KEY_ACL_CHANGE_SET_COMMIT_TIME_MS);
}
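`getCurrentChangeSetCommitTime()` works because the commit time was bound to the current transaction under `RESOURCE_KEY_ACL_CHANGE_SET_COMMIT_TIME_MS` when the change set was updated. A simplified thread-local stand-in for that bind/read resource pattern (this is a sketch, not the real `AlfrescoTransactionSupport`, which also handles transaction lifecycle):

```java
import java.util.HashMap;
import java.util.Map;

public class TxResourceDemo {
    // Thread-local key/value store standing in for per-transaction resources:
    // bind a value once during the transaction, read it back later in the same thread.
    private static final ThreadLocal<Map<String, Object>> RESOURCES =
            ThreadLocal.withInitial(HashMap::new);

    static void bindResource(String key, Object value) {
        RESOURCES.get().put(key, value);
    }

    @SuppressWarnings("unchecked")
    static <T> T getResource(String key) {
        // Returns null when nothing was bound, matching the "no modifications" case.
        return (T) RESOURCES.get().get(key);
    }
}
```

The null return is meaningful: callers such as `getCurrentChangeSetCommitTime()` use it to detect that no ACL change set was touched in this transaction.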
/**
* Support to get the current ACL change set and bind it to the transaction, so we only make one new version of an
* ACL per change set. If something is in the current change set we can update it.

View File

@@ -35,9 +35,12 @@ import java.util.List;
import java.util.Set;
import java.util.concurrent.atomic.AtomicBoolean;
import com.google.common.collect.Sets;
import org.alfresco.model.ContentModel;
import org.alfresco.repo.batch.BatchProcessWorkProvider;
import org.alfresco.repo.batch.BatchProcessor;
import org.alfresco.repo.batch.BatchProcessor.BatchProcessWorker;
import org.alfresco.repo.domain.node.NodeDAO;
import org.alfresco.repo.domain.node.NodeDAO.NodeRefQueryCallback;
import org.alfresco.repo.lock.JobLockService;
@@ -50,6 +53,7 @@ import org.alfresco.repo.security.authentication.AuthenticationUtil.RunAsWork;
import org.alfresco.repo.security.permissions.PermissionServicePolicies;
import org.alfresco.repo.security.permissions.PermissionServicePolicies.OnInheritPermissionsDisabled;
import org.alfresco.repo.transaction.AlfrescoTransactionSupport;
import org.alfresco.repo.transaction.RetryingTransactionHelper;
import org.alfresco.repo.transaction.RetryingTransactionHelper.RetryingTransactionCallback;
import org.alfresco.repo.transaction.TransactionListenerAdapter;
import org.alfresco.service.cmr.repository.NodeRef;
@@ -64,6 +68,8 @@ import org.apache.commons.logging.LogFactory;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.dao.ConcurrencyFailureException;
/**
* Finds nodes with ASPECT_PENDING_FIX_ACL aspect and sets fixed ACLs for them
@@ -79,18 +85,22 @@ public class FixedAclUpdater extends TransactionListenerAdapter implements Appli
public static final String FIXED_ACL_ASYNC_REQUIRED_KEY = "FIXED_ACL_ASYNC_REQUIRED";
public static final String FIXED_ACL_ASYNC_CALL_KEY = "FIXED_ACL_ASYNC_CALL";
protected static final QName LOCK_Q_NAME = QName.createQName(NamespaceService.SYSTEM_MODEL_1_0_URI, "FixedAclUpdater");
/** A set of listeners to receive callback events whenever permissions are updated by this class. */
private static Set<FixedAclUpdaterListener> listeners = Sets.newConcurrentHashSet();
private ApplicationContext applicationContext;
private JobLockService jobLockService;
private TransactionService transactionService;
private AccessControlListDAO accessControlListDAO;
private NodeDAO nodeDAO;
private QName lockQName = QName.createQName(NamespaceService.SYSTEM_MODEL_1_0_URI, "FixedAclUpdater");
private long lockTimeToLive = 10000;
private long lockRefreshTime = lockTimeToLive / 2;
private int maxItemBatchSize = 100;
private int numThreads = 4;
private boolean forceSharedACL = false;
private ClassPolicyDelegate<OnInheritPermissionsDisabled> onInheritPermissionsDisabledDelegate;
private PolicyComponent policyComponent;
@@ -132,6 +142,11 @@ public class FixedAclUpdater extends TransactionListenerAdapter implements Appli
this.maxItemBatchSize = maxItemBatchSize;
}
public void setForceSharedACL(boolean forceSharedACL)
{
this.forceSharedACL = forceSharedACL;
}
public void setLockTimeToLive(long lockTimeToLive)
{
this.lockTimeToLive = lockTimeToLive;
@@ -148,6 +163,12 @@ public class FixedAclUpdater extends TransactionListenerAdapter implements Appli
this.policyIgnoreUtil = policyIgnoreUtil;
}
/** Register a {@link FixedAclUpdaterListener} to be notified when a node is updated by an instance of this class. */
public static void registerListener(FixedAclUpdaterListener listener)
{
listeners.add(listener);
}
public void init()
{
onInheritPermissionsDisabledDelegate = policyComponent
@@ -182,7 +203,7 @@ public class FixedAclUpdater extends TransactionListenerAdapter implements Appli
public List<NodeRef> execute() throws Throwable
{
getNodesCallback.init();
nodeDAO.getNodesWithAspects(aspects, getNodesCallback.getMinNodeId(), null, getNodesCallback);
nodeDAO.getNodesWithAspects(aspects, getNodesCallback.getMinNodeId(), null, true, getNodesCallback);
getNodesCallback.done();
return getNodesCallback.getNodes();
@@ -231,7 +252,7 @@ public class FixedAclUpdater extends TransactionListenerAdapter implements Appli
}
}
private class AclWorker implements BatchProcessor.BatchProcessWorker<NodeRef>
protected class AclWorker implements BatchProcessor.BatchProcessWorker<NodeRef>
{
private Set<QName> aspects = new HashSet<>(1);
@@ -253,7 +274,7 @@ public class FixedAclUpdater extends TransactionListenerAdapter implements Appli
{
}
public void process(final NodeRef nodeRef) throws Throwable
public void process(final NodeRef nodeRef)
{
RunAsWork<Void> findAndUpdateAclRunAsWork = new RunAsWork<Void>()
{
@@ -265,36 +286,48 @@ public class FixedAclUpdater extends TransactionListenerAdapter implements Appli
log.debug(String.format("Processing node %s", nodeRef));
}
final Long nodeId = nodeDAO.getNodePair(nodeRef).getFirst();
// MNT-22009 - If node was deleted and in archive store, remove the aspect and properties and do not
// process
if (nodeRef.getStoreRef().equals(StoreRef.STORE_REF_ARCHIVE_SPACESSTORE))
try
{
final Long nodeId = nodeDAO.getNodePair(nodeRef).getFirst();
// MNT-22009 - If node was deleted and in archive store, remove the aspect and properties and do
// not
// process
if (nodeRef.getStoreRef().equals(StoreRef.STORE_REF_ARCHIVE_SPACESSTORE))
{
accessControlListDAO.removePendingAclAspect(nodeId);
return null;
}
// retrieve acl properties from node
Long inheritFrom = (Long) nodeDAO.getNodeProperty(nodeId, ContentModel.PROP_INHERIT_FROM_ACL);
Long sharedAclToReplace = (Long) nodeDAO.getNodeProperty(nodeId, ContentModel.PROP_SHARED_ACL_TO_REPLACE);
// set inheritance using retrieved prop
accessControlListDAO.setInheritanceForChildren(nodeRef, inheritFrom, sharedAclToReplace, true,
forceSharedACL);
// Remove aspect
accessControlListDAO.removePendingAclAspect(nodeId);
return null;
if (!policyIgnoreUtil.ignorePolicy(nodeRef))
{
boolean transformedToAsyncOperation = toBoolean((Boolean) AlfrescoTransactionSupport
.getResource(FixedAclUpdater.FIXED_ACL_ASYNC_REQUIRED_KEY));
OnInheritPermissionsDisabled onInheritPermissionsDisabledPolicy = onInheritPermissionsDisabledDelegate
.get(ContentModel.TYPE_BASE);
onInheritPermissionsDisabledPolicy.onInheritPermissionsDisabled(nodeRef, transformedToAsyncOperation);
}
}
// retrieve acl properties from node
Long inheritFrom = (Long) nodeDAO.getNodeProperty(nodeId, ContentModel.PROP_INHERIT_FROM_ACL);
Long sharedAclToReplace = (Long) nodeDAO.getNodeProperty(nodeId, ContentModel.PROP_SHARED_ACL_TO_REPLACE);
// set inheritance using retrieved prop
accessControlListDAO.setInheritanceForChildren(nodeRef, inheritFrom, sharedAclToReplace, true);
// Remove aspect
accessControlListDAO.removePendingAclAspect(nodeId);
if (!policyIgnoreUtil.ignorePolicy(nodeRef))
catch (Exception e)
{
boolean transformedToAsyncOperation = toBoolean(
(Boolean) AlfrescoTransactionSupport.getResource(FixedAclUpdater.FIXED_ACL_ASYNC_REQUIRED_KEY));
OnInheritPermissionsDisabled onInheritPermissionsDisabledPolicy = onInheritPermissionsDisabledDelegate
.get(ContentModel.TYPE_BASE);
onInheritPermissionsDisabledPolicy.onInheritPermissionsDisabled(nodeRef, transformedToAsyncOperation);
log.error("Job could not process pending ACL node " + nodeRef, e);
}
listeners.forEach(listener -> listener.permissionsUpdatedAsynchronously(nodeRef));
if (log.isDebugEnabled())
{
log.debug(String.format("Node processed %s", nodeRef));
@@ -308,8 +341,15 @@ public class FixedAclUpdater extends TransactionListenerAdapter implements Appli
AuthenticationUtil.runAs(findAndUpdateAclRunAsWork, AuthenticationUtil.getSystemUserName());
}
};
private class GetNodesWithAspectCallback implements NodeRefQueryCallback
/** Create a new AclWorker. */
protected AclWorker createAclWorker()
{
return new AclWorker();
}
class GetNodesWithAspectCallback implements NodeRefQueryCallback
{
private List<NodeRef> nodes = new ArrayList<>();
private long minNodeId;
@@ -400,11 +440,11 @@ public class FixedAclUpdater extends TransactionListenerAdapter implements Appli
try
{
lockToken = jobLockService.getLock(lockQName, lockTimeToLive, 0, 1);
jobLockService.refreshLock(lockToken, lockQName, lockRefreshTime, jobLockRefreshCallback);
lockToken = jobLockService.getLock(LOCK_Q_NAME, lockTimeToLive, 0, 1);
jobLockService.refreshLock(lockToken, LOCK_Q_NAME, lockRefreshTime, jobLockRefreshCallback);
AclWorkProvider provider = new AclWorkProvider();
AclWorker worker = new AclWorker();
AclWorker worker = createAclWorker();
BatchProcessor<NodeRef> bp = new BatchProcessor<>("FixedAclUpdater",
transactionService.getRetryingTransactionHelper(), provider, numThreads, maxItemBatchSize, applicationContext,
log, 100);
@@ -421,7 +461,7 @@ public class FixedAclUpdater extends TransactionListenerAdapter implements Appli
jobLockRefreshCallback.isActive.set(false);
if (lockToken != null)
{
jobLockService.releaseLock(lockToken, lockQName);
jobLockService.releaseLock(lockToken, LOCK_Q_NAME);
}
}
}

View File

@@ -1,38 +1,35 @@
/*
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2016 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.search.impl;
/**
* Json returned from Solr
*
* @author Gethin James
* @since 5.0
*/
public interface JSONResult
{
public Long getQueryTime();
public long getNumberFound();
}
package org.alfresco.repo.domain.permissions;
import org.alfresco.service.cmr.repository.NodeRef;
/** Listener to receive callback events when permissions are asynchronously updated with the {@link FixedAclUpdater}. */
public interface FixedAclUpdaterListener
{
/** Callback method for when permissions have been updated by the FixedAclUpdater. */
void permissionsUpdatedAsynchronously(NodeRef nodeRef);
}
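`FixedAclUpdater` holds these listeners in a static concurrent set (`Sets.newConcurrentHashSet()`) and notifies each one after a pending node is processed. A self-contained sketch of that registry, using the JDK's `ConcurrentHashMap.newKeySet()` instead of the Guava helper (names are illustrative):

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class ListenerRegistryDemo {
    interface Listener {
        void updated(String nodeRef);
    }

    // Concurrent set so listeners can register while a batch worker thread is
    // iterating and notifying, without ConcurrentModificationException.
    private static final Set<Listener> LISTENERS = ConcurrentHashMap.newKeySet();

    static void register(Listener l) {
        LISTENERS.add(l);
    }

    static void notifyUpdated(String nodeRef) {
        LISTENERS.forEach(l -> l.updated(nodeRef));
    }
}
```

A concurrent set is the right choice here because the updater's batch processor runs on multiple threads while tests or other components may register listeners at any time.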

View File

@@ -416,11 +416,13 @@ public class EventConsolidator implements EventSupportedPolicies
}
// Get before values that changed
Map<K, V> beforeDelta = new HashMap<>(before);
Map<K, V> afterDelta = new HashMap<>(after);
beforeDelta.entrySet().removeAll(after.entrySet());
// Add nulls for before properties
Set<K> beforeKeys = before.keySet();
Set<K> newKeys = after.keySet();
Set<K> newKeys = afterDelta.keySet();
newKeys.removeAll(beforeKeys);
for (K key : newKeys)
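The hunk above computes the "before" side of a property-update event: copy the before map, remove every `(key, value)` entry that is still identical after the change, then give any newly added keys a null "before" value. A self-contained sketch of that delta computation (method names are illustrative, not the `EventConsolidator` API):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class PropertyDeltaDemo {
    // Entries whose value changed or was removed: copy "before", then drop every
    // (key, value) pair still present unchanged in "after". Removing from the
    // entrySet view removes from the backing map.
    static <K, V> Map<K, V> changedBefore(Map<K, V> before, Map<K, V> after) {
        Map<K, V> delta = new HashMap<>(before);
        delta.entrySet().removeAll(after.entrySet());
        return delta;
    }

    // Keys that exist only in "after" get an explicit null "before" value,
    // so the event payload shows them as newly added.
    static <K, V> Map<K, V> withNullsForNewKeys(Map<K, V> beforeDelta, Map<K, V> before, Map<K, V> after) {
        Map<K, V> result = new HashMap<>(beforeDelta);
        Set<K> newKeys = new HashSet<>(after.keySet());
        newKeys.removeAll(before.keySet());
        for (K key : newKeys) {
            result.put(key, null);
        }
        return result;
    }
}
```

Note the fix in the hunk: deriving `newKeys` from the delta rather than mutating `after.keySet()` directly avoids corrupting the caller's map, since `keySet()` is a live view.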

View File

@@ -27,12 +27,16 @@ package org.alfresco.repo.event2;
import java.io.Serializable;
import java.net.URI;
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.util.Deque;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.UUID;
import org.alfresco.repo.domain.node.NodeDAO;
import org.alfresco.repo.domain.node.TransactionEntity;
import org.alfresco.repo.event.v1.model.EventType;
import org.alfresco.repo.event.v1.model.RepoEvent;
import org.alfresco.repo.event2.filter.ChildAssociationTypeFilter;
@@ -54,7 +58,6 @@ import org.alfresco.repo.policy.JavaBehaviour;
import org.alfresco.repo.policy.PolicyComponent;
import org.alfresco.repo.security.authentication.AuthenticationUtil;
import org.alfresco.repo.transaction.AlfrescoTransactionSupport;
import org.alfresco.repo.transaction.RetryingTransactionHelper.RetryingTransactionCallback;
import org.alfresco.service.cmr.dictionary.DictionaryService;
import org.alfresco.service.cmr.repository.AssociationRef;
import org.alfresco.service.cmr.repository.ChildAssociationRef;
@@ -90,11 +93,12 @@ public class EventGenerator extends AbstractLifecycleBean implements Initializin
protected DictionaryService dictionaryService;
private DescriptorService descriptorService;
private EventFilterRegistry eventFilterRegistry;
private Event2MessageProducer event2MessageProducer;
private TransactionService transactionService;
private PersonService personService;
protected NodeResourceHelper nodeResourceHelper;
protected NodeDAO nodeDAO;
private EventGeneratorQueue eventGeneratorQueue;
private NodeTypeFilter nodeTypeFilter;
private ChildAssociationTypeFilter childAssociationTypeFilter;
private EventUserFilter userFilter;
@@ -109,10 +113,11 @@ public class EventGenerator extends AbstractLifecycleBean implements Initializin
PropertyCheck.mandatory(this, "dictionaryService", dictionaryService);
PropertyCheck.mandatory(this, "descriptorService", descriptorService);
PropertyCheck.mandatory(this, "eventFilterRegistry", eventFilterRegistry);
PropertyCheck.mandatory(this, "event2MessageProducer", event2MessageProducer);
PropertyCheck.mandatory(this, "transactionService", transactionService);
PropertyCheck.mandatory(this, "personService", personService);
PropertyCheck.mandatory(this, "nodeResourceHelper", nodeResourceHelper);
PropertyCheck.mandatory(this, "nodeDAO", nodeDAO);
PropertyCheck.mandatory(this, "eventGeneratorQueue", eventGeneratorQueue);
this.nodeTypeFilter = eventFilterRegistry.getNodeTypeFilter();
this.childAssociationTypeFilter = eventFilterRegistry.getChildAssociationTypeFilter();
@@ -145,6 +150,11 @@ public class EventGenerator extends AbstractLifecycleBean implements Initializin
new JavaBehaviour(this, "beforeDeleteAssociation"));
}
public void setNodeDAO(NodeDAO nodeDAO)
{
this.nodeDAO = nodeDAO;
}
public void setPolicyComponent(PolicyComponent policyComponent)
{
this.policyComponent = policyComponent;
@@ -177,12 +187,6 @@ public class EventGenerator extends AbstractLifecycleBean implements Initializin
this.eventFilterRegistry = eventFilterRegistry;
}
@SuppressWarnings("unused")
public void setEvent2MessageProducer(Event2MessageProducer event2MessageProducer)
{
this.event2MessageProducer = event2MessageProducer;
}
public void setTransactionService(TransactionService transactionService)
{
this.transactionService = transactionService;
@@ -198,6 +202,11 @@ public class EventGenerator extends AbstractLifecycleBean implements Initializin
this.nodeResourceHelper = nodeResourceHelper;
}
public void setEventGeneratorQueue(EventGeneratorQueue eventGeneratorQueue)
{
this.eventGeneratorQueue = eventGeneratorQueue;
}
@Override
public void onCreateNode(ChildAssociationRef childAssocRef)
{
@@ -368,15 +377,22 @@ public class EventGenerator extends AbstractLifecycleBean implements Initializin
return (childAssociationTypeFilter.isExcluded(childAssocType) || (userFilter.isExcluded(user)));
}
private EventInfo getEventInfo(String user)
protected EventInfo getEventInfo(String user)
{
return new EventInfo().setTimestamp(ZonedDateTime.now())
return new EventInfo().setTimestamp(getCurrentTransactionTimestamp())
.setId(UUID.randomUUID().toString())
.setTxnId(AlfrescoTransactionSupport.getTransactionId())
.setPrincipal(user)
.setSource(URI.create("/" + descriptorService.getCurrentRepositoryDescriptor().getId()));
}
private ZonedDateTime getCurrentTransactionTimestamp()
{
Long currentTransactionCommitTime = nodeDAO.getCurrentTransactionCommitTime();
Instant commitTimeMs = Instant.ofEpochMilli(currentTransactionCommitTime);
return ZonedDateTime.ofInstant(commitTimeMs, ZoneOffset.UTC);
}
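`getCurrentTransactionTimestamp()` turns the transaction's commit time, stored as epoch milliseconds, into a UTC `ZonedDateTime` so every event in the same transaction carries the same timestamp. The conversion in isolation:

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

public class CommitTimeDemo {
    // Convert an epoch-millisecond commit time into a UTC ZonedDateTime,
    // as getCurrentTransactionTimestamp does with the NodeDAO commit time.
    static ZonedDateTime toUtcTimestamp(long commitTimeMs) {
        return ZonedDateTime.ofInstant(Instant.ofEpochMilli(commitTimeMs), ZoneOffset.UTC);
    }
}
```

Pinning the zone to `ZoneOffset.UTC` keeps event timestamps independent of the server's default time zone.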
@Override
protected void onBootstrap(ApplicationEvent applicationEvent)
{
@@ -394,54 +410,72 @@ public class EventGenerator extends AbstractLifecycleBean implements Initializin
@Override
public void afterCommit()
{
try
if(isTransactionCommitted())
{
final Consolidators consolidators = getTxnConsolidators(this);
// Node events
for (Map.Entry<NodeRef, EventConsolidator> entry : consolidators.getNodes().entrySet())
try
{
EventConsolidator eventConsolidator = entry.getValue();
sendEvent(entry.getKey(), eventConsolidator);
}
final Consolidators consolidators = getTxnConsolidators(this);
// Child assoc events
for (Map.Entry<ChildAssociationRef, ChildAssociationEventConsolidator> entry : consolidators.getChildAssocs().entrySet())
{
ChildAssociationEventConsolidator eventConsolidator = entry.getValue();
sendEvent(entry.getKey(), eventConsolidator);
}
// Node events
for (Map.Entry<NodeRef, EventConsolidator> entry : consolidators.getNodes().entrySet())
{
EventConsolidator eventConsolidator = entry.getValue();
sendEvent(entry.getKey(), eventConsolidator);
}
// Peer assoc events
for (Map.Entry<AssociationRef, PeerAssociationEventConsolidator> entry : consolidators.getPeerAssocs().entrySet())
{
PeerAssociationEventConsolidator eventConsolidator = entry.getValue();
sendEvent(entry.getKey(), eventConsolidator);
// Child assoc events
for (Map.Entry<ChildAssociationRef, ChildAssociationEventConsolidator> entry : consolidators.getChildAssocs().entrySet())
{
ChildAssociationEventConsolidator eventConsolidator = entry.getValue();
sendEvent(entry.getKey(), eventConsolidator);
}
// Peer assoc events
for (Map.Entry<AssociationRef, PeerAssociationEventConsolidator> entry : consolidators.getPeerAssocs().entrySet())
{
PeerAssociationEventConsolidator eventConsolidator = entry.getValue();
sendEvent(entry.getKey(), eventConsolidator);
}
}
catch (Exception e)
{
// Must consume the exception to protect other TransactionListeners
LOGGER.error("Unexpected error while sending repository events", e);
}
}
catch (Exception e)
{
// Must consume the exception to protect other TransactionListeners
LOGGER.error("Unexpected error while sending repository events", e);
}
}
protected void sendEvent(NodeRef nodeRef, EventConsolidator consolidator)
{
EventInfo eventInfo = getEventInfo(AuthenticationUtil.getFullyAuthenticatedUser());
eventGeneratorQueue.accept(()-> createEvent(nodeRef, consolidator, eventInfo));
}
/**
* @return true if a node transaction is not only active, but also committed with modifications.
* This means that a {@link TransactionEntity} object was created.
*/
protected boolean isTransactionCommitted()
{
return nodeDAO.getCurrentTransactionCommitTime() != null;
}
private RepoEvent<?> createEvent(NodeRef nodeRef, EventConsolidator consolidator, EventInfo eventInfo)
{
String user = eventInfo.getPrincipal();
if (consolidator.isTemporaryNode())
{
if (LOGGER.isTraceEnabled())
{
LOGGER.trace("Ignoring temporary node: " + nodeRef);
}
return;
return null;
}
final String user = AuthenticationUtil.getFullyAuthenticatedUser();
// Get the repo event before the filtering,
// so we can take the latest node info into account
final RepoEvent<?> event = consolidator.getRepoEvent(getEventInfo(user));
final RepoEvent<?> event = consolidator.getRepoEvent(eventInfo);
final QName nodeType = consolidator.getNodeType();
if (isFiltered(nodeType, user))
@@ -452,7 +486,7 @@ public class EventGenerator extends AbstractLifecycleBean implements Initializin
+ ((nodeType == null) ? "Unknown' " : nodeType.toPrefixString())
+ "' created by: " + user);
}
return;
return null;
}
if (event.getType().equals(EventType.NODE_UPDATED.getType()) && consolidator.isResourceBeforeAllFieldsNull())
@@ -461,27 +495,34 @@ public class EventGenerator extends AbstractLifecycleBean implements Initializin
{
LOGGER.trace("Ignoring node updated event as no fields have been updated: " + nodeRef);
}
return;
return null;
}
logAndSendEvent(event, consolidator.getEventTypes());
logEvent(event, consolidator.getEventTypes());
return event;
}
protected void sendEvent(ChildAssociationRef childAssociationRef, ChildAssociationEventConsolidator consolidator)
{
EventInfo eventInfo = getEventInfo(AuthenticationUtil.getFullyAuthenticatedUser());
eventGeneratorQueue.accept(()-> createEvent(eventInfo, childAssociationRef, consolidator));
}
private RepoEvent<?> createEvent(EventInfo eventInfo, ChildAssociationRef childAssociationRef, ChildAssociationEventConsolidator consolidator)
{
String user = eventInfo.getPrincipal();
if (consolidator.isTemporaryChildAssociation())
{
if (LOGGER.isTraceEnabled())
{
LOGGER.trace("Ignoring temporary child association: " + childAssociationRef);
}
return;
return null;
}
final String user = AuthenticationUtil.getFullyAuthenticatedUser();
// Get the repo event before the filtering,
// so we can take the latest association info into account
final RepoEvent<?> event = consolidator.getRepoEvent(getEventInfo(user));
final RepoEvent<?> event = consolidator.getRepoEvent(eventInfo);
final QName childAssocType = consolidator.getChildAssocType();
if (isFilteredChildAssociation(childAssocType, user))
@@ -492,7 +533,7 @@ public class EventGenerator extends AbstractLifecycleBean implements Initializin
+ ((childAssocType == null) ? "Unknown' " : childAssocType.toPrefixString())
+ "' created by: " + user);
}
return;
return null;
} else if (childAssociationRef.isPrimary())
{
if (LOGGER.isTraceEnabled())
@@ -501,13 +542,20 @@ public class EventGenerator extends AbstractLifecycleBean implements Initializin
+ ((childAssocType == null) ? "Unknown' " : childAssocType.toPrefixString())
+ "' created by: " + user);
}
return;
return null;
}
logAndSendEvent(event, consolidator.getEventTypes());
logEvent(event, consolidator.getEventTypes());
return event;
}
protected void sendEvent(AssociationRef peerAssociationRef, PeerAssociationEventConsolidator consolidator)
{
EventInfo eventInfo = getEventInfo(AuthenticationUtil.getFullyAuthenticatedUser());
eventGeneratorQueue.accept(()-> createEvent(eventInfo, peerAssociationRef, consolidator));
}
private RepoEvent<?> createEvent(EventInfo eventInfo, AssociationRef peerAssociationRef, PeerAssociationEventConsolidator consolidator)
{
if (consolidator.isTemporaryPeerAssociation())
{
@@ -515,30 +563,21 @@ public class EventGenerator extends AbstractLifecycleBean implements Initializin
{
LOGGER.trace("Ignoring temporary peer association: " + peerAssociationRef);
}
return;
return null;
}
final String user = AuthenticationUtil.getFullyAuthenticatedUser();
// Get the repo event before the filtering,
// so we can take the latest association info into account
final RepoEvent<?> event = consolidator.getRepoEvent(getEventInfo(user));
logAndSendEvent(event, consolidator.getEventTypes());
RepoEvent<?> event = consolidator.getRepoEvent(eventInfo);
logEvent(event, consolidator.getEventTypes());
return event;
}
protected void logAndSendEvent(RepoEvent<?> event, Deque<EventType> listOfEvents)
private void logEvent(RepoEvent<?> event, Deque<EventType> listOfEvents)
{
if (LOGGER.isTraceEnabled())
{
LOGGER.trace("List of Events:" + listOfEvents);
LOGGER.trace("Sending event:" + event);
}
// Need to execute this in another read txn because Camel expects it
transactionService.getRetryingTransactionHelper().doInTransaction((RetryingTransactionCallback<Void>) () -> {
event2MessageProducer.send(event);
return null;
}, true, false);
}
}

View File

@@ -0,0 +1,179 @@
/*
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2021 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.event2;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.Callable;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executor;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import org.alfresco.repo.event.v1.model.RepoEvent;
import org.alfresco.util.PropertyCheck;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.InitializingBean;
/*
 * This queue creates RepoEvents asynchronously, offloading the work to a thread pool
 * while preserving the order of the events.
*/
public class EventGeneratorQueue implements InitializingBean
{
protected static final Log LOGGER = LogFactory.getLog(EventGeneratorQueue.class);
protected Executor enqueueThreadPoolExecutor;
protected Executor dequeueThreadPoolExecutor;
protected Event2MessageProducer event2MessageProducer;
protected BlockingQueue<EventInMaking> queue = new LinkedBlockingQueue<>();
protected Runnable listener = createListener();
@Override
public void afterPropertiesSet() throws Exception
{
PropertyCheck.mandatory(this, "enqueueThreadPoolExecutor", enqueueThreadPoolExecutor);
PropertyCheck.mandatory(this, "dequeueThreadPoolExecutor", dequeueThreadPoolExecutor);
PropertyCheck.mandatory(this, "event2MessageProducer", event2MessageProducer);
}
public void setEvent2MessageProducer(Event2MessageProducer event2MessageProducer)
{
this.event2MessageProducer = event2MessageProducer;
}
public void setEnqueueThreadPoolExecutor(Executor enqueueThreadPoolExecutor)
{
this.enqueueThreadPoolExecutor = enqueueThreadPoolExecutor;
}
public void setDequeueThreadPoolExecutor(Executor dequeueThreadPoolExecutor)
{
this.dequeueThreadPoolExecutor = dequeueThreadPoolExecutor;
dequeueThreadPoolExecutor.execute(listener);
}
/**
 * Enqueues a callback function that creates an event.
* @param maker Callback function that creates an event.
*/
public void accept(Callable<RepoEvent<?>> maker)
{
EventInMaking eventInMaking = new EventInMaking(maker);
queue.offer(eventInMaking);
enqueueThreadPoolExecutor.execute(() -> {
try
{
eventInMaking.make();
}
catch (Exception e)
{
LOGGER.error("Unexpected error while enqueuing maker function for repository event", e);
}
});
}
/**
 * Creates the listener task in charge of dequeuing and sending events once they are ready.
* @return The task in charge of dequeuing and sending events ready to be sent.
*/
private Runnable createListener()
{
return new Runnable()
{
@Override
public void run()
{
try
{
while (!Thread.interrupted())
{
try
{
EventInMaking eventInMaking = queue.take();
RepoEvent<?> event = eventInMaking.getEventWhenReady();
if (event != null)
{
event2MessageProducer.send(event);
}
}
catch (Exception e)
{
LOGGER.error("Unexpected error while dequeuing and sending repository event", e);
}
}
}
finally
{
LOGGER.warn("Unexpected: rescheduling the listener thread.");
dequeueThreadPoolExecutor.execute(listener);
}
}
};
}
/*
 * Simple class that makes events and allows retrieving them when ready
*/
private static class EventInMaking
{
private Callable<RepoEvent<?>> maker;
private volatile RepoEvent<?> event;
private CountDownLatch latch;
public EventInMaking(Callable<RepoEvent<?>> maker)
{
this.maker = maker;
this.latch = new CountDownLatch(1);
}
public void make() throws Exception
{
try
{
event = maker.call();
}
finally
{
latch.countDown();
}
}
public RepoEvent<?> getEventWhenReady() throws InterruptedException
{
latch.await(30, TimeUnit.SECONDS);
return event;
}
@Override
public String toString()
{
return maker.toString();
}
}
}
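The queue above fixes each event's FIFO position synchronously at enqueue time, computes the event on a thread pool, and has a single consumer that blocks until each event is ready. The same ordering guarantee can be sketched in isolation; this minimal standalone version (hypothetical names, and using `CompletableFuture` in place of the volatile-field-plus-`CountDownLatch` used by `EventInMaking`) shows why results come out in enqueue order even when later work finishes first:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

public class OrderedAsyncQueue {
    // Placeholders are enqueued synchronously, so FIFO order is fixed here;
    // the actual values are computed later on the worker pool.
    private final BlockingQueue<CompletableFuture<String>> queue = new LinkedBlockingQueue<>();
    private final ExecutorService workers = Executors.newFixedThreadPool(4);

    public void accept(Callable<String> maker) {
        CompletableFuture<String> future = new CompletableFuture<>();
        queue.offer(future); // FIFO position fixed before async work starts
        workers.execute(() -> {
            try { future.complete(maker.call()); }
            catch (Exception e) { future.complete(null); } // consume, like the real queue
        });
    }

    public String takeNext() throws Exception {
        // Take the oldest placeholder and block until its value is ready
        return queue.take().get(30, TimeUnit.SECONDS);
    }

    // Enqueues five makers whose work finishes in reverse order, then drains:
    // results still come out in enqueue order.
    static List<String> demo() throws Exception {
        OrderedAsyncQueue q = new OrderedAsyncQueue();
        for (int i = 0; i < 5; i++) {
            final int n = i;
            q.accept(() -> {
                Thread.sleep((5 - n) * 10L); // later events finish first
                return "event-" + n;
            });
        }
        List<String> out = new ArrayList<>();
        for (int i = 0; i < 5; i++) {
            out.add(q.takeNext());
        }
        q.workers.shutdown();
        return out;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(demo());
    }
}
```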

View File

@@ -1,28 +1,28 @@
/*
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2016 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
/*
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2021 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.forms.processor.node;
import static org.alfresco.repo.forms.processor.node.FormFieldConstants.ASSOC_DATA_ADDED_SUFFIX;
@@ -634,9 +634,12 @@ public abstract class ContentModelFormProcessor<ItemType, PersistType> extends
{
try
{
// if the name property changes the rename method of the file folder
// service should be called rather than updating the property directly
this.fileFolderService.rename(nodeRef, (String) fieldData.getValue());
if (!fileInfo.getName().equals(fieldData.getValue()))
{
// if the name property changes, the rename method of the file folder
// service should be called rather than updating the property directly
this.fileFolderService.rename(nodeRef, (String) fieldData.getValue());
}
}
catch (FileExistsException fee)
{

View File

@@ -1,6 +1,6 @@
/*-
/*
* #%L
* Alfresco Remote API
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2021 Alfresco Software Limited
* %%
@@ -23,54 +23,29 @@
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.search.impl.querymodel.impl.db;
package org.alfresco.repo.search;
import java.util.concurrent.TimeUnit;
import javax.annotation.concurrent.NotThreadSafe;
@NotThreadSafe
public class SingleTaskRestartableWatch
/**
 * Additional metadata operations available for a {@link org.alfresco.service.cmr.search.ResultSet} coming from a search engine.
*
* @author Gethin James
* @since 5.0
* @see SearchEngineResultSet
*/
public interface SearchEngineResultMetadata
{
private final String name;
/**
 * Returns the query execution time; in other words, the amount of
 * time the search engine spent processing the request.
*
* @return the query execution time
*/
Long getQueryTime();
private long startTimeNanos;
private long totalTimeNanos;
public SingleTaskRestartableWatch(String name)
{
this.name = name;
}
public String getName()
{
return name;
}
public void start()
{
startTimeNanos = System.nanoTime();
}
public void stop()
{
long elapsedNanos = System.nanoTime() - startTimeNanos;
totalTimeNanos += elapsedNanos;
}
public long getTotalTimeNanos()
{
return totalTimeNanos;
}
public long getTotalTimeMicros()
{
return TimeUnit.NANOSECONDS.toMicros(totalTimeNanos);
}
public long getTotalTimeMillis()
{
return TimeUnit.NANOSECONDS.toMillis(totalTimeNanos);
}
}
/**
 * Total number of items matching the current query execution.
*
* @return the number of items in the search index that matched a query execution.
*/
long getNumberFound();
}

View File

@@ -0,0 +1,57 @@
/*
* #%L
* Alfresco Data model classes
* %%
* Copyright (C) 2005 - 2021 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.search;
import org.alfresco.repo.search.impl.solr.facet.facetsresponse.GenericFacetResponse;
import org.alfresco.repo.search.impl.solr.facet.facetsresponse.Metric;
import org.alfresco.service.cmr.search.ResultSet;
import org.alfresco.util.Pair;
import java.util.List;
import java.util.Map;
import java.util.Set;
/**
 * Supertype interface for all result sets coming from a search engine (e.g. Elasticsearch, Solr).
 * This interface was originally extracted from the Apache Solr ResultSet implementation,
 * which is why some of the naming (e.g. facets) is tied to the Solr world.
*/
public interface SearchEngineResultSet extends ResultSet, SearchEngineResultMetadata
{
Map<String, List<Pair<String, Integer>>> getFieldFacets();
Map<String, List<Pair<String, Integer>>> getFacetIntervals();
Map<String, List<Map<String, String>>> getFacetRanges();
List<GenericFacetResponse> getPivotFacets();
Map<String, Set<Metric>> getStats();
long getLastIndexedTxId();
boolean getProcessedDenies();
}

View File

@@ -26,8 +26,6 @@
package org.alfresco.repo.search.impl.querymodel.impl.db;
import static org.alfresco.repo.domain.node.AbstractNodeDAOImpl.CACHE_REGION_NODES;
import static org.alfresco.repo.search.impl.querymodel.impl.db.DBStats.handlerStopWatch;
import static org.alfresco.repo.search.impl.querymodel.impl.db.DBStats.resetStopwatches;
import java.io.Serializable;
import java.util.ArrayList;
@@ -35,7 +33,6 @@ import java.util.BitSet;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
@@ -49,7 +46,6 @@ import org.alfresco.repo.cache.lookup.EntityLookupCache;
import org.alfresco.repo.cache.lookup.EntityLookupCache.EntityLookupCallbackDAOAdaptor;
import org.alfresco.repo.domain.node.Node;
import org.alfresco.repo.domain.node.NodeDAO;
import org.alfresco.repo.domain.node.NodeVersionKey;
import org.alfresco.repo.domain.permissions.AclCrudDAO;
import org.alfresco.repo.domain.permissions.Authority;
import org.alfresco.repo.domain.qname.QNameDAO;
@@ -81,7 +77,6 @@ import org.apache.commons.logging.LogFactory;
import org.apache.ibatis.session.ResultContext;
import org.apache.ibatis.session.ResultHandler;
import org.mybatis.spring.SqlSessionTemplate;
import org.springframework.util.StopWatch;
/**
* @author Andy
@@ -115,12 +110,8 @@ public class DBQueryEngine implements QueryEngine
private long maxPermissionCheckTimeMillis;
private SimpleCache<NodeVersionKey, Map<QName, Serializable>> propertiesCache;
protected EntityLookupCache<Long, Node, NodeRef> nodesCache;
private SimpleCache<NodeVersionKey, Set<QName>> aspectsCache;
AclCrudDAO aclCrudDAO;
public void setAclCrudDAO(AclCrudDAO aclCrudDAO)
@@ -216,9 +207,13 @@ public class DBQueryEngine implements QueryEngine
@Override
public QueryEngineResults executeQuery(Query query, QueryOptions options, FunctionEvaluationContext functionContext)
{
logger.debug("Query request received");
resetStopwatches();
long start = 0;
if (logger.isDebugEnabled())
{
start = System.currentTimeMillis();
logger.debug("Query request received");
}
Set<String> selectorGroup = null;
if (query.getSource() != null)
{
@@ -275,29 +270,14 @@ public class DBQueryEngine implements QueryEngine
logger.debug("- query is being prepared");
dbQuery.prepare(namespaceService, dictionaryService, qnameDAO, nodeDAO, tenantService, selectorGroup,
null, functionContext, metadataIndexCheck2.getPatchApplied());
ResultSet resultSet;
// TEMPORARY - this first branch of the if statement simply allows us to easily clear the caches for now; it will be removed afterwards
if (cleanCacheRequest(options))
resultSet = selectNodesWithPermissions(options, dbQuery);
if (logger.isDebugEnabled())
{
nodesCache.clear();
propertiesCache.clear();
aspectsCache.clear();
logger.info("Nodes cache cleared");
resultSet = new DBResultSet(options.getAsSearchParmeters(), Collections.emptyList(), nodeDAO, nodeService,
tenantService, Integer.MAX_VALUE);
long ms = System.currentTimeMillis() - start;
logger.debug("Selected " + resultSet.length() + " nodes with permission resolution in "+ms+" ms");
}
else if (forceOldPermissionResolution(options))
{
resultSet = selectNodesStandard(options, dbQuery);
logger.debug("Selected " +resultSet.length()+ " nodes with standard permission resolution");
}
else
{
resultSet = selectNodesWithPermissions(options, dbQuery);
logger.debug("Selected " +resultSet.length()+ " nodes with accelerated permission resolution");
}
return asQueryEngineResults(resultSet);
}
@@ -306,14 +286,7 @@ public class DBQueryEngine implements QueryEngine
logger.debug("- using standard table for the query");
return SELECT_BY_DYNAMIC_QUERY;
}
private ResultSet selectNodesStandard(QueryOptions options, DBQuery dbQuery)
{
List<Node> nodes = removeDuplicates(template.selectList(pickQueryTemplate(options, dbQuery), dbQuery));
DBResultSet rs = new DBResultSet(options.getAsSearchParmeters(), nodes, nodeDAO, nodeService, tenantService, Integer.MAX_VALUE);
return new PagingLuceneResultSet(rs, options.getAsSearchParmeters(), nodeService);
}
private ResultSet selectNodesWithPermissions(QueryOptions options, DBQuery dbQuery)
{
Authority authority = aclCrudDAO.getAuthority(AuthenticationUtil.getRunAsUser());
@@ -340,37 +313,21 @@ public class DBQueryEngine implements QueryEngine
FilteringResultSet acceleratedNodeSelection(QueryOptions options, DBQuery dbQuery, NodePermissionAssessor permissionAssessor)
{
StopWatch sw = DBStats.queryStopWatch();
List<Node> nodes = new ArrayList<>();
int requiredNodes = computeRequiredNodesCount(options);
logger.debug("- query sent to the database");
sw.start("ttfr");
template.select(pickQueryTemplate(options, dbQuery), dbQuery, new ResultHandler<Node>()
{
@Override
public void handleResult(ResultContext<? extends Node> context)
{
handlerStopWatch().start();
try
{
doHandleResult(permissionAssessor, sw, nodes, requiredNodes, context);
}
finally
{
handlerStopWatch().stop();
}
doHandleResult(permissionAssessor, nodes, requiredNodes, context);
}
private void doHandleResult(NodePermissionAssessor permissionAssessor, StopWatch sw, List<Node> nodes,
int requiredNodes, ResultContext<? extends Node> context)
private void doHandleResult(NodePermissionAssessor permissionAssessor, List<Node> nodes,
int requiredNodes, ResultContext<? extends Node> context)
{
if (permissionAssessor.isFirstRecord())
{
sw.stop();
sw.start("ttlr");
}
if (nodes.size() >= requiredNodes)
{
context.stop();
@@ -400,10 +357,8 @@ public class DBQueryEngine implements QueryEngine
context.stop();
return;
}
}
});
sw.stop();
int numberFound = nodes.size();
nodes.removeAll(Collections.singleton(null));
@@ -455,23 +410,6 @@ public class DBQueryEngine implements QueryEngine
return new QueryEngineResults(answer);
}
private List<Node> removeDuplicates(List<Node> nodes)
{
LinkedHashSet<Node> uniqueNodes = new LinkedHashSet<>(nodes.size());
List<Long> checkedNodeIds = new ArrayList<>(nodes.size());
for (Node node : nodes)
{
if (!checkedNodeIds.contains(node.getId()))
{
checkedNodeIds.add(node.getId());
uniqueNodes.add(node);
}
}
return new ArrayList<Node>(uniqueNodes);
}
/*
* (non-Javadoc)
* @see org.alfresco.repo.search.impl.querymodel.QueryEngine#getQueryModelFactory()
@@ -481,28 +419,7 @@ public class DBQueryEngine implements QueryEngine
{
return new DBQueryModelFactory();
}
private boolean cleanCacheRequest(QueryOptions options)
{
return "xxx".equals(getLocaleLanguage(options));
}
char getMagicCharFromLocale(QueryOptions options, int index)
{
String lang = getLocaleLanguage(options);
return lang.length() > index ? lang.charAt(index) : ' ';
}
private boolean forceOldPermissionResolution(QueryOptions options)
{
return getMagicCharFromLocale(options, 2) == 's';
}
private String getLocaleLanguage(QueryOptions options)
{
return options.getLocales().size() == 1 ? options.getLocales().get(0).getLanguage() : "";
}
/**
* Injection of nodes cache for clean-up and warm up when required
* @param cache The node cache to set
@@ -540,20 +457,4 @@ public class DBQueryEngine implements QueryEngine
return value.getNodeRef();
}
}
/*
* TEMPORARY - Injection of nodes cache for clean-up when required
*/
public void setPropertiesCache(SimpleCache<NodeVersionKey, Map<QName, Serializable>> propertiesCache)
{
this.propertiesCache = propertiesCache;
}
/*
* TEMPORARY - Injection of nodes cache for clean-up when required
*/
public void setAspectsCache(SimpleCache<NodeVersionKey, Set<QName>> aspectsCache)
{
this.aspectsCache = aspectsCache;
}
}

View File

@@ -1,66 +0,0 @@
/*
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2021 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.search.impl.querymodel.impl.db;
import org.springframework.util.StopWatch;
public final class DBStats
{
public static final ThreadLocal<StopWatch> QUERY_STOPWATCH = new ThreadLocal<StopWatch>();
public static final ThreadLocal<SingleTaskRestartableWatch> ACL_READ_STOPWATCH = new ThreadLocal<SingleTaskRestartableWatch>();
public static final ThreadLocal<SingleTaskRestartableWatch> ACL_OWNER_STOPWATCH = new ThreadLocal<SingleTaskRestartableWatch>();
public static final ThreadLocal<SingleTaskRestartableWatch> HANDLER_STOPWATCH = new ThreadLocal<SingleTaskRestartableWatch>();
private DBStats() {}
public static void resetStopwatches()
{
QUERY_STOPWATCH.set(new StopWatch());
HANDLER_STOPWATCH.set(new SingleTaskRestartableWatch("tot"));
ACL_READ_STOPWATCH.set(new SingleTaskRestartableWatch("acl"));
ACL_OWNER_STOPWATCH.set(new SingleTaskRestartableWatch("own"));
}
public static StopWatch queryStopWatch()
{
return QUERY_STOPWATCH.get();
}
public static SingleTaskRestartableWatch aclReadStopWatch()
{
return ACL_READ_STOPWATCH.get();
}
public static SingleTaskRestartableWatch aclOwnerStopWatch()
{
return ACL_OWNER_STOPWATCH.get();
}
public static SingleTaskRestartableWatch handlerStopWatch()
{
return HANDLER_STOPWATCH.get();
}
}
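The deleted `DBStats` class hands each query thread its own stopwatch via `ThreadLocal`, so timing one query never races with timings on other threads. The essential pattern, as a minimal sketch under hypothetical names:

```java
import java.util.concurrent.TimeUnit;

public class PerThreadWatch {
    private long startNanos;
    private long totalNanos;

    public void start() { startNanos = System.nanoTime(); }

    public void stop() {
        // Accumulate elapsed time, so the watch can be restarted repeatedly
        totalNanos += System.nanoTime() - startNanos;
    }

    public long totalMillis() { return TimeUnit.NANOSECONDS.toMillis(totalNanos); }

    // One independent watch per thread; reset() starts a fresh measurement,
    // mirroring resetStopwatches() at the top of each query.
    private static final ThreadLocal<PerThreadWatch> WATCH =
            ThreadLocal.withInitial(PerThreadWatch::new);

    public static PerThreadWatch current() { return WATCH.get(); }
    public static void reset() { WATCH.set(new PerThreadWatch()); }

    public static void main(String[] args) {
        reset();
        PerThreadWatch w = current();
        w.start();
        w.stop();
        System.out.println(w.totalMillis());
    }
}
```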

View File

@@ -25,9 +25,6 @@
*/
package org.alfresco.repo.search.impl.querymodel.impl.db;
import static org.alfresco.repo.search.impl.querymodel.impl.db.DBStats.aclOwnerStopWatch;
import static org.alfresco.repo.search.impl.querymodel.impl.db.DBStats.aclReadStopWatch;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;
@@ -98,21 +95,13 @@ public class NodePermissionAssessor
protected boolean isOwnerReading(Node node, Authority authority)
{
aclOwnerStopWatch().start();
try
if (authority == null)
{
if (authority == null)
{
return false;
}
String owner = getOwner(node);
return EqualsHelper.nullSafeEquals(authority.getAuthority(), owner);
}
finally
{
aclOwnerStopWatch().stop();
return false;
}
String owner = getOwner(node);
return EqualsHelper.nullSafeEquals(authority.getAuthority(), owner);
}
private String getOwner(Node node)
@@ -176,21 +165,13 @@ public class NodePermissionAssessor
protected boolean canRead(Long aclId)
{
aclReadStopWatch().start();
try
Boolean res = aclReadCache.get(aclId);
if (res == null)
{
Boolean res = aclReadCache.get(aclId);
if (res == null)
{
res = canCurrentUserRead(aclId);
aclReadCache.put(aclId, res);
}
return res;
}
finally
{
aclReadStopWatch().stop();
res = canCurrentUserRead(aclId);
aclReadCache.put(aclId, res);
}
return res;
}
protected boolean canCurrentUserRead(Long aclId)

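The `canRead(Long aclId)` change above keeps the per-query `aclReadCache` memoization while dropping the stopwatch instrumentation: each distinct ACL id is resolved at most once, which matters because many result rows typically share a handful of ACLs. A self-contained sketch of that caching shape (hypothetical names; a `Function` stands in for `canCurrentUserRead`):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class AclReadCache {
    private final Map<Long, Boolean> cache = new HashMap<>();
    private final Function<Long, Boolean> resolver; // stands in for canCurrentUserRead
    int resolverCalls = 0;                          // exposed only to show the effect

    public AclReadCache(Function<Long, Boolean> resolver) {
        this.resolver = resolver;
    }

    public boolean canRead(Long aclId) {
        Boolean res = cache.get(aclId);
        if (res == null) {
            resolverCalls++;
            res = resolver.apply(aclId); // expensive permission check, done once per ACL
            cache.put(aclId, res);
        }
        return res;
    }

    // Four lookups over two distinct ACL ids trigger only two resolutions.
    static int demo() {
        AclReadCache c = new AclReadCache(id -> id % 2 == 0);
        c.canRead(2L);
        c.canRead(3L);
        c.canRead(2L);
        c.canRead(3L);
        return c.resolverCalls;
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```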
View File

@@ -2,7 +2,7 @@
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2016 Alfresco Software Limited
* Copyright (C) 2005 - 2021 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
@@ -38,8 +38,8 @@ import java.util.stream.Collectors;
import org.alfresco.repo.domain.node.NodeDAO;
import org.alfresco.repo.search.QueryParserException;
import org.alfresco.repo.search.SearchEngineResultSet;
import org.alfresco.repo.search.SimpleResultSetMetaData;
import org.alfresco.repo.search.impl.JSONResult;
import org.alfresco.repo.search.impl.solr.facet.facetsresponse.GenericBucket;
import org.alfresco.repo.search.impl.solr.facet.facetsresponse.GenericFacetResponse;
import org.alfresco.repo.search.impl.solr.facet.facetsresponse.GenericFacetResponse.FACET_TYPE;
@@ -55,7 +55,6 @@ import org.alfresco.service.cmr.repository.NodeService;
import org.alfresco.service.cmr.search.LimitBy;
import org.alfresco.service.cmr.search.PermissionEvaluationMode;
import org.alfresco.service.cmr.search.RangeParameters;
import org.alfresco.service.cmr.search.ResultSet;
import org.alfresco.service.cmr.search.ResultSetMetaData;
import org.alfresco.service.cmr.search.ResultSetRow;
import org.alfresco.service.cmr.search.SearchParameters;
@@ -68,10 +67,9 @@ import org.json.JSONException;
import org.json.JSONObject;
/**
* @author Andy
* Apache Solr {@link SearchEngineResultSet} implementation.
*/
public class SolrJSONResultSet implements ResultSet, JSONResult
{
public class SolrJSONResultSet implements SearchEngineResultSet
{
private static final Log logger = LogFactory.getLog(SolrJSONResultSet.class);
private NodeService nodeService;
@@ -730,6 +728,7 @@ public class SolrJSONResultSet implements ResultSet, JSONResult
/**
* @return the queryTime
*/
@Override
public Long getQueryTime()
{
return queryTime;
@@ -739,6 +738,7 @@ public class SolrJSONResultSet implements ResultSet, JSONResult
/**
* @return the numberFound
*/
@Override
public long getNumberFound()
{
return numberFound.longValue();
@@ -758,26 +758,31 @@ public class SolrJSONResultSet implements ResultSet, JSONResult
}
}
@Override
public Map<String, List<Pair<String, Integer>>> getFieldFacets()
{
return Collections.unmodifiableMap(fieldFacets);
}
@Override
public Map<String, List<Pair<String, Integer>>> getFacetIntervals()
{
return Collections.unmodifiableMap(facetIntervals);
}
@Override
public List<GenericFacetResponse> getPivotFacets()
{
return pivotFacets;
}
@Override
public Map<String, Set<Metric>> getStats()
{
return Collections.unmodifiableMap(stats);
}
@Override
public long getLastIndexedTxId()
{
return lastIndexedTxId;
@@ -801,11 +806,13 @@ public class SolrJSONResultSet implements ResultSet, JSONResult
return this.spellCheckResult;
}
@Override
public boolean getProcessedDenies()
{
return processedDenies;
}
@Override
public Map<String,List<Map<String,String>>> getFacetRanges()
{
return facetRanges;

View File

@@ -25,7 +25,7 @@
*/
package org.alfresco.repo.search.impl.solr;
import org.alfresco.repo.search.impl.JSONResult;
import org.alfresco.repo.search.SearchEngineResultMetadata;
import org.json.JSONObject;
/**
@@ -34,7 +34,7 @@ import org.json.JSONObject;
* @author Gethin James
*/
@FunctionalInterface
public interface SolrJsonProcessor<T extends JSONResult>
public interface SolrJsonProcessor<T extends SearchEngineResultMetadata>
{
public T getResult(JSONObject json);
}

View File

@@ -54,7 +54,7 @@ import org.alfresco.repo.index.shard.Floc;
import org.alfresco.repo.index.shard.ShardRegistry;
import org.alfresco.repo.search.QueryParserException;
import org.alfresco.repo.search.impl.QueryParserUtils;
import org.alfresco.repo.search.impl.JSONResult;
import org.alfresco.repo.search.SearchEngineResultMetadata;
import org.alfresco.repo.tenant.TenantService;
import org.alfresco.service.cmr.dictionary.DataTypeDefinition;
import org.alfresco.service.cmr.dictionary.DictionaryService;
@@ -1098,14 +1098,14 @@ public class SolrQueryHTTPClient extends AbstractSolrQueryHTTPClient implements
}
}
protected JSONResult postSolrQuery(HttpClient httpClient, String url, JSONObject body, SolrJsonProcessor<?> jsonProcessor)
protected SearchEngineResultMetadata postSolrQuery(HttpClient httpClient, String url, JSONObject body, SolrJsonProcessor<?> jsonProcessor)
throws UnsupportedEncodingException, IOException, HttpException, URIException,
JSONException
{
return postSolrQuery(httpClient, url, body, jsonProcessor, null);
}
protected JSONResult postSolrQuery(HttpClient httpClient, String url, JSONObject body, SolrJsonProcessor<?> jsonProcessor, String spellCheckParams)
protected SearchEngineResultMetadata postSolrQuery(HttpClient httpClient, String url, JSONObject body, SolrJsonProcessor<?> jsonProcessor, String spellCheckParams)
throws UnsupportedEncodingException, IOException, HttpException, URIException,
JSONException
{
@@ -1120,7 +1120,7 @@ public class SolrQueryHTTPClient extends AbstractSolrQueryHTTPClient implements
json.put("spellcheck", manager.getSpellCheckJsonValue());
}
JSONResult results = jsonProcessor.getResult(json);
SearchEngineResultMetadata results = jsonProcessor.getResult(json);
if (s_logger.isDebugEnabled())
{

View File

@@ -37,7 +37,7 @@ import org.alfresco.repo.admin.RepositoryState;
import org.alfresco.repo.index.shard.Floc;
import org.alfresco.repo.index.shard.ShardRegistry;
import org.alfresco.repo.search.QueryParserException;
import org.alfresco.repo.search.impl.JSONResult;
import org.alfresco.repo.search.SearchEngineResultMetadata;
import org.alfresco.repo.tenant.TenantService;
import org.alfresco.service.cmr.repository.StoreRef;
import org.alfresco.service.cmr.repository.datatype.DefaultTypeConverter;
@@ -222,7 +222,7 @@ public class SolrSQLHttpClient extends AbstractSolrQueryHTTPClient implements So
SolrJsonProcessor<?> jsonProcessor) throws IOException, JSONException
{
JSONObject json = postQuery(httpClient, url, body);
JSONResult results = jsonProcessor.getResult(json);
SearchEngineResultMetadata results = jsonProcessor.getResult(json);
LOGGER.debug("Sent : {}", url);
LOGGER.debug("with: {}", body);
LOGGER.debug("Got: {} in {} ms", results.getNumberFound(), results.getQueryTime());


@@ -30,7 +30,7 @@ import java.util.List;
import java.util.Map;
import org.alfresco.repo.search.SimpleResultSetMetaData;
import org.alfresco.repo.search.impl.JSONResult;
import org.alfresco.repo.search.SearchEngineResultMetadata;
import org.alfresco.service.cmr.repository.ChildAssociationRef;
import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.search.LimitBy;
@@ -51,7 +51,7 @@ import org.json.JSONObject;
* Pojo that parses and stores solr stream response.
* @author Michael Suzuki
*/
public class SolrSQLJSONResultSet implements ResultSet, JSONResult
public class SolrSQLJSONResultSet implements ResultSet, SearchEngineResultMetadata
{
private static final String SOLR_STREAM_EXCEPTION = "EXCEPTION";
private static Log logger = LogFactory.getLog(SolrSQLJSONResultSet.class);


@@ -28,7 +28,7 @@ package org.alfresco.repo.search.impl.solr;
import java.util.ArrayList;
import java.util.List;
import org.alfresco.repo.search.impl.JSONResult;
import org.alfresco.repo.search.SearchEngineResultMetadata;
import org.alfresco.service.cmr.search.StatsResultSet;
import org.alfresco.service.cmr.search.StatsResultStat;
import org.apache.commons.logging.Log;
@@ -44,7 +44,7 @@ import org.springframework.util.StringUtils;
* @author Gethin James
* @since 5.0
*/
public class SolrStatsResult implements JSONResult, StatsResultSet
public class SolrStatsResult implements SearchEngineResultMetadata, StatsResultSet
{
private static final Log logger = LogFactory.getLog(SolrStatsResult.class);


@@ -32,8 +32,12 @@ import java.sql.DatabaseMetaData;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.LinkedHashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;
import javax.sql.DataSource;
@@ -41,10 +45,10 @@ import org.alfresco.repo.domain.dialect.Dialect;
import org.alfresco.repo.domain.dialect.TypeNames;
import org.alfresco.service.descriptor.Descriptor;
import org.alfresco.service.descriptor.DescriptorService;
import org.alfresco.util.DBScriptUtil;
import org.alfresco.util.DatabaseMetaDataHelper;
import org.alfresco.util.DialectUtil;
import org.alfresco.util.PropertyCheck;
import org.alfresco.util.DBScriptUtil;
import org.alfresco.util.schemacomp.model.Column;
import org.alfresco.util.schemacomp.model.ForeignKey;
import org.alfresco.util.schemacomp.model.Index;
@@ -257,6 +261,7 @@ public class ExportDb
tableTypes = new String[] { "TABLE", "VIEW", "SEQUENCE" };
}
// No tables are returned when using MySQL8 - maybe this issue can be solved if we update the MySQL JDBC driver that we use (the new driver class is `com.mysql.cj.jdbc.Driver`).
final ResultSet tables = dbmd.getTables(null, schemaName, prefixFilter, tableTypes);
processTables(dbmd, tables);
@@ -331,10 +336,11 @@ public class ExportDb
columns.close();
// Primary key
// Primary key - beware that getPrimaryKeys gets primary keys ordered by their column name
final ResultSet primarykeycols = dbmd.getPrimaryKeys(null, tables.getString("TABLE_SCHEM"), tableName);
PrimaryKey pk = null;
Map<Integer, String> keySeqsAndColumnNames = new LinkedHashMap<>();
while (primarykeycols.next())
{
@@ -343,12 +349,23 @@ public class ExportDb
String pkName = primarykeycols.getString("PK_NAME");
pk = new PrimaryKey(pkName);
}
String columnName = primarykeycols.getString("COLUMN_NAME");
pk.getColumnNames().add(columnName);
// We should add columns ordered by the KEY_SEQ rather than by the column name
// Populating map with key sequences and column names for a proper sorting later.
int columnOrder = primarykeycols.getInt("KEY_SEQ");
pk.getColumnOrders().add(columnOrder);
String columnName = primarykeycols.getString("COLUMN_NAME");
keySeqsAndColumnNames.put(columnOrder, columnName);
}
List<String> keyseqSortedColumnNames = new LinkedList<>();
List<Integer> keySeqSortedColumnOrders = keySeqsAndColumnNames.keySet().stream().sorted().collect(Collectors.toList());
for (int keySeq: keySeqSortedColumnOrders)
{
keyseqSortedColumnNames.add(keySeqsAndColumnNames.get(keySeq));
}
pk.setColumnOrders(keySeqSortedColumnOrders);
pk.setColumnNames(keyseqSortedColumnNames);
primarykeycols.close();
// If this table has a primary key, add it.
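The KEY_SEQ re-ordering above can be sketched in isolation: JDBC's `DatabaseMetaData.getPrimaryKeys` returns its rows ordered by `COLUMN_NAME`, so the patch collects KEY_SEQ/column-name pairs and re-sorts them by key sequence before building the primary key. A minimal, hypothetical sketch of that sorting step (the class and method names are illustrative, not from the patch):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Hypothetical sketch, not the patch itself: re-order primary-key column
// names by their KEY_SEQ values, since DatabaseMetaData.getPrimaryKeys
// returns them sorted alphabetically by COLUMN_NAME instead.
public class PrimaryKeyOrderSketch
{
    public static List<String> orderByKeySeq(Map<Integer, String> keySeqToColumn)
    {
        // A TreeMap iterates its keys (the KEY_SEQ values) in ascending order,
        // so its values come out in composite-key column order.
        return new ArrayList<>(new TreeMap<>(keySeqToColumn).values());
    }
}
```

For example, the pairs {2=qname_id, 1=node_id} come back as [node_id, qname_id], matching the declared order of the composite key rather than the alphabetical order of the column names.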


@@ -162,16 +162,28 @@ CREATE TABLE alf_authority_alias
CONSTRAINT fk_alf_autha_ali FOREIGN KEY (alias_id) REFERENCES alf_authority (id)
) ENGINE=InnoDB;
CREATE TABLE alf_server
(
id BIGINT NOT NULL AUTO_INCREMENT,
version BIGINT NOT NULL,
ip_address VARCHAR(39) NOT NULL,
PRIMARY KEY (id),
UNIQUE KEY ip_address (ip_address)
) ENGINE=InnoDB;
CREATE TABLE alf_transaction
(
id BIGINT NOT NULL AUTO_INCREMENT,
version BIGINT NOT NULL,
server_id BIGINT,
change_txn_id VARCHAR(56) NOT NULL,
commit_time_ms BIGINT,
PRIMARY KEY (id),
KEY idx_alf_txn_ctms (commit_time_ms, id),
KEY idx_alf_txn_ctms_sc (commit_time_ms),
key idx_alf_txn_id_ctms (id, commit_time_ms)
key idx_alf_txn_id_ctms (id, commit_time_ms),
KEY fk_alf_txn_svr (server_id),
CONSTRAINT fk_alf_txn_svr FOREIGN KEY (server_id) REFERENCES alf_server (id)
) ENGINE=InnoDB;
CREATE TABLE alf_store


@@ -898,8 +898,8 @@
</columns>
<primarykey name="PRIMARY">
<columnnames>
<columnname order="2">GROUP_ID_</columnname>
<columnname order="1">USER_ID_</columnname>
<columnname order="2">GROUP_ID_</columnname>
</columnnames>
</primarykey>
<foreignkeys>


@@ -1885,10 +1885,10 @@
</columns>
<primarykey name="PRIMARY">
<columnnames>
<columnname order="3">list_index</columnname>
<columnname order="4">locale_id</columnname>
<columnname order="1">node_id</columnname>
<columnname order="2">qname_id</columnname>
<columnname order="3">list_index</columnname>
<columnname order="4">locale_id</columnname>
</columnnames>
</primarykey>
<foreignkeys>
@@ -2174,9 +2174,9 @@
</columns>
<primarykey name="PRIMARY">
<columnnames>
<columnname order="1">root_prop_id</columnname>
<columnname order="2">contained_in</columnname>
<columnname order="3">prop_index</columnname>
<columnname order="1">root_prop_id</columnname>
</columnnames>
</primarykey>
<foreignkeys>
@@ -2472,7 +2472,39 @@
<columnname>local_name</columnname>
</columnnames>
</index>
</indexes>
</indexes>
</table>
<table name="alf_server">
<columns>
<column name="id" order="1">
<type>bigint</type>
<nullable>false</nullable>
<autoincrement>true</autoincrement>
</column>
<column name="version" order="2">
<type>bigint</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="ip_address" order="3">
<type>varchar(39)</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
</columns>
<primarykey name="PRIMARY">
<columnnames>
<columnname order="1">id</columnname>
</columnnames>
</primarykey>
<foreignkeys/>
<indexes>
<index name="ip_address" unique="true">
<columnnames>
<columnname>ip_address</columnname>
</columnnames>
</index>
</indexes>
</table>
<table name="alf_store">
<columns>
@@ -2543,8 +2575,8 @@
</columns>
<primarykey name="PRIMARY">
<columnnames>
<columnname order="2">node_id</columnname>
<columnname order="1">user_node_id</columnname>
<columnname order="2">node_id</columnname>
</columnnames>
</primarykey>
<foreignkeys>
@@ -2620,12 +2652,17 @@
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="change_txn_id" order="3">
<column name="server_id" order="3">
<type>bigint</type>
<nullable>true</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="change_txn_id" order="4">
<type>varchar(56)</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="commit_time_ms" order="4">
<column name="commit_time_ms" order="5">
<type>bigint</type>
<nullable>true</nullable>
<autoincrement>false</autoincrement>
@@ -2636,6 +2673,13 @@
<columnname order="1">id</columnname>
</columnnames>
</primarykey>
<foreignkeys>
<foreignkey name="fk_alf_txn_svr">
<localcolumn>server_id</localcolumn>
<targettable>alf_server</targettable>
<targetcolumn>id</targetcolumn>
</foreignkey>
</foreignkeys>
<indexes>
<index name="idx_alf_txn_ctms" unique="false">
<columnnames>
@@ -2654,6 +2698,11 @@
<columnname>commit_time_ms</columnname>
</columnnames>
</index>
<index name="fk_alf_txn_svr" unique="false">
<columnnames>
<columnname>server_id</columnname>
</columnnames>
</index>
</indexes>
</table>
<table name="alf_usage_delta">


@@ -29,7 +29,7 @@ CREATE TABLE alf_locale
id INT8 NOT NULL,
version INT8 NOT NULL,
locale_str VARCHAR(20) NOT NULL,
PRIMARY KEY (id)
PRIMARY KEY (id)
);
CREATE UNIQUE INDEX locale_str ON alf_locale (locale_str);
@@ -50,7 +50,7 @@ CREATE TABLE alf_qname
version INT8 NOT NULL,
ns_id INT8 NOT NULL,
local_name VARCHAR(200) NOT NULL,
CONSTRAINT fk_alf_qname_ns FOREIGN KEY (ns_id) REFERENCES alf_namespace (id),
CONSTRAINT fk_alf_qname_ns FOREIGN KEY (ns_id) REFERENCES alf_namespace (id),
PRIMARY KEY (id)
);
CREATE UNIQUE INDEX ns_id ON alf_qname (ns_id, local_name);
@@ -62,7 +62,7 @@ CREATE TABLE alf_permission
version INT8 NOT NULL,
type_qname_id INT8 NOT NULL,
name VARCHAR(100) NOT NULL,
PRIMARY KEY (id),
PRIMARY KEY (id),
CONSTRAINT fk_alf_perm_tqn FOREIGN KEY (type_qname_id) REFERENCES alf_qname (id)
);
CREATE UNIQUE INDEX type_qname_id ON alf_permission (type_qname_id, name);
@@ -101,7 +101,7 @@ CREATE TABLE alf_access_control_entry
allowed BOOL NOT NULL,
applies INT4 NOT NULL,
context_id INT8,
PRIMARY KEY (id),
PRIMARY KEY (id),
CONSTRAINT fk_alf_ace_auth FOREIGN KEY (authority_id) REFERENCES alf_authority (id),
CONSTRAINT fk_alf_ace_ctx FOREIGN KEY (context_id) REFERENCES alf_ace_context (id),
CONSTRAINT fk_alf_ace_perm FOREIGN KEY (permission_id) REFERENCES alf_permission (id)
@@ -151,7 +151,7 @@ CREATE TABLE alf_acl_member
acl_id INT8 NOT NULL,
ace_id INT8 NOT NULL,
pos INT4 NOT NULL,
PRIMARY KEY (id),
PRIMARY KEY (id),
CONSTRAINT fk_alf_aclm_ace FOREIGN KEY (ace_id) REFERENCES alf_access_control_entry (id),
CONSTRAINT fk_alf_aclm_acl FOREIGN KEY (acl_id) REFERENCES alf_access_control_list (id)
);
@@ -174,18 +174,31 @@ CREATE UNIQUE INDEX auth_id ON alf_authority_alias (auth_id, alias_id);
CREATE INDEX fk_alf_autha_ali ON alf_authority_alias (alias_id);
CREATE INDEX fk_alf_autha_aut ON alf_authority_alias (auth_id);
CREATE SEQUENCE alf_server_seq START WITH 1 INCREMENT BY 1;
CREATE TABLE alf_server
(
id INT8 NOT NULL,
version INT8 NOT NULL,
ip_address VARCHAR(39) NOT NULL,
PRIMARY KEY (id)
);
CREATE UNIQUE INDEX ip_address ON alf_server (ip_address);
CREATE SEQUENCE alf_transaction_seq START WITH 1 INCREMENT BY 1;
CREATE TABLE alf_transaction
(
id INT8 NOT NULL,
version INT8 NOT NULL,
server_id INT8,
change_txn_id VARCHAR(56) NOT NULL,
commit_time_ms INT8,
PRIMARY KEY (id)
PRIMARY KEY (id),
CONSTRAINT fk_alf_txn_svr FOREIGN KEY (server_id) REFERENCES alf_server (id)
);
CREATE INDEX idx_alf_txn_ctms ON alf_transaction (commit_time_ms, id);
CREATE INDEX idx_alf_txn_ctms_sc ON alf_transaction (commit_time_ms);
CREATE INDEX idx_alf_txn_id_ctms ON alf_transaction (id, commit_time_ms);
CREATE INDEX fk_alf_txn_svr ON alf_transaction (server_id);
CREATE SEQUENCE alf_store_seq START WITH 1 INCREMENT BY 1;
CREATE TABLE alf_store
@@ -254,7 +267,7 @@ CREATE TABLE alf_child_assoc
qname_crc INT8 NOT NULL,
is_primary BOOL,
assoc_index INT4,
PRIMARY KEY (id),
PRIMARY KEY (id),
CONSTRAINT fk_alf_cass_cnode FOREIGN KEY (child_node_id) REFERENCES alf_node (id),
CONSTRAINT fk_alf_cass_pnode FOREIGN KEY (parent_node_id) REFERENCES alf_node (id),
CONSTRAINT fk_alf_cass_qnns FOREIGN KEY (qname_ns_id) REFERENCES alf_namespace (id),
@@ -273,7 +286,7 @@ CREATE TABLE alf_node_aspects
node_id INT8 NOT NULL,
qname_id INT8 NOT NULL,
PRIMARY KEY (node_id, qname_id),
CONSTRAINT fk_alf_nasp_n FOREIGN KEY (node_id) REFERENCES alf_node (id),
CONSTRAINT fk_alf_nasp_n FOREIGN KEY (node_id) REFERENCES alf_node (id),
CONSTRAINT fk_alf_nasp_qn FOREIGN KEY (qname_id) REFERENCES alf_qname (id)
);
CREATE INDEX fk_alf_nasp_n ON alf_node_aspects (node_id);
@@ -288,7 +301,7 @@ CREATE TABLE alf_node_assoc
target_node_id INT8 NOT NULL,
type_qname_id INT8 NOT NULL,
assoc_index INT8 NOT NULL,
PRIMARY KEY (id),
PRIMARY KEY (id),
CONSTRAINT fk_alf_nass_snode FOREIGN KEY (source_node_id) REFERENCES alf_node (id),
CONSTRAINT fk_alf_nass_tnode FOREIGN KEY (target_node_id) REFERENCES alf_node (id),
CONSTRAINT fk_alf_nass_tqn FOREIGN KEY (type_qname_id) REFERENCES alf_qname (id)


@@ -44,6 +44,7 @@
<sequence name="alf_prop_unique_ctx_seq"/>
<sequence name="alf_prop_value_seq"/>
<sequence name="alf_qname_seq"/>
<sequence name="alf_server_seq"/>
<sequence name="alf_store_seq"/>
<sequence name="alf_transaction_seq"/>
<sequence name="alf_usage_delta_seq"/>
@@ -912,45 +913,45 @@
</index>
</indexes>
</table>
<table name="alf_auth_status">
<columns>
<column name="id" order="1">
<type>int8</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="username" order="2">
<type>varchar(100)</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="deleted" order="3">
<type>bool</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="authorized" order="4">
<type>bool</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="checksum" order="5">
<type>bytea</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="authaction" order="6">
<type>varchar(10)</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
</columns>
<primarykey name="alf_auth_status_pkey">
<columnnames>
<columnname order="1">id</columnname>
</columnnames>
</primarykey>
<indexes>
<table name="alf_auth_status">
<columns>
<column name="id" order="1">
<type>int8</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="username" order="2">
<type>varchar(100)</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="deleted" order="3">
<type>bool</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="authorized" order="4">
<type>bool</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="checksum" order="5">
<type>bytea</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="authaction" order="6">
<type>varchar(10)</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
</columns>
<primarykey name="alf_auth_status_pkey">
<columnnames>
<columnname order="1">id</columnname>
</columnnames>
</primarykey>
<indexes>
<index name="idx_alf_auth_usr_stat" unique="true">
<columnnames>
<columnname>username</columnname>
@@ -959,7 +960,7 @@
</index>
<index name="idx_alf_auth_deleted" unique="false">
<columnnames>
<columnname>deleted</columnname>
<columnname>deleted</columnname>
</columnnames>
</index>
<index name="idx_alf_auth_action" unique="false">
@@ -967,9 +968,9 @@
<columnname>authaction</columnname>
</columnnames>
</index>
</indexes>
<foreignkeys/>
</table>
</indexes>
<foreignkeys/>
</table>
<table name="alf_child_assoc">
<columns>
<column name="id" order="1">
@@ -1628,11 +1629,11 @@
</columns>
<primarykey name="alf_node_pkey">
<validators>
<validator class="org.alfresco.util.schemacomp.validator.NameValidator">
<properties>
<property name="pattern">alf_node_pkey1?</property>
</properties>
</validator>
<validator class="org.alfresco.util.schemacomp.validator.NameValidator">
<properties>
<property name="pattern">alf_node_pkey1?</property>
</properties>
</validator>
</validators>
<columnnames>
<columnname order="1">id</columnname>
@@ -1830,11 +1831,11 @@
</columns>
<primarykey name="alf_node_assoc_pkey">
<validators>
<validator class="org.alfresco.util.schemacomp.validator.NameValidator">
<properties>
<property name="pattern">alf_node_assoc_pkey1?</property>
</properties>
</validator>
<validator class="org.alfresco.util.schemacomp.validator.NameValidator">
<properties>
<property name="pattern">alf_node_assoc_pkey1?</property>
</properties>
</validator>
</validators>
<columnnames>
<columnname order="1">id</columnname>
@@ -2542,7 +2543,39 @@
<columnname>local_name</columnname>
</columnnames>
</index>
</indexes>
</indexes>
</table>
<table name="alf_server">
<columns>
<column name="id" order="1">
<type>int8</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="version" order="2">
<type>int8</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="ip_address" order="3">
<type>varchar(39)</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
</columns>
<primarykey name="alf_server_pkey">
<columnnames>
<columnname order="1">id</columnname>
</columnnames>
</primarykey>
<foreignkeys/>
<indexes>
<index name="ip_address" unique="true">
<columnnames>
<columnname>ip_address</columnname>
</columnnames>
</index>
</indexes>
</table>
<table name="alf_store">
<columns>
@@ -2690,12 +2723,17 @@
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="change_txn_id" order="3">
<column name="server_id" order="3">
<type>int8</type>
<nullable>true</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="change_txn_id" order="4">
<type>varchar(56)</type>
<nullable>false</nullable>
<autoincrement>false</autoincrement>
</column>
<column name="commit_time_ms" order="4">
<column name="commit_time_ms" order="5">
<type>int8</type>
<nullable>true</nullable>
<autoincrement>false</autoincrement>
@@ -2706,7 +2744,19 @@
<columnname order="1">id</columnname>
</columnnames>
</primarykey>
<foreignkeys>
<foreignkey name="fk_alf_txn_svr">
<localcolumn>server_id</localcolumn>
<targettable>alf_server</targettable>
<targetcolumn>id</targetcolumn>
</foreignkey>
</foreignkeys>
<indexes>
<index name="fk_alf_txn_svr" unique="false">
<columnnames>
<columnname>server_id</columnname>
</columnnames>
</index>
<index name="idx_alf_txn_ctms" unique="false">
<columnnames>
<columnname>commit_time_ms</columnname>


@@ -54,7 +54,6 @@
<ref bean="patch.db-V5.1-metadata-query-indexes" />
<ref bean="patch.db-V5.2-remove-jbpm-tables-from-db" />
<ref bean="patch.db-V6.0-change-set-indexes" />
<ref bean="patch.db-V6.3-remove-alf_server-table" />
<ref bean="patch.db-V6.3-add-indexes-node-transaction" />
</list>
</property>


@@ -1,31 +0,0 @@
--
-- Title: Remove alf_server table
-- Database: MySQL
-- Since: V6.3
-- Author: David Edwards
-- Author: Alex Mukha
--
-- Please contact support@alfresco.com if you need assistance with the upgrade.
--
SET FOREIGN_KEY_CHECKS=0;
DROP TABLE alf_server;
ALTER TABLE alf_transaction
DROP FOREIGN KEY fk_alf_txn_svr,
DROP COLUMN server_id;
SET FOREIGN_KEY_CHECKS=1;
--
-- Record script finish
--
DELETE FROM alf_applied_patch WHERE id = 'patch.db-V6.3-remove-alf_server-table';
INSERT INTO alf_applied_patch
(id, description, fixes_from_schema, fixes_to_schema, applied_to_schema, target_schema, applied_on_date, applied_to_server, was_executed, succeeded, report)
VALUES
(
'patch.db-V6.3-remove-alf_server-table', 'Remove alf_server table',
0, 14000, -1, 14001, null, 'UNKNOWN', ${TRUE}, ${TRUE}, 'Script completed'
);


@@ -1,64 +0,0 @@
--
-- Title: Remove alf_server table
-- Database: PostgreSQL
-- Since: V6.3
-- Author: David Edwards
-- Author: Alex Mukha
--
-- Please contact support@alfresco.com if you need assistance with the upgrade.
--
-- DROP the indexes
DROP INDEX fk_alf_txn_svr;
DROP INDEX idx_alf_txn_ctms;
-- DROP the constraints alf_transaction
ALTER TABLE alf_transaction DROP CONSTRAINT fk_alf_txn_svr;
-- Rename existing alf_transaction to t_alf_transaction
ALTER TABLE alf_transaction RENAME TO t_alf_transaction;
-- Create new alf_transaction table with new schema
CREATE TABLE alf_transaction
(
id INT8 NOT NULL,
version INT8 NOT NULL,
change_txn_id VARCHAR(56) NOT NULL,
commit_time_ms INT8,
PRIMARY KEY (id)
);
CREATE INDEX idx_alf_txn_ctms ON alf_transaction (commit_time_ms, id);
--FOREACH t_alf_transaction.id system.upgrade.alf_server_deleted.batchsize
INSERT INTO alf_transaction
(id, version, change_txn_id, commit_time_ms)
(
SELECT
id, version, change_txn_id, commit_time_ms
FROM
t_alf_transaction
WHERE
id >= ${LOWERBOUND} AND id <= ${UPPERBOUND}
);
-- DROP existing fk constraint from alf_node ADD a new reference to the new alf_transaction table
ALTER TABLE alf_node
DROP CONSTRAINT fk_alf_node_txn,
ADD CONSTRAINT fk_alf_node_txn FOREIGN KEY (transaction_id)
REFERENCES alf_transaction (id);
DROP TABLE t_alf_transaction;
DROP TABLE alf_server;
--
-- Record script finish
--
DELETE FROM alf_applied_patch WHERE id = 'patch.db-V6.3-remove-alf_server-table';
INSERT INTO alf_applied_patch
(id, description, fixes_from_schema, fixes_to_schema, applied_to_schema, target_schema, applied_on_date, applied_to_server, was_executed, succeeded, report)
VALUES
(
'patch.db-V6.3-remove-alf_server-table', 'Remove alf_server table',
0, 14000, -1, 14001, null, 'UNKNOWN', ${TRUE}, ${TRUE}, 'Script completed'
);

Some files were not shown because too many files have changed in this diff.