Compare commits

...

71 Commits

Author SHA1 Message Date
Travis CI User
afbf52413f [maven-release-plugin][skip ci] prepare release 16.10 2022-07-26 09:23:55 +00:00
mstrankowski
ad67088017 Upping version to 16.10-SNAPSHOT, to avoid collision with already existing tag 2022-07-26 10:41:38 +02:00
mstrankowski
4c29d23ac0 Retrigger build for: Update ServicePack branch to 7.2.2
with empty commit
2022-07-25 22:20:12 +02:00
mstrankowski
ed8b18c576 Update ServicePack branch to 7.2.2 2022-07-25 20:55:41 +02:00
Travis CI User
c9b49789c7 [maven-release-plugin][skip ci] prepare for next development iteration 2022-07-21 00:05:09 +00:00
Travis CI User
d28d4873be [maven-release-plugin][skip ci] prepare release 15.9 2022-07-21 00:05:06 +00:00
Jared Ottley
44d7c2328c [ACS-3320] REST API Explorer 7.2.1 final release 2022-07-20 15:56:56 -06:00
Travis CI User
073338afa7 [maven-release-plugin][skip ci] prepare for next development iteration 2022-07-14 19:50:50 +00:00
Travis CI User
3e53467ac8 [maven-release-plugin][skip ci] prepare release 15.8 2022-07-14 19:50:48 +00:00
pzurek
0d5ffdac2e Merge branch 'release/7.2.N' of github.com:Alfresco/alfresco-community-repo into release/7.2.N 2022-07-14 21:07:07 +02:00
pzurek
ac03eb7642 Revert "Revert "PRODSEC-6115: Bump Surf webscript Version to 8.31 and removing set exception (#1207)""
This reverts commit 3304a62a35.
2022-07-14 21:06:36 +02:00
Travis CI User
2ff5b7dd0a [maven-release-plugin][skip ci] prepare for next development iteration 2022-07-14 17:04:48 +00:00
Travis CI User
b0d7e6dfba [maven-release-plugin][skip ci] prepare release 15.7 2022-07-14 17:04:46 +00:00
pzurek
3304a62a35 Revert "PRODSEC-6115: Bump Surf webscript Version to 8.31 and removing set exception (#1207)"
This reverts commit f48db84334.
2022-07-14 18:20:46 +02:00
Travis CI User
763591c1a3 [maven-release-plugin][skip ci] prepare for next development iteration 2022-07-14 11:24:55 +00:00
Travis CI User
6de5a507fe [maven-release-plugin][skip ci] prepare release 15.6 2022-07-14 11:24:53 +00:00
Damian Ujma
3990bc9db4 ACS-3271 Update MySQL 5.7.23 tests to 5.7.28 version (#1209)
* PRODSEC-6261 Add 'shouldNotGetProcessesByNotInvolvedUser' test

* PRODSEC-6261 Add user validation to 'getProcess' method

* PRODSEC-6261 Add TestRail annotation minor fix

* Update MySQL 5.7.23 tests to MySQL 7.7.28
2022-07-14 12:43:46 +02:00
Damian Ujma
f2207fe43e ACS-3150 Upgrade jackson and gson libraries - Backporting (#1211)
* PRODSEC-6261 Add 'shouldNotGetProcessesByNotInvolvedUser' test

* PRODSEC-6261 Add user validation to 'getProcess' method

* PRODSEC-6261 Add TestRail annotation minor fix

* ACS-3150 Upgrade jackson and gson libraries (#1151)

(cherry picked from commit f1a3aa696e)

Co-authored-by: Piotr Żurek <Piotr.Zurek@hyland.com>
2022-07-13 15:23:30 +02:00
rrajoria
f48db84334 PRODSEC-6115: Bump Surf webscript Version to 8.31 and removing set exception (#1207)
* PRODSEC-6115: Bump Surf webscript Version to 8.31

* Update NodeBrowserScript.java
2022-07-12 17:20:22 +05:30
Damian Ujma
98d73b7200 PRODSEC-6261 Fix workflow api bola - Backporting (#1206)
* PRODSEC-6261 Add 'shouldNotGetProcessesByNotInvolvedUser' test

* PRODSEC-6261 Add user validation to 'getProcess' method

* PRODSEC-6261 Add TestRail annotation minor fix
2022-07-12 13:12:15 +02:00
Travis CI User
acd4b1efcb [maven-release-plugin][skip ci] prepare for next development iteration 2022-05-18 09:14:26 +00:00
Travis CI User
4433dd009a [maven-release-plugin][skip ci] prepare release 15.5 2022-05-18 09:14:23 +00:00
mikolajbrzezinski
d1a9794ec8 MTN-22905 Fix case sensitivity issues on people search
MTN-22905 Fix case sensitivity issues on people search backport

* useCQ = true

* useCQ back to original

* useCQ = true

* Copyright Update

* useCQ restored, Javascrpit changed

* Javascript changes to filter

* PR comments requested change

* Revert "PR comments requested change"

This reverts commit 0673b6c3ff.

* Revert "useCQ restored, Javascrpit changed"

This reverts commit 00b79b5aca.

* Revert "Copyright Update"

This reverts commit 76d1f1c005.

* Revert "useCQ = true"

This reverts commit 215ad952f5.

* Revert "useCQ back to original"

This reverts commit deb5e82218.

* Revert "useCQ = true"

This reverts commit 115910ffc1.

* test change

* Initial changes

* Further changes

* Space deleted

* jobtitle search

* Restore check sorting and mock

* Avoid null [hint:useCQ]

* Wrong sign

* Fix

* Clean up

* Initial changes

* Rename Method

(cherry picked from commit 1ccb8a2164)
2022-05-18 10:27:53 +02:00
alandavis
bf848ff882 Revert "ACS-2864 Use maven props in AGS test version.properties so we don't have to update the value"
And correct the value to 2.7.1

This reverts commit 69ebccfc20.
2022-05-09 17:13:19 +01:00
alandavis
69ebccfc20 ACS-2864 Use maven props in AGS test version.properties so we don't have to update the value
(cherry picked from commit 4a4bb2de02)
(cherry picked from commit b36e21ad04)
2022-05-09 16:24:02 +01:00
Travis CI User
600b50fce1 [maven-release-plugin][skip ci] prepare for next development iteration 2022-05-03 10:25:04 +00:00
Travis CI User
37e8586658 [maven-release-plugin][skip ci] prepare release 15.4 2022-05-03 10:25:01 +00:00
evasques
4b12ed5a51 MNT-22968 - Bump Freemarker (#1094) 2022-05-03 08:41:14 +01:00
Travis CI User
e379b7704d [maven-release-plugin][skip ci] prepare for next development iteration 2022-04-07 16:28:45 +00:00
Travis CI User
297be122a6 [maven-release-plugin][skip ci] prepare release 15.3 2022-04-07 16:28:42 +00:00
Vítor Moreira
ca28024ad8 Fix/mnt 22946 spring rce databind jdk9 72 n (#1055)
* Bump dependency.webscripts.version from 8.28 to 8.29 (#1052)

(cherry picked from commit 22a0343c41)

* MNT-22946: bump spring version to 5.3.18 (#1054)

(cherry picked from commit 53777cd5b9)

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-04-07 16:40:27 +01:00
Travis CI User
cfd3255aa7 [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-21 18:14:39 +00:00
Travis CI User
4e436160cc [maven-release-plugin][skip ci] prepare release 15.2 2022-03-21 18:14:36 +00:00
alandavis
bf0ca4ca83 Set acs.version.revision versions 7.2.1 2022-03-21 17:09:20 +00:00
Travis CI User
e23a97960f [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-21 11:23:53 +00:00
Travis CI User
2d2371a792 [maven-release-plugin][skip ci] prepare release 15.1 2022-03-21 11:23:50 +00:00
alandavis
0ea69dd4ef Create release/7.2.N branch 2022-03-21 10:11:14 +00:00
Travis CI User
5ecd4c2593 [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-16 19:34:33 +00:00
Travis CI User
b4a2e2d8cf [maven-release-plugin][skip ci] prepare release 14.145 2022-03-16 19:34:30 +00:00
alandavis
bdf4fd7e16 Build again with replacement alfresco.aos-module 1.4.1 2022-03-16 18:48:07 +00:00
Travis CI User
e3b3e4b099 [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-16 10:07:00 +00:00
Travis CI User
895ab9dbbf [maven-release-plugin][skip ci] prepare release 14.144 2022-03-16 10:06:57 +00:00
dependabot[bot]
85054a7649 Bump license-maven-plugin from 2.0.1.alfresco-1 to 2.0.1.alfresco-2 (#1021)
Bumps [license-maven-plugin](https://github.com/mojohaus/license-maven-plugin) from 2.0.1.alfresco-1 to 2.0.1.alfresco-2.
- [Release notes](https://github.com/mojohaus/license-maven-plugin/releases)
- [Commits](https://github.com/mojohaus/license-maven-plugin/commits)

---
updated-dependencies:
- dependency-name: org.codehaus.mojo:license-maven-plugin
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-03-16 09:21:46 +00:00
Travis CI User
c60a182b89 [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-15 16:43:19 +00:00
Travis CI User
94ba463db8 [maven-release-plugin][skip ci] prepare release 14.143 2022-03-15 16:43:16 +00:00
Domenico Sibilio
cf67b6791a ACS-2307 Update IE/SS to 2.0.3 2022-03-15 16:25:31 +01:00
Travis CI User
506441a7ec [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-14 14:21:41 +00:00
Travis CI User
9b01a3fa1f [maven-release-plugin][skip ci] prepare release 14.142 2022-03-14 14:21:37 +00:00
Piotr Żurek
3dc00da2a5 ACS-2658 Fix missing metadata extraction while uploading new version (#1016) 2022-03-14 14:37:47 +01:00
Travis CI User
c20bdd754f [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-13 11:52:06 +00:00
Travis CI User
d3a52be71a [maven-release-plugin][skip ci] prepare release 14.141 2022-03-13 11:52:04 +00:00
Domenico Sibilio
07c3ca5bab ACS-2307 Update IE/SS to 2.0.3-RC5 2022-03-13 12:08:48 +01:00
Travis CI User
e593a17e88 [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-12 18:10:01 +00:00
Travis CI User
1975729174 [maven-release-plugin][skip ci] prepare release 14.140 2022-03-12 18:09:58 +00:00
Andrea Ligios
1491a9a7dd ACS-2664 - Bump Alfresco Messaging Repo to 1.2.19 2022-03-12 18:19:54 +01:00
Andrea Ligios
3923560588 ACS-2664 - Bump Alfresco Messaging Repo to 1.2.18 2022-03-12 16:59:29 +01:00
Andrea Ligios
af1aa6528b Bumped alfresco-messaging-repo to 1.2.17
To cope with the `camel-spring-xml` split from `camel-spring` in Camel 3.9
2022-03-11 18:01:49 +01:00
Nithin Nambiar
8cf9cd3ed5 ACS-1601 Node cleanup job improvements for Postgres 2022-03-11 17:58:22 +01:00
Travis CI User
b2dd06eef8 [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-11 11:36:52 +00:00
Travis CI User
8d46151e41 [maven-release-plugin][skip ci] prepare release 14.139 2022-03-11 11:36:49 +00:00
alandavis
3d5166b5d2 ACS-2669 Upgrade to t-core 2.5.7 2022-03-11 10:19:52 +00:00
Travis CI User
847af44db0 [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-10 14:34:55 +00:00
Travis CI User
66cd9d4194 [maven-release-plugin][skip ci] prepare release 14.138 2022-03-10 14:34:52 +00:00
Domenico Sibilio
a26eeef847 ACS-2307 Update SS/IE to 2.0.3-RC4 2022-03-10 14:43:20 +01:00
Travis CI User
7ba414eff9 [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-10 12:44:42 +00:00
Travis CI User
55abd66da6 [maven-release-plugin][skip ci] prepare release 14.137 2022-03-10 12:44:39 +00:00
Piotr Żurek
6ed43a9a87 ACS-2318 Upgrade tomcat base docker images (#1012) 2022-03-10 12:54:32 +01:00
Travis CI User
e657c2a1f7 [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-09 09:48:16 +00:00
Travis CI User
31aa55366f [maven-release-plugin][skip ci] prepare release 14.136 2022-03-09 09:48:13 +00:00
alandavis
c7f1f808a1 Upgrade to t-core 2.5.7-A11 2022-03-09 08:57:51 +00:00
Travis CI User
40b537b589 [maven-release-plugin][skip ci] prepare for next development iteration 2022-03-08 18:58:18 +00:00
64 changed files with 1995 additions and 1174 deletions

View File

@@ -60,7 +60,7 @@ jobs:
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.3 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7-A10
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7
script: travis_wait 20 mvn -B test -pl repository -Dtest=AppContext01TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - AppContext02TestSuite"
@@ -75,7 +75,7 @@ jobs:
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.3 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7-A10
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7
script: travis_wait 20 mvn -B test -pl repository -Dtest=AppContext03TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - AppContext04TestSuite"
@@ -83,7 +83,7 @@ jobs:
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.3 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7-A10
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7
script: travis_wait 20 mvn -B test -pl repository -Dtest=AppContext04TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - AppContext05TestSuite"
@@ -102,7 +102,7 @@ jobs:
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.3 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7-A10
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7
script: travis_wait 20 mvn -B test -pl repository -Dtest=AppContext06TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - AppContextExtraTestSuite"
@@ -110,7 +110,7 @@ jobs:
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.3 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7-A10
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7
script: travis_wait 20 mvn -B test -pl repository -Dtest=AppContextExtraTestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - MiscContextTestSuite"
@@ -118,7 +118,7 @@ jobs:
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.3 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7-A10
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7
script: travis_wait 20 mvn -B test -pl repository -Dtest=MiscContextTestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Repository - SearchTestSuite"
@@ -157,10 +157,10 @@ jobs:
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=AllDBTestsTestSuite -Ddb.name=alfresco -Ddb.url=jdbc:mariadb://localhost:3307/alfresco?useUnicode=yes\&characterEncoding=UTF-8 -Ddb.username=alfresco -Ddb.password=alfresco -Ddb.driver=org.mariadb.jdbc.Driver
- name: "Repository - MySQL 5.7.23 tests"
- name: "Repository - MySQL 5.7.28 tests"
if: (branch =~ /(release\/.*$|master)/ AND commit_message !~ /\[skip db\]/ AND type != pull_request) OR commit_message =~ /\[db\]/
before_script:
- docker run -d -p 3307:3306 -e MYSQL_ROOT_PASSWORD=alfresco -e MYSQL_USER=alfresco -e MYSQL_DATABASE=alfresco -e MYSQL_PASSWORD=alfresco mysql:5.7.23 --transaction-isolation='READ-COMMITTED' --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci
- docker run -d -p 3307:3306 -e MYSQL_ROOT_PASSWORD=alfresco -e MYSQL_USER=alfresco -e MYSQL_DATABASE=alfresco -e MYSQL_PASSWORD=alfresco mysql:5.7.28 --transaction-isolation='READ-COMMITTED' --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
script: travis_wait 20 mvn -B test -pl repository -Dtest=AllDBTestsTestSuite -Ddb.driver=com.mysql.jdbc.Driver -Ddb.name=alfresco -Ddb.url=jdbc:mysql://localhost:3307/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
@@ -241,7 +241,7 @@ jobs:
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.3 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7-A10
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7
script: travis_wait 20 mvn -B test -pl remote-api -Dtest=AppContext02TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Remote-api - AppContext03TestSuite"
@@ -249,7 +249,7 @@ jobs:
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.3 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7-A10
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7
script: travis_wait 20 mvn -B test -pl remote-api -Dtest=AppContext03TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Remote-api - AppContext04TestSuite"
@@ -257,7 +257,7 @@ jobs:
before_script:
- docker run -d -p 5433:5432 -e POSTGRES_PASSWORD=alfresco -e POSTGRES_USER=alfresco -e POSTGRES_DB=alfresco postgres:13.3 postgres -c 'max_connections=300'
- docker run -d -p 61616:61616 -p 5672:5672 alfresco/alfresco-activemq:5.16.1
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7-A10
- docker run -d -p 8090:8090 -e JAVA_OPTS=" -Xms256m -Xmx256m" alfresco/alfresco-transform-core-aio:2.5.7
script: travis_wait 20 mvn -B test -pl remote-api -Dtest=AppContext04TestSuite -Ddb.driver=org.postgresql.Driver -Ddb.name=alfresco -Ddb.url=jdbc:postgresql://localhost:5433/alfresco -Ddb.username=alfresco -Ddb.password=alfresco
- name: "Remote-api - AppContextExtraTestSuite"

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-amps</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<modules>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-governance-services-community-parent</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<modules>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-governance-services-automation-community-repo</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<build>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-governance-services-community-parent</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<modules>

View File

@@ -1,4 +1,4 @@
TRANSFORMERS_TAG=2.5.7-A10
SOLR6_TAG=2.0.3-RC3
TRANSFORMERS_TAG=2.5.7
SOLR6_TAG=2.0.3
POSTGRES_TAG=13.3
ACTIVEMQ_TAG=5.16.1

View File

@@ -8,7 +8,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-governance-services-community-repo-parent</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<properties>

View File

@@ -4,8 +4,8 @@
# Version label
version.major=7
version.minor=0
version.revision=0
version.minor=2
version.revision=2
version.label=
# Edition label
@@ -15,4 +15,4 @@ version.edition=Community
version.scmrevision=@scm-path@-r@scm-revision@
# Build number
version.build=r@scm-revision@-b@build-number@
version.build=r@scm-revision@-b@build-number@

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-governance-services-community-repo-parent</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<build>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<modules>

View File

@@ -8,7 +8,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-amps</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<properties>

View File

@@ -176,7 +176,6 @@ public class NodeBrowserScript extends NodeBrowserPost implements Serializable
{
status.setCode(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
status.setMessage(e.getMessage());
status.setException(e);
status.setRedirect(true);
}
return tmplMap;

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<dependencies>
@@ -128,8 +128,8 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>commons-dbcp</groupId>
<artifactId>commons-dbcp</artifactId>
<groupId>org.apache.commons</groupId>
<artifactId>commons-dbcp2</artifactId>
<scope>test</scope>
</dependency>
</dependencies>

View File

@@ -21,7 +21,7 @@ package org.alfresco.util.transaction;
import org.alfresco.error.AlfrescoRuntimeException;
/**
* Exception wraps {@link java.util.NoSuchElementException} from {@link org.apache.commons.dbcp.BasicDataSource}
* Exception wraps {@link java.util.NoSuchElementException} from {@link org.apache.commons.dbcp2.BasicDataSource}
*
* @author alex.mukha
* @since 4.1.9

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<properties>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<dependencies>

View File

@@ -9,6 +9,6 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-packaging</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
</project>

View File

@@ -1,6 +1,6 @@
# Fetch image based on Tomcat 9.0, Java 11 and Centos 7
# More infos about this image: https://github.com/Alfresco/alfresco-docker-base-tomcat
FROM alfresco/alfresco-base-tomcat:tomcat9-jre11-centos7-202203071500
FROM alfresco/alfresco-base-tomcat:tomcat9-jre11-centos7-202203091924
# Set default docker_context.
ARG resource_path=target

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-packaging</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<properties>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<modules>

View File

@@ -1,4 +1,4 @@
TRANSFORMERS_TAG=2.5.7-A10
SOLR6_TAG=2.0.3-RC3
TRANSFORMERS_TAG=2.5.7
SOLR6_TAG=2.0.3
POSTGRES_TAG=13.3
ACTIVEMQ_TAG=5.16.1

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-packaging</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<modules>

View File

@@ -9,7 +9,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-tests</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<developers>

View File

@@ -9,7 +9,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-tests</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<developers>

View File

@@ -9,7 +9,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-tests</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<developers>

View File

@@ -9,7 +9,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-tests</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<developers>

View File

@@ -16,7 +16,7 @@ import org.testng.annotations.Test;
*/
public class GetProcessSanityTests extends RestTest
{
private UserModel userWhoStartsProcess, assignee;
private UserModel userWhoStartsProcess, assignee, user;
private RestProcessModel addedProcess, process;
@BeforeClass(alwaysRun = true)
@@ -24,6 +24,7 @@ public class GetProcessSanityTests extends RestTest
{
userWhoStartsProcess = dataUser.createRandomTestUser();
assignee = dataUser.createRandomTestUser();
user = dataUser.createRandomTestUser();
addedProcess = restClient.authenticateUser(userWhoStartsProcess).withWorkflowAPI().addProcess("activitiAdhoc", assignee, false, CMISUtil.Priority.High);
}
@@ -59,4 +60,13 @@ public class GetProcessSanityTests extends RestTest
process.assertThat().field("id").is(addedProcess.getId())
.and().field("startUserId").is(addedProcess.getStartUserId());
}
@TestRail(section = { TestGroup.REST_API, TestGroup.PROCESSES }, executionType = ExecutionType.SANITY,
description = "Verify User that is not involved in a process cannot get that process using REST API and status code is FORBIDDEN (403)")
@Test(groups = { TestGroup.REST_API, TestGroup.WORKFLOW, TestGroup.PROCESSES, TestGroup.SANITY })
public void shouldNotGetProcessesByNotInvolvedUser() throws Exception
{
process = restClient.authenticateUser(user).withWorkflowAPI().usingProcess(addedProcess).getProcess();
restClient.assertStatusCodeIs(HttpStatus.FORBIDDEN);
}
}

View File

@@ -9,7 +9,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-tests</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<developers>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo-packaging</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<properties>

pom.xml
View File

@@ -2,7 +2,7 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>alfresco-community-repo</artifactId>
<version>14.135</version>
<version>16.10</version>
<packaging>pom</packaging>
<name>Alfresco Community Repo Parent</name>
@@ -25,7 +25,7 @@
<properties>
<acs.version.major>7</acs.version.major>
<acs.version.minor>2</acs.version.minor>
<acs.version.revision>0</acs.version.revision>
<acs.version.revision>2</acs.version.revision>
<acs.version.label />
<amp.min.version>${acs.version.major}.0.0</amp.min.version>
@@ -47,7 +47,7 @@
<dependency.alfresco-trashcan-cleaner.version>2.4.1</dependency.alfresco-trashcan-cleaner.version>
<dependency.alfresco-jlan.version>7.1</dependency.alfresco-jlan.version>
<dependency.alfresco-server-root.version>6.0.1</dependency.alfresco-server-root.version>
<dependency.alfresco-messaging-repo.version>1.2.15</dependency.alfresco-messaging-repo.version>
<dependency.alfresco-messaging-repo.version>1.2.19</dependency.alfresco-messaging-repo.version>
<dependency.alfresco-log-sanitizer.version>0.2</dependency.alfresco-log-sanitizer.version>
<dependency.activiti-engine.version>5.23.0</dependency.activiti-engine.version>
<dependency.activiti.version>5.23.0</dependency.activiti.version>
@@ -55,19 +55,19 @@
<dependency.alfresco-greenmail.version>6.2</dependency.alfresco-greenmail.version>
<dependency.acs-event-model.version>0.0.13</dependency.acs-event-model.version>
<dependency.spring.version>5.3.15</dependency.spring.version>
<dependency.spring.version>5.3.18</dependency.spring.version>
<dependency.antlr.version>3.5.2</dependency.antlr.version>
<dependency.jackson.version>2.13.1</dependency.jackson.version>
<dependency.jackson-databind.version>2.13.1</dependency.jackson-databind.version>
<dependency.jackson.version>2.13.3</dependency.jackson.version>
<dependency.jackson-databind.version>2.13.3</dependency.jackson-databind.version>
<dependency.cxf.version>3.5.0</dependency.cxf.version>
<dependency.opencmis.version>1.0.0</dependency.opencmis.version>
<dependency.webscripts.version>8.28</dependency.webscripts.version>
<dependency.webscripts.version>8.31</dependency.webscripts.version>
<dependency.bouncycastle.version>1.70</dependency.bouncycastle.version>
<dependency.mockito-core.version>3.11.2</dependency.mockito-core.version>
<dependency.org-json.version>20211205</dependency.org-json.version>
<dependency.commons-dbcp.version>1.4-DBCP330</dependency.commons-dbcp.version>
<dependency.commons-dbcp.version>2.9.0</dependency.commons-dbcp.version>
<dependency.commons-io.version>2.11.0</dependency.commons-io.version>
<dependency.gson.version>2.8.5</dependency.gson.version>
<dependency.gson.version>2.8.9</dependency.gson.version>
<dependency.httpclient.version>4.5.13</dependency.httpclient.version>
<dependency.httpcore.version>4.4.15</dependency.httpcore.version>
<dependency.commons-httpclient.version>3.1-HTTPCLIENT-1265</dependency.commons-httpclient.version>
@@ -75,7 +75,7 @@
<dependency.slf4j.version>1.7.35</dependency.slf4j.version>
<dependency.gytheio.version>0.16</dependency.gytheio.version>
<dependency.groovy.version>3.0.9</dependency.groovy.version>
<dependency.tika.version>2.2.1</dependency.tika.version>
<dependency.tika.version>2.2.1</dependency.tika.version>
<dependency.spring-security.version>5.6.1</dependency.spring-security.version>
<dependency.truezip.version>7.7.10</dependency.truezip.version>
<dependency.poi.version>4.1.2</dependency.poi.version>
@@ -106,11 +106,11 @@
<dependency.jakarta-rpc-api.version>1.1.4</dependency.jakarta-rpc-api.version>
<alfresco.googledrive.version>3.2.1.3</alfresco.googledrive.version>
<alfresco.aos-module.version>1.4.1</alfresco.aos-module.version>
<alfresco.api-explorer.version>7.2.0</alfresco.api-explorer.version> <!-- Also in alfresco-enterprise-share -->
<alfresco.aos-module.version>1.4.1</alfresco.aos-module.version>
<alfresco.api-explorer.version>7.2.1</alfresco.api-explorer.version> <!-- Also in alfresco-enterprise-share -->
<alfresco.maven-plugin.version>2.2.0</alfresco.maven-plugin.version>
<license-maven-plugin.version>2.0.1.alfresco-1</license-maven-plugin.version>
<license-maven-plugin.version>2.0.1.alfresco-2</license-maven-plugin.version>
<dependency.postgresql.version>42.3.2</dependency.postgresql.version>
<dependency.mysql.version>8.0.27</dependency.mysql.version>
@@ -146,7 +146,7 @@
<connection>scm:git:https://github.com/Alfresco/alfresco-community-repo.git</connection>
<developerConnection>scm:git:https://github.com/Alfresco/alfresco-community-repo.git</developerConnection>
<url>https://github.com/Alfresco/alfresco-community-repo</url>
<tag>14.135</tag>
<tag>16.10</tag>
</scm>
<distributionManagement>
@@ -755,8 +755,8 @@
<version>${dependency.mockito-core.version}</version>
</dependency>
<dependency>
<groupId>commons-dbcp</groupId>
<artifactId>commons-dbcp</artifactId>
<groupId>org.apache.commons</groupId>
<artifactId>commons-dbcp2</artifactId>
<version>${dependency.commons-dbcp.version}</version>
</dependency>
<dependency>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<dependencies>

View File

@@ -511,7 +511,9 @@ public class ProcessesImpl extends WorkflowRestImpl implements Processes
{
throw new InvalidArgumentException("processId is required to get the process info");
}
validateIfUserAllowedToWorkWithProcess(processId);
HistoricProcessInstance processInstance = activitiProcessEngine
.getHistoryService()
.createHistoricProcessInstanceQuery()

View File

@@ -3,6 +3,7 @@ function main()
// Get the args
var filter = args["filter"];
if (filter!==null && !filter.includes(":")) {filter += " [hint:useCQ]";}
var maxResults = args["maxResults"];
var skipCountStr = args["skipCount"];
var skipCount = skipCountStr != null ? parseInt(skipCountStr) : -1;

View File

@@ -3,7 +3,7 @@ function main()
// Get the args
var siteShortName = url.templateArgs.shortname,
site = siteService.getSite(siteShortName),
filter = (args.filter != null) ? args.filter : (args.shortNameFilter != null) ? args.shortNameFilter : "",
filter = ((args.filter != null) ? args.filter : (args.shortNameFilter != null) ? args.shortNameFilter : "" )+ " [hint:useCQ]",
maxResults = (args.maxResults == null) ? 10 : parseInt(args.maxResults, 10),
authorityType = args.authorityType,
zone = args.zone,

View File

@@ -357,12 +357,13 @@ function main()
updateNode.properties.content.write(content, updateNameAndMimetype, true, newFilename);
// check it in again, with supplied version history note
updateNode = updateNode.checkin(description, majorVersion);
// Extract the metadata
// (The overwrite policy controls which if any parts of
// the document's properties are updated from this)
extractMetadata(updateNode);
updateNode = updateNode.checkin(description, majorVersion);
if (aspects.length != 0)
{
for (i = 0; i < aspects.length; i++)

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.alfresco</groupId>
<artifactId>alfresco-community-repo</artifactId>
<version>14.135</version>
<version>16.10</version>
</parent>
<dependencies>
@@ -77,8 +77,8 @@
</dependency>
<dependency>
<groupId>commons-dbcp</groupId>
<artifactId>commons-dbcp</artifactId>
<groupId>org.apache.commons</groupId>
<artifactId>commons-dbcp2</artifactId>
</dependency>
<dependency>
<groupId>commons-fileupload</groupId>
@@ -236,7 +236,7 @@
<dependency>
<groupId>org.freemarker</groupId>
<artifactId>freemarker</artifactId>
<version>2.3.20-alfresco-patched-20200421</version>
<version>2.3.20-alfresco-patched-20220413</version>
</dependency>
<dependency>
<groupId>org.apache.xmlbeans</groupId>

View File

@@ -51,7 +51,7 @@ import org.alfresco.service.cmr.workflow.WorkflowAdminService;
import org.alfresco.service.transaction.TransactionService;
import org.alfresco.traitextender.SpringExtensionBundle;
import org.alfresco.util.PropertyCheck;
import org.apache.commons.dbcp.BasicDataSource;
import org.apache.commons.dbcp2.BasicDataSource;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.InitializingBean;
@@ -89,7 +89,7 @@ import javax.sql.DataSource;
* </li>
* <li><b>db:</b> Database configuration
* <ul>
* <li>maxConnections: int - The maximum number of active connections. {@link BasicDataSource#getMaxActive()}</li>
* <li>maxConnections: int - The maximum number of active connections. {@link BasicDataSource#getMaxTotal()}</li>
* </ul>
* </li>
* <li><b>authentication</b>: Authentication configuration.
@@ -326,7 +326,7 @@ public class ConfigurationDataCollector extends HBBaseDataCollector implements I
if (dataSource instanceof BasicDataSource)
{
Map<String, Object> db = new HashMap<>();
db.put("maxConnections", ((BasicDataSource) dataSource).getMaxActive());
db.put("maxConnections", ((BasicDataSource) dataSource).getMaxTotal());
configurationValues.put("db", db);
}

View File

@@ -31,7 +31,7 @@ import org.alfresco.heartbeat.datasender.HBData;
import org.alfresco.heartbeat.jobs.HeartBeatJobScheduler;
import org.alfresco.repo.descriptor.DescriptorDAO;
import org.alfresco.util.PropertyCheck;
import org.apache.commons.dbcp.BasicDataSource;
import org.apache.commons.dbcp2.BasicDataSource;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.InitializingBean;

View File

@@ -35,6 +35,7 @@ import java.util.Collections;
import java.util.Date;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.List;
import java.util.Locale;

View File

@@ -29,6 +29,7 @@ import java.io.Serializable;
import java.util.Collection;
import java.util.Date;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Locale;
import java.util.Map;
@@ -944,4 +945,49 @@ public interface NodeDAO extends NodeBulkLoader
*/
public Long getNextTxCommitTime(Long fromCommitTime);
/**
*
* @param maxCommitTime
* @return Iterator over node ids
*/
default public Iterator<Long> selectDeletedNodesByCommitTime(long maxCommitTime)
{
throw new UnsupportedOperationException("Not Implemented");
}
/**
* Purge the nodes marked as deleted
* @param minAge
* @param deleteBatchSize
* @return the count of nodes deleted in each batch
*/
default public List<String> purgeDeletedNodes(long minAge, int deleteBatchSize)
{
throw new UnsupportedOperationException("This operation is not supported");
}
/**
*
* @param maxCommitTime
* @return Iterator over transaction ids
*/
default public Iterator<Long> selectUnusedTransactionsByCommitTime(long maxCommitTime)
{
throw new UnsupportedOperationException("Not Implemented");
}
/**
* Purge the transactions of purged nodes
* @param minAge
* @param deleteBatchSize
* @return the count of transactions deleted in each batch
*/
default public List<String> purgeEmptyTransactions(long minAge, int deleteBatchSize)
{
throw new UnsupportedOperationException("This operation is not supported");
}
}
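A minimal usage sketch of the two new purge defaults added above (not code from the repository; the NodeDAO package is inferred from the ibatis subpackage in the next file, and the age/batch values are illustrative):

import java.util.List;

import org.alfresco.repo.domain.node.NodeDAO;

public class NodePurgeSketch
{
    // Illustrative caller: purges deleted nodes and empty transactions older than
    // seven days, in batches of 1000, if the DAO implementation supports it.
    public static void purge(NodeDAO nodeDAO)
    {
        long minAgeMs = 7L * 24 * 60 * 60 * 1000;
        try
        {
            List<String> nodeResults = nodeDAO.purgeDeletedNodes(minAgeMs, 1000);
            List<String> txnResults = nodeDAO.purgeEmptyTransactions(minAgeMs, 1000);
            System.out.println(nodeResults);
            System.out.println(txnResults);
        }
        catch (UnsupportedOperationException e)
        {
            // The DAO in use has not overridden the new defaults; only the original
            // time-window purge is available.
        }
    }
}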

View File

@@ -25,16 +25,6 @@
*/
package org.alfresco.repo.domain.node.ibatis;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.SortedSet;
import org.alfresco.error.AlfrescoRuntimeException;
import org.alfresco.ibatis.IdsEntity;
import org.alfresco.model.ContentModel;
@@ -65,6 +55,7 @@ import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.repository.StoreRef;
import org.alfresco.service.namespace.QName;
import org.alfresco.util.Pair;
import org.apache.ibatis.cursor.Cursor;
import org.apache.ibatis.executor.result.DefaultResultContext;
import org.apache.ibatis.session.ResultContext;
import org.apache.ibatis.session.ResultHandler;
@@ -72,6 +63,17 @@ import org.apache.ibatis.session.RowBounds;
import org.mybatis.spring.SqlSessionTemplate;
import org.springframework.util.Assert;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.SortedSet;
/**
* iBatis-specific extension of the Node abstract DAO
@@ -166,6 +168,12 @@ public class NodeDAOImpl extends AbstractNodeDAOImpl
private static final String SELECT_TXN_MIN_TX_ID_IN_NODE_IDRANGE = "alfresco.node.select_TxnMinTxIdInNodeIdRange";
private static final String SELECT_TXN_MAX_TX_ID_IN_NODE_IDRANGE = "alfresco.node.select_TxnMaxTxIdInNodeIdRange";
private static final String SELECT_TXN_NEXT_TXN_COMMIT_TIME = "select_TxnNextTxnCommitTime";
private static final String SELECT_NODES_DELETED_BY_TXN_COMMIT_TIME = "alfresco.node.select.select_Deleted_NodesByTxnCommitTime";
private static final String DELETE_NODES_BY_ID = "alfresco.node.delete_NodesById";
private static final String DELETE_NODE_PROPS_BY_NODE_ID = "alfresco.node.delete_NodePropsByNodeId";
private static final String SELECT_TXNS_UNUSED_BY_TXN_COMMIT_TIME = "alfresco.node.select.select_Txns_UnusedByTxnCommitTime";
private static final String DELETE_TXNS_UNUSED_BY_ID = "alfresco.node.delete_Txns_UnusedById";
protected QNameDAO qnameDAO;
protected DictionaryService dictionaryService;
@@ -1794,8 +1802,138 @@ public class NodeDAOImpl extends AbstractNodeDAOImpl
return template.selectOne(SELECT_TXN_NEXT_TXN_COMMIT_TIME, fromCommitTimeEntity);
}
public Iterator<Long> selectDeletedNodesByCommitTime(long maxCommitTime)
{
// Get the deleted nodes
Pair<Long, QName> deletedTypePair = qnameDAO.getQName(ContentModel.TYPE_DELETED);
if (deletedTypePair == null)
{
// Nothing to do
return null;
}
TransactionQueryEntity transactionQueryEntity = new TransactionQueryEntity();
transactionQueryEntity.setMaxCommitTime(maxCommitTime);
transactionQueryEntity.setTypeQNameId(deletedTypePair.getFirst());
Cursor<Long> cursor = template.selectCursor(SELECT_NODES_DELETED_BY_TXN_COMMIT_TIME, transactionQueryEntity);
return cursor.iterator();
}
public Iterator<Long> selectUnusedTransactionsByCommitTime(long maxCommitTime)
{
TransactionQueryEntity maxCommitTimeEntity = new TransactionQueryEntity();
maxCommitTimeEntity.setMaxCommitTime(maxCommitTime);
Cursor<Long> cursor = template.selectCursor(SELECT_TXNS_UNUSED_BY_TXN_COMMIT_TIME, maxCommitTimeEntity);
return cursor.iterator();
}
@Override
public List<String> purgeDeletedNodes(long minAge, int deleteBatchSize)
{
final long maxCommitTime = System.currentTimeMillis() - minAge;
Iterator<Long> nodeIdIterator = this.selectDeletedNodesByCommitTime(maxCommitTime);
ArrayList<Long> nodeIdList = new ArrayList<>();
List<String> deleteResult = new ArrayList<>();
if (isDebugEnabled)
{
logger.debug("nodes selected for deletion, deleteBatchSize:" + deleteBatchSize);
}
while (nodeIdIterator != null && nodeIdIterator.hasNext())
{
if (deleteBatchSize == nodeIdList.size())
{
int count = deleteSelectedNodesAndProperties(nodeIdList);
if (isDebugEnabled)
{
logger.debug("nodes deleted:" + count);
}
deleteResult.add("Purged old nodes: " + count);
nodeIdList.clear();
}
else
{
nodeIdList.add(nodeIdIterator.next());
}
}
if (nodeIdList.size() > 0)
{
int count = deleteSelectedNodesAndProperties(nodeIdList);
if (isDebugEnabled)
{
logger.debug("remaining nodes deleted:" + count);
}
deleteResult.add("Purged old nodes: " + count);
nodeIdList.clear();
}
return deleteResult;
}
public List<String> purgeEmptyTransactions(long minAge, int deleteBatchSize)
{
final long maxCommitTime = System.currentTimeMillis() - minAge;
Iterator<Long> transactionIdIterator = this.selectUnusedTransactionsByCommitTime(maxCommitTime);
ArrayList<Long> transactionIdList = new ArrayList<>();
List<String> deleteResult = new ArrayList<>();
if (isDebugEnabled)
{
logger.debug("transactions selected for deletion, deleteBatchSize:" + deleteBatchSize);
}
while (transactionIdIterator.hasNext())
{
if (deleteBatchSize == transactionIdList.size())
{
int count = deleteSelectedTransactions(transactionIdList);
deleteResult.add("Purged old transactions: " + count);
if (isDebugEnabled)
{
logger.debug("transactions deleted:" + count);
}
transactionIdList.clear();
}
else
{
transactionIdList.add(transactionIdIterator.next());
}
}
if (transactionIdList.size() > 0)
{
int count = deleteSelectedTransactions(transactionIdList);
deleteResult.add("Purged old transactions: " + count);
if (isDebugEnabled)
{
logger.debug("final batch of transactions deleted:" + count);
}
transactionIdList.clear();
}
return deleteResult;
}
private int deleteSelectedNodesAndProperties(List<Long> nodeIdList)
{
int cnt = template.delete(DELETE_NODE_PROPS_BY_NODE_ID, nodeIdList);
if (isDebugEnabled)
{
logger.debug("nodes props deleted:" + cnt);
}
// Finally, remove the nodes
cnt = template.delete(DELETE_NODES_BY_ID, nodeIdList);
if (isDebugEnabled)
{
logger.debug("nodes deleted:" + cnt);
}
return cnt;
}
private int deleteSelectedTransactions(List<Long> transactionIdList)
{
return template.delete(DELETE_TXNS_UNUSED_BY_ID, transactionIdList);
}
/*
* DAO OVERRIDES
*/
@@ -1936,4 +2074,4 @@ public class NodeDAOImpl extends AbstractNodeDAOImpl
assocQName);
}
}
}
}

View File

@@ -50,6 +50,13 @@ public class DeletedNodeCleanupWorker extends AbstractNodeCleanupWorker
// of the chunk (in ms). Default is a couple of hours.
private int purgeSize = 7200000; // ms
//to determine if we need a time based window deletion of nodes or in fixed size batches.
private String algorithm;
private int deleteBatchSize;
private static final String NODE_TABLE_CLEANER_ALG_V2 = "V2";
/**
* Default constructor
*/
@@ -67,15 +74,57 @@ public class DeletedNodeCleanupWorker extends AbstractNodeCleanupWorker
{
return Collections.singletonList("Minimum purge age is negative; purge disabled");
}
List<String> purgedNodes = purgeOldDeletedNodes(minPurgeAgeMs);
List<String> purgedTxns = purgeOldEmptyTransactions(minPurgeAgeMs);
List<String> allResults = new ArrayList<String>(100);
List<String> purgedNodes, purgedTxns;
if (NODE_TABLE_CLEANER_ALG_V2.equals(algorithm))
{
refreshLock();
if (logger.isDebugEnabled())
{
logger.debug("DeletedNodeCleanupWorker using batch deletion: About to execute the clean up nodes ");
}
purgedNodes = purgeOldDeletedNodesV2(minPurgeAgeMs);
if (logger.isDebugEnabled())
{
logger.debug(purgedNodes);
}
refreshLock();
if (logger.isDebugEnabled())
{
logger.debug("DeletedNodeCleanupWorker: About to execute the clean up txns ");
}
purgedTxns = purgeOldEmptyTransactionsV2(minPurgeAgeMs);
}
else
{
if (logger.isDebugEnabled())
{
logger.debug("DeletedNodeCleanupWorker: About to start purgeOldDeletedNodes ");
}
purgedNodes = purgeOldDeletedNodes(minPurgeAgeMs);
logger.debug(purgedNodes);
if (logger.isDebugEnabled())
{
logger.debug("DeletedNodeCleanupWorker: About to start purgeOldEmptyTransactions ");
}
purgedTxns = purgeOldEmptyTransactions(minPurgeAgeMs);
}
if (logger.isDebugEnabled())
{
logger.debug(purgedTxns);
}
List<String> allResults = new ArrayList<>(100);
allResults.addAll(purgedNodes);
allResults.addAll(purgedTxns);
// Done
return allResults;
}
@@ -110,7 +159,17 @@ public class DeletedNodeCleanupWorker extends AbstractNodeCleanupWorker
this.purgeSize = purgeSize;
}
/**
public void setAlgorithm(String algorithm)
{
this.algorithm = algorithm;
}
public void setDeleteBatchSize(int deleteBatchSize)
{
this.deleteBatchSize = deleteBatchSize;
}
/**
* Cleans up deleted nodes that are older than the given minimum age.
*
* @param minAge the minimum age of a transaction or deleted node
@@ -122,10 +181,12 @@ public class DeletedNodeCleanupWorker extends AbstractNodeCleanupWorker
final long maxCommitTime = System.currentTimeMillis() - minAge;
long fromCommitTime = fromCustomCommitTime;
if (fromCommitTime <= 0L)
{
fromCommitTime = nodeDAO.getMinTxnCommitTimeForDeletedNodes().longValue();
}
if ( fromCommitTime == 0L )
{
String msg = "There are no old nodes to purge.";
@@ -134,7 +195,10 @@ public class DeletedNodeCleanupWorker extends AbstractNodeCleanupWorker
}
long loopPurgeSize = purgeSize;
Long purgeCount = new Long(0);
if(logger.isDebugEnabled())
{
logger.debug("DeletedNodeCleanupWorker: purgeOldDeletedNodes started ");
}
while (true)
{
// Ensure we keep the lock
@@ -153,9 +217,9 @@ public class DeletedNodeCleanupWorker extends AbstractNodeCleanupWorker
try
{
DeleteNodesByTransactionsCallback purgeNodesCallback = new DeleteNodesByTransactionsCallback(nodeDAO, fromCommitTime, toCommitTime);
purgeCount = txnHelper.doInTransaction(purgeNodesCallback, false, true);
Long purgeCount = txnHelper.doInTransaction(purgeNodesCallback, false, true);
if (purgeCount.longValue() > 0)
if (purgeCount > 0)
{
String msg =
"Purged old nodes: \n" +
@@ -220,7 +284,8 @@ public class DeletedNodeCleanupWorker extends AbstractNodeCleanupWorker
break;
}
}
logger.debug("DeletedNodeCleanupWorker: purgeOldDeletedNodes finished ");
// Done
return results;
}
@@ -245,6 +310,10 @@ public class DeletedNodeCleanupWorker extends AbstractNodeCleanupWorker
{
fromCommitTime = nodeDAO.getMinUnusedTxnCommitTime().longValue();
}
if(logger.isDebugEnabled())
{
logger.debug("DeletedNodeCleanupWorker: purgeOldEmptyTransactions started ");
}
// delete unused transactions in batches of size 'purgeTxnBlockSize'
while (true)
{
@@ -298,14 +367,46 @@ public class DeletedNodeCleanupWorker extends AbstractNodeCleanupWorker
}
fromCommitTime += purgeSize;
if(fromCommitTime >= maxCommitTime)
if (fromCommitTime >= maxCommitTime)
{
break;
break;
}
}
logger.debug("DeletedNodeCleanupWorker: purgeOldEmptyTransactions finished ");
// Done
return results;
}
private List<String> purgeOldDeletedNodesV2(long minAge)
{
refreshLock();
final List<String> returnList = new ArrayList<>();
RetryingTransactionHelper txnHelper = transactionService.getRetryingTransactionHelper();
RetryingTransactionCallback<Void> callback = () -> {
returnList.addAll(nodeDAO.purgeDeletedNodes(minAge, deleteBatchSize));
return null;
};
txnHelper.doInTransaction(callback, false, true);
return returnList;
}
private List<String> purgeOldEmptyTransactionsV2(long minAge)
{
refreshLock();
final List<String> returnList = new ArrayList<>();
RetryingTransactionHelper txnHelper = transactionService.getRetryingTransactionHelper();
RetryingTransactionCallback<Void> callback = () -> {
returnList.addAll(nodeDAO.purgeEmptyTransactions(minAge, deleteBatchSize));
return null;
};
txnHelper.doInTransaction(callback, false, true);
return returnList;
}
private static abstract class DeleteByTransactionsCallback implements RetryingTransactionCallback<Long>
{
@@ -356,4 +457,5 @@ public class DeletedNodeCleanupWorker extends AbstractNodeCleanupWorker
return count;
}
}
}
}
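The batch-based cleanup is opt-in through the two new setters; a hedged wiring sketch (the package is assumed from AbstractNodeCleanupWorker's usual location, and the batch size is an illustrative value, not a repository default):

import org.alfresco.repo.node.cleanup.DeletedNodeCleanupWorker;

public class CleanupWorkerWiringSketch
{
    public static void enableBatchPurge(DeletedNodeCleanupWorker worker)
    {
        // "V2" matches NODE_TABLE_CLEANER_ALG_V2 above; any other value keeps the
        // original time-window purge path.
        worker.setAlgorithm("V2");
        worker.setDeleteBatchSize(1000);
    }
}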

View File

@@ -1,33 +1,33 @@
/*
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2016 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
/*
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2016 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.tenant;
import java.sql.SQLException;
import org.apache.commons.dbcp.BasicDataSource;
import org.apache.commons.dbcp2.BasicDataSource;
/**
* Experimental
@@ -41,7 +41,7 @@ public class TenantBasicDataSource extends BasicDataSource
{
// tenant-specific
this.setUrl(tenantUrl);
this.setMaxActive(tenantMaxActive == -1 ? bds.getMaxActive() : tenantMaxActive);
this.setMaxTotal(tenantMaxActive == -1 ? bds.getMaxTotal() : tenantMaxActive);
// defaults/overrides - see also 'baseDefaultDataSource' (core-services-context.xml + repository.properties)
@@ -54,7 +54,7 @@ public class TenantBasicDataSource extends BasicDataSource
this.setMaxIdle(bds.getMaxIdle());
this.setDefaultAutoCommit(bds.getDefaultAutoCommit());
this.setDefaultTransactionIsolation(bds.getDefaultTransactionIsolation());
this.setMaxWait(bds.getMaxWait());
this.setMaxWaitMillis(bds.getMaxWaitMillis());
this.setValidationQuery(bds.getValidationQuery());
this.setTimeBetweenEvictionRunsMillis(bds.getTimeBetweenEvictionRunsMillis());
this.setMinEvictableIdleTimeMillis(bds.getMinEvictableIdleTimeMillis());
@@ -62,7 +62,7 @@ public class TenantBasicDataSource extends BasicDataSource
this.setTestOnBorrow(bds.getTestOnBorrow());
this.setTestOnReturn(bds.getTestOnReturn());
this.setTestWhileIdle(bds.getTestWhileIdle());
this.setRemoveAbandoned(bds.getRemoveAbandoned());
this.setRemoveAbandonedOnBorrow(bds.getRemoveAbandonedOnBorrow());
this.setRemoveAbandonedTimeout(bds.getRemoveAbandonedTimeout());
this.setPoolPreparedStatements(bds.isPoolPreparedStatements());
this.setMaxOpenPreparedStatements(bds.getMaxOpenPreparedStatements());

View File

@@ -1,28 +1,28 @@
/*
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2016 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
/*
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2016 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.tenant;
import java.sql.SQLException;
@@ -32,7 +32,7 @@ import java.util.Map;
import javax.sql.DataSource;
import org.alfresco.repo.security.authentication.AuthenticationUtil;
import org.apache.commons.dbcp.BasicDataSource;
import org.apache.commons.dbcp2.BasicDataSource;
import org.springframework.extensions.surf.util.ParameterCheck;
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

View File

@@ -30,7 +30,7 @@
<property name="initialSize">
<value>0</value>
</property>
<property name="maxActive">
<property name="maxTotal">
<value>1</value>
</property>
<property name="maxIdle">

View File

@@ -156,7 +156,7 @@
<bean id="defaultDataSource" parent="baseDefaultDataSource" />
<!-- Datasource bean -->
<bean id="baseDefaultDataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close" abstract="true">
<bean id="baseDefaultDataSource" class="org.apache.commons.dbcp2.BasicDataSource" destroy-method="close" abstract="true">
<property name="driverClassName">
<value>${db.driver}</value>
</property>
@@ -172,7 +172,7 @@
<property name="initialSize" >
<value>${db.pool.initial}</value>
</property>
<property name="maxActive" >
<property name="maxTotal" >
<value>${db.pool.max}</value>
</property>
<property name="minIdle" >
@@ -187,7 +187,7 @@
<property name="defaultTransactionIsolation" >
<value>${db.txn.isolation}</value>
</property>
<property name="maxWait" >
<property name="maxWaitMillis" >
<value>${db.pool.wait.max}</value>
</property>
<property name="validationQuery" >
@@ -211,7 +211,7 @@
<property name="testWhileIdle" >
<value>${db.pool.evict.validate}</value>
</property>
<property name="removeAbandoned" >
<property name="removeAbandonedOnBorrow" >
<value>${db.pool.abandoned.detect}</value>
</property>
<property name="removeAbandonedTimeout" >

View File

@@ -203,6 +203,7 @@ Inbound settings from iBatis
<mapper resource="alfresco/ibatis/#resource.dialect#/content-insert-SqlMap.xml"/>
<mapper resource="alfresco/ibatis/#resource.dialect#/node-common-SqlMap.xml"/>
<mapper resource="alfresco/ibatis/#resource.dialect#/node-select-children-SqlMap.xml"/>
<mapper resource="alfresco/ibatis/#resource.dialect#/node-select-SqlMap.xml"/>
<mapper resource="alfresco/ibatis/#resource.dialect#/node-update-SqlMap.xml"/>
<mapper resource="alfresco/ibatis/#resource.dialect#/node-delete-SqlMap.xml"/>
<mapper resource="alfresco/ibatis/#resource.dialect#/node-insert-SqlMap.xml"/>

View File

@@ -1505,5 +1505,32 @@
where
commit_time_ms > #{minCommitTime}
</select>
<delete id="delete_NodesById" parameterType="list">
delete from alf_node
where
id IN
<foreach item="item" index="index" collection="list" open="(" separator="," close=")">
#{item}
</foreach>
</delete>
<delete id="delete_NodePropsByNodeId" parameterType="list">
delete from alf_node_properties
where
node_id IN
<foreach item="item" index="index" collection="list" open="(" separator="," close=")">
#{item}
</foreach>
</delete>
<delete id="delete_Txns_UnusedById" parameterType="list">
delete from alf_transaction
where
id in
<foreach item="item" index="index" collection="list" open="(" separator="," close=")">
#{item}
</foreach>
</delete>
</mapper>
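Note: the three new <delete> statements batch-remove node rows, their properties and unused transactions by id list. A sketch of how such mapped statements are typically invoked through MyBatis; the DAO class and the "alfresco.node" statement namespace are assumptions for illustration, not taken from this diff.

import java.util.List;

import org.apache.ibatis.session.SqlSession;

public class NodeBatchDeleteDao
{
    private final SqlSession sqlSession;

    public NodeBatchDeleteDao(SqlSession sqlSession)
    {
        this.sqlSession = sqlSession;
    }

    public int deleteNodesById(List<Long> nodeIds)
    {
        // delete from alf_node where id in (...)
        return sqlSession.delete("alfresco.node.delete_NodesById", nodeIds);
    }

    public int deleteNodePropsByNodeId(List<Long> nodeIds)
    {
        // delete from alf_node_properties where node_id in (...)
        return sqlSession.delete("alfresco.node.delete_NodePropsByNodeId", nodeIds);
    }

    public int deleteUnusedTxnsById(List<Long> txnIds)
    {
        // delete from alf_transaction where id in (...)
        return sqlSession.delete("alfresco.node.delete_Txns_UnusedById", txnIds);
    }
}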

View File

@@ -0,0 +1,33 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" "http://mybatis.org/dtd/mybatis-3-mapper.dtd">
<mapper namespace="alfresco.node.select">
<select id="select_Deleted_NodesByTxnCommitTime" parameterType="TransactionQuery" fetchSize="100000" resultType="java.lang.Long">
select
node.id
from
alf_node node
join alf_transaction txn on (node.transaction_id = txn.id)
where
node.type_qname_id = #{typeQNameId}
<![CDATA[and commit_time_ms < #{maxCommitTime}]]>
</select>
<select id="select_Txns_UnusedByTxnCommitTime" parameterType="TransactionQuery" fetchSize="100000" resultType="java.lang.Long">
select
id
from alf_transaction
where not exists
(
select 1
from
alf_node node
where
node.transaction_id = alf_transaction.id
)
<![CDATA[and commit_time_ms <= #{maxCommitTime}]]>
</select>
</mapper>

View File

@@ -27,5 +27,5 @@
txn.commit_time_ms < #{maxCommitTime})
]]>
</delete>
</mapper>

View File

@@ -0,0 +1,33 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" "http://mybatis.org/dtd/mybatis-3-mapper.dtd">
<mapper namespace="alfresco.node.select">
<select id="select_Deleted_NodesByTxnCommitTime" parameterType="TransactionQuery" fetchSize="-2147483648" resultType="java.lang.Long">
select
node.id
from
alf_node node
join alf_transaction txn on (node.transaction_id = txn.id)
where
node.type_qname_id = #{typeQNameId}
<![CDATA[and commit_time_ms < #{maxCommitTime}]]>
</select>
<select id="select_Txns_UnusedByTxnCommitTime" parameterType="TransactionQuery" fetchSize="-2147483648" resultType="java.lang.Long">
select
id
from alf_transaction
where not exists
(
select 1
from
alf_node node
where
node.transaction_id = alf_transaction.id
)
<![CDATA[and commit_time_ms <= #{maxCommitTime}]]>
</select>
</mapper>
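Note: a fetchSize of -2147483648 is Integer.MIN_VALUE, the value MySQL Connector/J interprets as a request to stream rows one at a time instead of buffering the whole result set; the otherwise identical mapper above uses a plain fetch size of 100000, presumably for dialects that do not need that hint. A minimal JDBC sketch of the same streaming hint, with placeholder connection details not taken from this changeset.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class MySqlStreamingExample
{
    public static void main(String[] args) throws Exception
    {
        try (Connection con = DriverManager.getConnection(
                 "jdbc:mysql://localhost:3306/alfresco", "alfresco", "alfresco"); // placeholders
             PreparedStatement stmt = con.prepareStatement(
                 "select id from alf_transaction",
                 ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY))
        {
            // Integer.MIN_VALUE (-2147483648) tells MySQL Connector/J to stream rows.
            stmt.setFetchSize(Integer.MIN_VALUE);
            try (ResultSet rs = stmt.executeQuery())
            {
                while (rs.next())
                {
                    System.out.println(rs.getLong(1));
                }
            }
        }
    }
}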

View File

@@ -238,8 +238,14 @@
<property name="purgeSize">
<value>${index.tracking.purgeSize}</value>
</property>
<property name="algorithm">
<value>${system.node_table_cleaner.algorithm}</value>
</property>
<property name="deleteBatchSize">
<value>${system.node_cleanup.delete_batchSize}</value>
</property>
</bean>
<!-- String length adjustment -->
<bean id="nodeStringLengthWorker" class="org.alfresco.repo.node.db.NodeStringLengthWorker">
<constructor-arg index="0" ref="nodeDAO" />

View File

@@ -3,7 +3,7 @@
repository.name=Main Repository
# Schema number
version.schema=16000
version.schema=16200
# Directory configuration
@@ -1246,6 +1246,11 @@ system.delete_not_exists.read_only=false
system.delete_not_exists.timeout_seconds=-1
system.prop_table_cleaner.algorithm=V2
# --Node cleanup batch - default settings
system.node_cleanup.delete_batchSize=1000
system.node_table_cleaner.algorithm=V1
# Configure the system-wide (ACS) settings for direct access urls.
#
# For Direct Access URLs to be usable on the service-layer, the feature must be enabled both system-wide and on the
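Note: two new cleanup settings arrive alongside the schema bump from 16000 to 16200: system.node_table_cleaner.algorithm selects the cleanup implementation (V1 by default) and system.node_cleanup.delete_batchSize caps how many rows each delete batch touches. If they need tuning in a deployment, the usual place is alfresco-global.properties (assumed here as the override mechanism); the values below are illustrative only.

# Illustrative overrides (alfresco-global.properties assumed as the override file)
system.node_table_cleaner.algorithm=V2
system.node_cleanup.delete_batchSize=500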

View File

@@ -87,7 +87,8 @@ import org.junit.runners.Suite;
org.alfresco.repo.node.cleanup.TransactionCleanupTest.class,
org.alfresco.repo.security.person.GetPeopleCannedQueryTest.class,
org.alfresco.repo.domain.schema.script.DeleteNotExistsExecutorTest.class
org.alfresco.repo.domain.schema.script.DeleteNotExistsExecutorTest.class,
org.alfresco.repo.node.cleanup.DeletedNodeBatchCleanupTest.class
})
public class AllDBTestsTestSuite
{

View File

@@ -84,6 +84,7 @@ import org.junit.runners.Suite;
org.alfresco.repo.node.archive.ArchiveAndRestoreTest.class,
org.alfresco.repo.node.db.DbNodeServiceImplTest.class,
org.alfresco.repo.node.cleanup.TransactionCleanupTest.class,
org.alfresco.repo.node.cleanup.DeletedNodeBatchCleanupTest.class,
org.alfresco.repo.node.db.DbNodeServiceImplPropagationTest.class,
})
public class AppContext03TestSuite

View File

@@ -57,7 +57,7 @@ import org.alfresco.service.cmr.workflow.WorkflowAdminService;
import org.alfresco.service.descriptor.Descriptor;
import org.alfresco.service.transaction.TransactionService;
import org.alfresco.traitextender.SpringExtensionBundle;
import org.apache.commons.dbcp.BasicDataSource;
import org.apache.commons.dbcp2.BasicDataSource;
import org.junit.Before;
import org.junit.Test;

View File

@@ -32,7 +32,7 @@ import org.alfresco.heartbeat.jobs.HeartBeatJobScheduler;
import org.alfresco.repo.descriptor.DescriptorDAO;
import org.alfresco.service.cmr.repository.HBDataCollectorService;
import org.alfresco.service.descriptor.Descriptor;
import org.apache.commons.dbcp.BasicDataSource;
import org.apache.commons.dbcp2.BasicDataSource;
import org.junit.Before;
import org.junit.Test;

View File

@@ -0,0 +1,366 @@
/*
* #%L
* Alfresco Repository
* %%
* Copyright (C) 2005 - 2022 Alfresco Software Limited
* %%
* This file is part of the Alfresco software.
* If the software was purchased under a paid Alfresco license, the terms of
* the paid license agreement will prevail. Otherwise, the software is
* provided under the following open source license terms:
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
* #L%
*/
package org.alfresco.repo.node.cleanup;
import static java.util.stream.Collectors.toList;
import static java.util.stream.Stream.of;
import javax.transaction.UserTransaction;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.IntStream;
import java.util.stream.Stream;
import org.alfresco.model.ContentModel;
import org.alfresco.repo.cache.SimpleCache;
import org.alfresco.repo.domain.node.NodeDAO;
import org.alfresco.repo.domain.node.Transaction;
import org.alfresco.repo.domain.node.ibatis.NodeDAOImpl;
import org.alfresco.repo.node.db.DeletedNodeCleanupWorker;
import org.alfresco.repo.transaction.AlfrescoTransactionSupport;
import org.alfresco.repo.transaction.RetryingTransactionHelper;
import org.alfresco.repo.transaction.RetryingTransactionHelper.RetryingTransactionCallback;
import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.repository.NodeService;
import org.alfresco.service.cmr.repository.StoreRef;
import org.alfresco.service.cmr.search.SearchService;
import org.alfresco.service.cmr.security.AuthenticationService;
import org.alfresco.service.namespace.NamespaceService;
import org.alfresco.service.namespace.QName;
import org.alfresco.service.transaction.TransactionService;
import org.alfresco.test_category.OwnJVMTestsCategory;
import org.alfresco.util.BaseSpringTest;
import org.alfresco.util.testing.category.DBTests;
import org.junit.Before;
import org.junit.Test;
import org.junit.experimental.categories.Category;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.extensions.webscripts.GUID;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.annotation.DirtiesContext.ClassMode;
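/**
 * Tests for the batched deleted-node cleanup added in this change. The
 * DeletedNodeCleanupWorker is configured with algorithm "V2" and a delete
 * batch size, and the tests check that deleted nodes are purged, that nodes
 * newer than the minimum purge age are left alone, and that transactions no
 * longer referenced by any node are removed.
 */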
@Category({ OwnJVMTestsCategory.class, DBTests.class })
@DirtiesContext(classMode = ClassMode.BEFORE_EACH_TEST_METHOD)
public class DeletedNodeBatchCleanupTest extends BaseSpringTest
{
@Autowired
private AuthenticationService authenticationService;
@Autowired
private NodeDAO nodeDAO;
@Autowired
@Qualifier("node.nodesSharedCache")
private SimpleCache<Serializable, Serializable> nodesCache;
@Autowired
private DeletedNodeCleanupWorker worker;
@Autowired
private NamespaceService namespaceService;
@Autowired
private TransactionService transactionService;
@Autowired
private NodeService nodeService;
@Autowired
private SearchService searchService;
private RetryingTransactionHelper helper;
private List<NodeRef> testNodes;
@Before
public void before()
{
helper = transactionService.getRetryingTransactionHelper();
authenticationService.authenticate("admin", "admin".toCharArray());
resetWorkerConfig();
// create 5 test nodes
final NodeRef companyHome = getCompanyHome();
testNodes = IntStream.range(0, 5)
.mapToObj(i -> helper.doInTransaction(createNodeCallback(companyHome), false, true))
.collect(toList());
// clean up pre-existing data
helper.doInTransaction(() -> worker.doClean(), false, true);
}
private void resetWorkerConfig()
{
worker.setMinPurgeAgeDays(0);
worker.setAlgorithm("V2");
worker.setDeleteBatchSize(20);
}
private NodeRef getCompanyHome()
{
StoreRef storeRef = new StoreRef(StoreRef.PROTOCOL_WORKSPACE, "SpacesStore");
NodeRef storeRoot = nodeService.getRootNode(storeRef);
List<NodeRef> nodeRefs = searchService.selectNodes(storeRoot, "/app:company_home", null, namespaceService,
false);
return nodeRefs.get(0);
}
private RetryingTransactionCallback<NodeRef> createNodeCallback(NodeRef companyHome)
{
return () -> nodeService.createNode(
companyHome, ContentModel.ASSOC_CONTAINS, QName.createQName("test", GUID.generate()),
ContentModel.TYPE_CONTENT).getChildRef();
}
private void deleteNodes(NodeRef nodeRef, NodeRef... additionalNodeRefs)
{
Stream.concat(of(nodeRef), of(additionalNodeRefs))
.forEach(this::deleteNode);
}
private void deleteNode(NodeRef nodeRef)
{
helper.doInTransaction(new DeleteNode(nodeRef), false, true);
}
@Test
public void testPurgeNodesDeleted()
{
final NodeRef nodeRef4 = getNode(4);
final NodeRef nodeRef5 = getNode(5);
// delete nodes 4 and 5
deleteNodes(nodeRef4, nodeRef5);
// double-check that nodes 4 and 5 are present in deleted form
nodesCache.clear();
assertTrue("Node 4 is deleted but not purged", nodeDAO.getNodeRefStatus(nodeRef4).isDeleted());
assertTrue("Node 5 is deleted but not purged", nodeDAO.getNodeRefStatus(nodeRef5).isDeleted());
worker.doClean();
// verify that nodes 4 and 5 were purged
nodesCache.clear();
assertNull("Node 4 was not cleaned up", nodeDAO.getNodeRefStatus(nodeRef4));
assertNull("Node 5 was not cleaned up", nodeDAO.getNodeRefStatus(nodeRef5));
}
@Test
public void testNodesDeletedNotPurgedWhenNotAfterPurgeAge()
{
final NodeRef nodeRef1 = getNode(1);
final NodeRef nodeRef2 = getNode(2);
// delete nodes 1 and 2
deleteNodes(nodeRef1, nodeRef2);
// double-check that nodes 1 and 2 are present in deleted form
nodesCache.clear();
assertTrue("Node 1 is deleted but not purged", nodeDAO.getNodeRefStatus(nodeRef1).isDeleted());
assertTrue("Node 2 is deleted but not purged", nodeDAO.getNodeRefStatus(nodeRef2).isDeleted());
// run the worker
worker.setMinPurgeAgeDays(1);
worker.doClean();
// verify that nodes 1 and 2 were not purged
nodesCache.clear();
assertNotNull("Node 1 was cleaned up", nodeDAO.getNodeRefStatus(nodeRef1));
assertNotNull("Node 2 was cleaned up", nodeDAO.getNodeRefStatus(nodeRef2));
}
@Test
public void testPurgeUnusedTransactions() throws Exception
{
// Execute transactions that update a number of nodes. For nodeRef1, all but the last txn will be unused.
final long start = System.currentTimeMillis();
final Long minTxnId = nodeDAO.getMinTxnId();
final Map<NodeRef, List<String>> txnIds = createTransactions();
final List<String> txnIds1 = txnIds.get(getNode(1));
final List<String> txnIds2 = txnIds.get(getNode(2));
final List<String> txnIds3 = txnIds.get(getNode(3));
// Double-check that n4 and n5 are present in deleted form
nodesCache.clear();
UserTransaction txn = transactionService.getUserTransaction(true);
txn.begin();
try
{
assertTrue("Node 4 is deleted but not purged", nodeDAO.getNodeRefStatus(getNode(4)).isDeleted());
assertTrue("Node 5 is deleted but not purged", nodeDAO.getNodeRefStatus(getNode(5)).isDeleted());
}
finally
{
txn.rollback();
}
// run the transaction cleaner
worker.doClean();
// Get transactions committed after the test started
RetryingTransactionHelper.RetryingTransactionCallback<List<Transaction>> getTxnsCallback = () -> ((NodeDAOImpl) nodeDAO).selectTxns(
start, Long.MAX_VALUE, Integer.MAX_VALUE, null, null, true);
List<Transaction> txns = transactionService.getRetryingTransactionHelper()
.doInTransaction(getTxnsCallback, true, false);
List<String> expectedUnusedTxnIds = new ArrayList<>(10);
expectedUnusedTxnIds.addAll(txnIds1.subList(0, txnIds1.size() - 1));
List<String> expectedUsedTxnIds = new ArrayList<>(5);
expectedUsedTxnIds.add(txnIds1.get(txnIds1.size() - 1));
expectedUsedTxnIds.addAll(txnIds2);
expectedUsedTxnIds.addAll(txnIds3);
// 4 and 5 should not be in the list because they are deletes
// check that the correct transactions have been purged i.e. all except the last one to update the node
// i.e. in this case, all but the last one in txnIds1
List<String> unusedTxnsNotPurged = expectedUnusedTxnIds.stream()
.filter(txnId -> containsTransaction(txns, txnId))
.collect(toList());
if (!unusedTxnsNotPurged.isEmpty())
{
fail("Unused transaction(s) were not purged: " + unusedTxnsNotPurged);
}
long numFoundUnusedTxnIds = expectedUnusedTxnIds.stream()
.filter(txnId -> !containsTransaction(txns, txnId))
.count();
assertEquals(9, numFoundUnusedTxnIds);
// check that the correct transactions remain i.e. all those in txnIds2, txnIds3, txnIds4 and txnIds5
long numFoundUsedTxnIds = expectedUsedTxnIds.stream()
.filter(txnId -> containsTransaction(txns, txnId))
.count();
assertEquals(3, numFoundUsedTxnIds);
// Get transactions committed after the test started
RetryingTransactionHelper.RetryingTransactionCallback<List<Long>> getTxnsUnusedCallback = () -> nodeDAO.getTxnsUnused(
minTxnId, Long.MAX_VALUE, Integer.MAX_VALUE);
List<Long> txnsUnused = transactionService.getRetryingTransactionHelper()
.doInTransaction(getTxnsUnusedCallback, true, false);
assertEquals(0, txnsUnused.size());
// Double-check that n4 and n5 were removed as well
nodesCache.clear();
assertNull("Node 4 was not cleaned up", nodeDAO.getNodeRefStatus(getNode(4)));
assertNull("Node 5 was not cleaned up", nodeDAO.getNodeRefStatus(getNode(5)));
}
private boolean containsTransaction(List<Transaction> txns, String txnId)
{
return txns.stream()
.map(Transaction::getChangeTxnId)
.filter(changeTxnId -> changeTxnId.equals(txnId))
.map(match -> true)
.findFirst()
.orElse(false);
}
private Map<NodeRef, List<String>> createTransactions()
{
Map<NodeRef, List<String>> txnIds = new HashMap<>();
UpdateNode updateNode1 = new UpdateNode(getNode(1));
UpdateNode updateNode2 = new UpdateNode(getNode(2));
UpdateNode updateNode3 = new UpdateNode(getNode(3));
DeleteNode deleteNode4 = new DeleteNode(getNode(4));
DeleteNode deleteNode5 = new DeleteNode(getNode(5));
List<String> txnIds1 = new ArrayList<>();
List<String> txnIds2 = new ArrayList<>();
List<String> txnIds3 = new ArrayList<>();
List<String> txnIds4 = new ArrayList<>();
List<String> txnIds5 = new ArrayList<>();
txnIds.put(getNode(1), txnIds1);
txnIds.put(getNode(2), txnIds2);
txnIds.put(getNode(3), txnIds3);
txnIds.put(getNode(4), txnIds4);
txnIds.put(getNode(5), txnIds5);
for (int i = 0; i < 10; i++)
{
String txnId1 = helper.doInTransaction(updateNode1, false, true);
txnIds1.add(txnId1);
if (i == 0)
{
String txnId2 = helper.doInTransaction(updateNode2, false, true);
txnIds2.add(txnId2);
}
if (i == 1)
{
String txnId3 = helper.doInTransaction(updateNode3, false, true);
txnIds3.add(txnId3);
}
}
String txnId4 = helper.doInTransaction(deleteNode4, false, true);
txnIds4.add(txnId4);
String txnId5 = helper.doInTransaction(deleteNode5, false, true);
txnIds5.add(txnId5);
return txnIds;
}
private class UpdateNode implements RetryingTransactionHelper.RetryingTransactionCallback<String>
{
private final NodeRef nodeRef;
UpdateNode(NodeRef nodeRef)
{
this.nodeRef = nodeRef;
}
@Override
public String execute() throws Throwable
{
nodeService.setProperty(nodeRef, ContentModel.PROP_NAME, GUID.generate());
return AlfrescoTransactionSupport.getTransactionId();
}
}
private class DeleteNode implements RetryingTransactionHelper.RetryingTransactionCallback<String>
{
private final NodeRef nodeRef;
DeleteNode(NodeRef nodeRef)
{
this.nodeRef = nodeRef;
}
@Override
public String execute() throws Throwable
{
nodeService.addAspect(nodeRef, ContentModel.ASPECT_TEMPORARY, null);
nodeService.deleteNode(nodeRef);
return AlfrescoTransactionSupport.getTransactionId();
}
}
private NodeRef getNode(int i)
{
return testNodes.get(i - 1);
}
}

View File

@@ -110,6 +110,8 @@ public class TransactionCleanupTest
this.nodesCache = (SimpleCache<Serializable, Serializable>) ctx.getBean("node.nodesSharedCache");
this.worker = (DeletedNodeCleanupWorker)ctx.getBean("nodeCleanup.deletedNodeCleanup");
this.worker.setMinPurgeAgeDays(0);
this.worker.setAlgorithm("V1");
this.helper = transactionService.getRetryingTransactionHelper();
authenticationService.authenticate("admin", "admin".toCharArray());

View File

@@ -37,7 +37,7 @@
</bean>
<!-- dummy -->
<bean id="defaultDataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
<bean id="defaultDataSource" class="org.apache.commons.dbcp2.BasicDataSource" destroy-method="close">
</bean>
<bean id="dataSource" class="org.alfresco.config.JndiObjectFactoryBean">

View File

@@ -1,80 +1,80 @@
<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE beans PUBLIC '-//SPRING//DTD BEAN//EN' 'http://www.springframework.org/dtd/spring-beans.dtd'>
<beans>
<import resource="classpath:alfresco/extension/dev-context.xml" />
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource"
destroy-method="close">
<property name="driverClassName">
<value>${db.driver}</value>
</property>
<property name="url">
<value>${db.url}</value>
</property>
<property name="username">
<value>${db.username}</value>
</property>
<property name="password">
<value>${db.password}</value>
</property>
<property name="initialSize">
<value>${db.pool.initial}</value>
</property>
<property name="maxActive">
<value>${db.pool.max}</value>
</property>
<property name="minIdle">
<value>${db.pool.min}</value>
</property>
<property name="maxIdle">
<value>${db.pool.idle}</value>
</property>
<property name="defaultAutoCommit">
<value>false</value>
</property>
<property name="defaultTransactionIsolation">
<value>${db.txn.isolation}</value>
</property>
<property name="maxWait">
<value>${db.pool.wait.max}</value>
</property>
<property name="validationQuery">
<value>${db.pool.validate.query}</value>
</property>
<property name="timeBetweenEvictionRunsMillis">
<value>${db.pool.evict.interval}</value>
</property>
<property name="minEvictableIdleTimeMillis">
<value>${db.pool.evict.idle.min}</value>
</property>
<property name="testOnBorrow">
<value>${db.pool.validate.borrow}</value>
</property>
<property name="testOnReturn">
<value>${db.pool.validate.return}</value>
</property>
<property name="testWhileIdle">
<value>${db.pool.evict.validate}</value>
</property>
<property name="removeAbandoned">
<value>${db.pool.abandoned.detect}</value>
</property>
<property name="removeAbandonedTimeout">
<value>${db.pool.abandoned.time}</value>
</property>
<property name="poolPreparedStatements">
<value>${db.pool.statements.enable}</value>
</property>
<property name="maxOpenPreparedStatements">
<value>${db.pool.statements.max}</value>
</property>
</bean>
<bean id="transactionManager"
class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="transactionSynchronizationName" value="SYNCHRONIZATION_ALWAYS" />
<property name="dataSource" ref="dataSource" />
</bean>
<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE beans PUBLIC '-//SPRING//DTD BEAN//EN' 'http://www.springframework.org/dtd/spring-beans.dtd'>
<beans>
<import resource="classpath:alfresco/extension/dev-context.xml" />
<bean id="dataSource" class="org.apache.commons.dbcp2.BasicDataSource"
destroy-method="close">
<property name="driverClassName">
<value>${db.driver}</value>
</property>
<property name="url">
<value>${db.url}</value>
</property>
<property name="username">
<value>${db.username}</value>
</property>
<property name="password">
<value>${db.password}</value>
</property>
<property name="initialSize">
<value>${db.pool.initial}</value>
</property>
<property name="maxTotal">
<value>${db.pool.max}</value>
</property>
<property name="minIdle">
<value>${db.pool.min}</value>
</property>
<property name="maxIdle">
<value>${db.pool.idle}</value>
</property>
<property name="defaultAutoCommit">
<value>false</value>
</property>
<property name="defaultTransactionIsolation">
<value>${db.txn.isolation}</value>
</property>
<property name="maxWaitMillis">
<value>${db.pool.wait.max}</value>
</property>
<property name="validationQuery">
<value>${db.pool.validate.query}</value>
</property>
<property name="timeBetweenEvictionRunsMillis">
<value>${db.pool.evict.interval}</value>
</property>
<property name="minEvictableIdleTimeMillis">
<value>${db.pool.evict.idle.min}</value>
</property>
<property name="testOnBorrow">
<value>${db.pool.validate.borrow}</value>
</property>
<property name="testOnReturn">
<value>${db.pool.validate.return}</value>
</property>
<property name="testWhileIdle">
<value>${db.pool.evict.validate}</value>
</property>
<property name="removeAbandonedOnBorrow">
<value>${db.pool.abandoned.detect}</value>
</property>
<property name="removeAbandonedTimeout">
<value>${db.pool.abandoned.time}</value>
</property>
<property name="poolPreparedStatements">
<value>${db.pool.statements.enable}</value>
</property>
<property name="maxOpenPreparedStatements">
<value>${db.pool.statements.max}</value>
</property>
</bean>
<bean id="transactionManager"
class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="transactionSynchronizationName" value="SYNCHRONIZATION_ALWAYS" />
<property name="dataSource" ref="dataSource" />
</bean>
</beans>

View File

@@ -6,7 +6,7 @@
<import resource="classpath:test/alfresco/test-context.xml" />
<!-- Datasource bean -->
<bean id="testDataSource" class="org.apache.commons.dbcp.BasicDataSource"
<bean id="testDataSource" class="org.apache.commons.dbcp2.BasicDataSource"
destroy-method="close">
<property name="driverClassName">
<value>${db.driver}</value>
@@ -23,7 +23,7 @@
<property name="initialSize">
<value>${db.pool.initial}</value>
</property>
<property name="maxActive">
<property name="maxTotal">
<value>${db.pool.max}</value>
</property>
<property name="minIdle">
@@ -38,7 +38,7 @@
<property name="defaultTransactionIsolation">
<value>${db.txn.isolation}</value>
</property>
<property name="maxWait">
<property name="maxWaitMillis">
<value>${db.pool.wait.max}</value>
</property>
<property name="validationQuery">
@@ -59,7 +59,7 @@
<property name="testWhileIdle">
<value>${db.pool.evict.validate}</value>
</property>
<property name="removeAbandoned">
<property name="removeAbandonedOnBorrow">
<value>${db.pool.abandoned.detect}</value>
</property>
<property name="removeAbandonedTimeout">