[GRADLE-1697] 1.0-milestone-4 a lot slower than 1.0-milestone-3 Created: 28/Jul/11  Updated: 04/Jan/13  Resolved: 08/Oct/11

Status: Resolved
Project: Gradle
Affects Version/s: 1.0-milestone-4
Fix Version/s: 1.0-milestone-5

Type: Bug
Reporter: Russel Winder Assignee: Adam Murdoch
Resolution: Fixed Votes: 4

Attachments: HTML File milestone-3.html     HTML File milestone-4.html    
Issue Links:
Duplicates GRADLE-1714 Remote repositories hit on every buil... Resolved


The project in question is my ADS library – Bazaar branch available for branching at http://www.russel.org.uk/Bazaar/ADS.

Using Milestone 3 and running "gradlew clean" a number of times to ensure all the dependencies were in the Gradle cache and the cache itself was up to date, cleaning an already-clean project took about 10 s. With Milestone 4, following the same procedure of running many times so that everything was cached, cleaning an already-clean project took 26 s. These timings are rough averages of two or three runs, so anecdotal rather than statistically significant. In both cases there was about 80 kbps of network traffic throughout. NB: this is working over an NFS-mounted filestore.
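Rough averages like the ones above can be made a little more repeatable by scripting the runs. A minimal sketch, assuming GNU `date` (for millisecond precision via `%N`) and a warmed dependency cache; `avg_ms` is a hypothetical helper, not part of Gradle:

```shell
# Average the wall-clock time of N runs of a command, in milliseconds.
# Usage: avg_ms N cmd [args...]   e.g.  avg_ms 3 ./gradlew clean
# Assumes GNU date with %N (millisecond resolution).
avg_ms() {
  n=$1; shift
  total=0
  i=1
  while [ "$i" -le "$n" ]; do
    start=$(date +%s%3N)            # epoch time in milliseconds
    "$@" >/dev/null 2>&1            # run the command, discard its output
    end=$(date +%s%3N)
    total=$((total + end - start))
    i=$((i + 1))
  done
  echo $((total / n))               # print the average in ms
}
```

Running `avg_ms 3 ./gradlew clean` in the project directory then gives one number per Gradle version to compare.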

It therefore seems that Milestone 4 has to do a great deal more work than Milestone 3 did to determine that there is nothing to be done.

Comment by Szczepan Faber [ 28/Jul/11 ]

I did some tests with the java multiproject sample and got similar results. It seems that project configuration is much slower than it used to be.

Comment by Adam Murdoch [ 28/Jul/11 ]

Do you see a similar regression when building? Or just when cleaning?

Comment by Russel Winder [ 29/Jul/11 ]

ADS is a single module project. With Milestone 4 I get:

clean: 1m26s
clean: 1m18s
classes: 1m03s
clean: 50s

I think, though, that I changed something affecting performance between the second clean and classes. The "> Loading" marker is visible for a very long time in all cases. With Milestone 3, I get:

clean: 13s
clean: 10s
classes: 22s
clean: 10s

"> Loading" shows for a lot less time, and all the other labels appear to show for less time as well. So I would hypothesize that loading is significantly slower, and that all activities are slower. I went back to Milestone 4:

clean: 53s
clean: 51s
classes: 1m03s
clean: 53s

I think there are two issues: firstly, Milestone 4 has serious performance problems compared to Milestone 3; and secondly, there should perhaps be some performance benchmark checks in Gradle's integration/system tests, to catch regressions like this.

I'll make sure I can run these tests again in the future, so as to be able to provide some "user acceptance tests".

Comment by Magnus Rundberget [ 29/Jul/11 ]

Hi !

I've experienced similar (or even worse) results with the switch to milestone 4
(see the --profile reports in the attached milestone-3.html and milestone-4.html).
(PS: I had to do the verification without tests, because my test resources won't show up on the classpath with milestone-4.)

Looking at the log (-d), it seems ivy is spending a whole lot of time doing remote "things", like checking the pom and downloading sha1 and md5 files (see the log snippet below for just one of the dependencies in my build).

22:04:03.221 [DEBUG] [org.gradle.api.internal.artifacts.ivyservice.IvyLoggingAdaper] post 1.3 ivy file: using exact as default matcher
22:04:03.241 [DEBUG] [org.gradle.api.internal.artifacts.ivyservice.IvyLoggingAdaper] found ivy file in cache for org.apache.tapestry#tapestry-ioc;5.2.6 (resolved by MavenRepo): /Users/magnus/.gradle/caches/artifacts/org.apache.tapestry/tapestry-ioc/c12498cf18507aa6433a94eb7d3e77d5/ivy-5.2.6.xml
22:04:03.245 [DEBUG] [org.gradle.api.internal.artifacts.ivyservice.IvyLoggingAdaper] trying http://repo1.maven.org/maven2/org/apache/tapestry/tapestry-ioc/5.2.6/tapestry-ioc-5.2.6.pom
22:04:03.246 [DEBUG] [org.gradle.api.internal.artifacts.ivyservice.IvyLoggingAdaper] tried http://repo1.maven.org/maven2/org/apache/tapestry/tapestry-ioc/5.2.6/tapestry-ioc-5.2.6.pom
22:04:03.458 [DEBUG] [org.gradle.api.internal.artifacts.ivyservice.IvyLoggingAdaper] Retrieving SHA-1 using: 'http://repo1.maven.org/maven2/org/apache/tapestry/tapestry-ioc/5.2.6/tapestry-ioc-5.2.6.pom.sha1'
22:04:03.707 [DEBUG] [org.gradle.api.internal.artifacts.ivyservice.IvyLoggingAdaper] Retrieving MD5 using: 'http://repo1.maven.org/maven2/org/apache/tapestry/tapestry-ioc/5.2.6/tapestry-ioc-5.2.6.pom.md5'
22:04:03.951 [DEBUG] [org.gradle.api.internal.artifacts.ivyservice.IvyLoggingAdaper] MavenRepo: found md file for org.apache.tapestry#tapestry-ioc;5.2.6
22:04:03.951 [DEBUG] [org.gradle.api.internal.artifacts.ivyservice.IvyLoggingAdaper] => http://repo1.maven.org/maven2/org/apache/tapestry/tapestry-ioc/5.2.6/tapestry-ioc-5.2.6.pom (5.2.6)
22:04:03.952 [DEBUG] [org.gradle.api.internal.artifacts.ivyservice.IvyLoggingAdaper] parser = pom parser
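The chattiness can be made concrete by counting remote-repository requests in a captured debug log, e.g. one produced with `./gradlew clean -d > build.log 2>&1`. A minimal sketch, matching the log format shown above; `count_remote_hits` is a hypothetical helper, not part of Gradle:

```shell
# Count remote-repository requests in a Gradle debug log (-d output),
# matching the "trying http", "Retrieving SHA-1" and "Retrieving MD5"
# lines shown in the snippet above.
count_remote_hits() {
  grep -cE 'trying http|Retrieving SHA-1|Retrieving MD5' "$1"
}
```

Comparing the count for a no-op build under m3 and m4 would show whether the extra time is going into remote checks.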

Comment by Pivotal Tracker Integration [ 01/Aug/11 ]

A Pivotal Tracker story has been created for this Issue: http://www.pivotaltracker.com/story/show/16488717

Comment by Russel Winder [ 02/Aug/11 ]

Pivotal Tracker reports "You are not allowed to view this page."

Comment by Szczepan Faber [ 04/Aug/11 ]

My findings: https://spreadsheets.google.com/spreadsheet/ccc?key=0AqCbkJHaLQSsdDRZaXBjZUV2Y1hNREtCUWF2N1lvTFE&hl=en_US#gid=0

Comment by Peter Niederwieser [ 04/Aug/11 ]

Here are the numbers for the Spock build:

Clean build (without running tests, all deps already downloaded) takes 38s with m3, and 107s with m4.
Up-to-date build takes 10s with m3, and 100s with m4.

Comment by Adam Murdoch [ 14/Aug/11 ]

See https://issues.jfrog.org/jira/browse/WHARF-27 and https://issues.jfrog.org/jira/browse/WHARF-28.

Comment by Peter Niederwieser [ 06/Oct/11 ]

I tried to reproduce the gradle clean performance regression on the ADS project, but without success. Tried with gradle and gradlew, m3, m4, and m5-SNAPSHOT, with and without network access. Clean time was always between 4 and 5 seconds, with m4 and m5-SNAPSHOT being slightly slower than m3.

Comment by Peter Niederwieser [ 06/Oct/11 ]

Similar result for gradle clean build -x checkstyleMain -x checkstyleTest (again on the ADS project). Always between 15 and 20 seconds, with m5-SNAPSHOT being slightly slower than m3 and m4.

Comment by Russel Winder [ 07/Oct/11 ]

Perhaps all the remaining problems are related to NFS rather than any other factor? I assume your experiments used a local disc rather than an NFS-mounted one?

Certainly on my laptop I see:

> gradlew clean
Total time: 4.733 secs

> gradlew clean
Total time: 4.608 secs

> gradle clean
Total time: 5.717 secs

> gradle clean
Total time: 4.589 secs


whereas on the workstation I see:

> gradlew clean
Total time: 9.181 secs

> gradlew clean
Total time: 4.823 secs

> gradlew clean
Total time: 4.544 secs


I am only using Gradle HEAD as built each morning, or the version of Gradle used by Gradle to build itself, so I am not really in a position to experiment further.

In any event, I am not seeing any overly irritating problems any more.

Comment by Adam Murdoch [ 07/Oct/11 ]

@russel, can we close this, then?

Comment by Russel Winder [ 07/Oct/11 ]

I have moved on to milestone-5, which seems to have cured for me the issues that Peter cannot reproduce with milestone-4. As I have no intention of reverting to milestone-3 or milestone-4, it sounds like this report has served its useful purpose.

Comment by Magnus Rundberget [ 07/Oct/11 ]


I also commented on this issue when it was created, so I thought I'd chip in my findings.
(all numbers with already cached artifacts).

The following project, git@github.com:rundis/TapestryGorm.git (quite a few dependencies, but few snapshots):
Milestone 3: 59s
(Milestone 4: 1m 57s)
Milestone 5 (27.sep): 1m 6s

A multiproject with a few snapshot dependencies https://github.com/rundis/Dumpling
Milestone 3: 30s
(milestone 4: 1m 6s)
Milestone 5 (27.sep): 41 s
Milestone 5 offline: 34 s (offline didn't work for milestone 3)

So the rather hopeless performance of milestone 4 is certainly addressed, but as you can see from my second sample project, it is still not up to milestone 3 performance-wise.

I haven't dug into the details, but I guess it is still a bit chatty in terms of dependency checking, and that's why it's not quite there yet?

Generated at Wed Jun 30 12:02:03 CDT 2021 using Jira 8.4.2#804003-sha1:d21414fc212e3af190e92c2d2ac41299b89402cf.