MindGamer's Achievements

  1. I have a lot of pages with different interactive applications that use tweens but aren't part of a timeline. If all of those tweening elements were contained within a parent div, like this: <div id="gsaparea"> <!-- all active tweens live here --> </div> Is it possible to kill all tweens on elements that are children of that div, by referencing the div's id somehow? What I'm hoping for is a function like killTweensInside("divname"); Is that possible? Thanks in advance.
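A sketch of what that helper could look like. In GSAP 3, `gsap.killTweensOf()` accepts selector text, so something like `gsap.killTweensOf("#gsaparea *")` may already cover this; the standalone function below just models the same idea against a plain array of tween records (the registry shape and the `containerId` field are my own invention for illustration, not a GSAP API):

```javascript
// Hypothetical tween registry: each record notes which container its
// target lives in. killTweensInside() drops every tween whose target
// sits inside the named container and reports how many were killed.
function killTweensInside(containerId, tweens) {
  const survivors = tweens.filter((t) => t.containerId !== containerId);
  return { survivors, killed: tweens.length - survivors.length };
}
```

In real page code the closest equivalent would likely be `gsap.killTweensOf(gsap.utils.toArray("#gsaparea *"))`, which kills the tweens of every descendant element of that div.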
  2. Yes, you're right, I should have been more clear. As you said, for Cumulative Layout Shift, only animations that affect height/width would make a difference. The issue seems to be on the paint side of things (though it remains to be seen whether they'll actually penalize us for it). The issue is more LCP. If a DOM element is (for example) rotating, it looks like Google treats it as "still painting" from what I can see.
  3. Ok, thanks. Unfortunately I think this is going to be a problem with DOM animation of all kinds. It seems that an animated DOM element is an "unfinished layout" from Google's perspective. Three things that I'm playing with that may help other users... it remains to be seen what works, but these may be worth a try:
     1) Largest Contentful Paint measures the render time of the largest element on the page. A placeholder loading image in the largest element (rather than having the element hidden or empty while loading) may 'count' as being already painted.
     2) Delaying animation for a full second or two after DOMContentLoaded might also serve to separate the animation from the initial content paint metric. (This seems a little hacky though.)
     3) Not a great solution, but maybe necessary: start all animations on a user interaction and not automatically.
     Hopefully Google offers some better guidance on this; otherwise, page animations on load might become a liability for SEO.
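A minimal sketch of idea #3 above: gate the animation start on the first user interaction. The event list and helper name are my choices, not a GSAP API; `start` would be whatever function kicks off the timeline, and the listener target is passed in (it would be `window` in a real page):

```javascript
// Register a one-shot start callback on several interaction events.
// { once: true } makes each listener self-removing after it fires.
function onFirstInteraction(target, start) {
  const opts = { once: true, passive: true };
  ["pointerdown", "keydown", "scroll"].forEach((evt) =>
    target.addEventListener(evt, start, opts)
  );
}
```

In practice you'd also guard `start` with a "started" flag, since `{ once: true }` only de-duplicates per event type — a pointerdown and a later scroll could each fire it once.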
  4. I'm noticing that many (if not most) of the pages where I have used GSAP are getting warnings in Google's Search Console under the newly added "Core Web Vitals" metrics. Metrics like First Contentful Paint, Largest Contentful Paint, and Cumulative Layout Shift are all easy to impact using GSAP. Are there some general guidelines or rules of thumb that we should be following? Currently I load the main application from a jQuery .ready() handler. Apparently that's no bueno. But I'm not sure what the solution is, because if I load it sooner it's render-blocking JavaScript, and if I load it later it's a slow LCP. Is there a best practice? Google isn't using CWV as ranking factors yet, but they say they plan to (probably not this year, I would imagine, as they've just introduced the new metrics). I'm hoping to get my ducks in a row before this impacts my SEO. Thanks
  5. Very general question here: I'm in mid-production on the homepage for http://wordmetrics.com and I'm thinking I've probably crossed a line in the sand in terms of CPU overhead. (It's live for now, but it's still a work in progress, and mobile isn't really well implemented yet, I know.) I'm thinking I'll probably start/stop animations based on scroll position so that only one area animates at a time (or move over to Canvas if that doesn't work), but what's the best approach to getting a read on "how much is too much" in terms of overhead? I should probably figure this out before adding another 5 animations to the page! Thanks in advance
  6. IMHO it's not really clear what the problem is. It sounds like the issue could be solved by either modifying the grouping within Affinity prior to SVG export, or doing a replace-all on any positions in your SVG using a text editor. No?
  7. Awesome! That worked! I set ::-webkit-scrollbar-track using a CSS variable, and then tweened the variable with GSAP per your Codepen above. Works perfectly. Scrollbars now looking very sexy in dark mode : ) Thanks!
  8. I just wrote a simple "dark mode" switch for an SPA I'm building and I use GSAP to tween/toggle the application colors from light to dark colors, and back. It's working fine, but I'm having trouble tweening the colors of the scrollbars on the page. I have two divs with overflow-y: scroll; and the initial scrollbar styling is handled in the CSS with the ::-webkit-scrollbar-track pseudo-selector. Unfortunately the scrollbars remain a bright white color after everything else goes "dark" and it ruins the whole effect. I've tried TweenMax.to('::-webkit-scrollbar-track',1,{backgroundColor: "#000000"}); But that doesn't seem to work. And I've tried TweenMax.to('body',1,{scrollbarTrackColor: "#000000"}); ...and no dice. Is there any way to use GSAP to tween the scrollbar colors along with the rest of the page elements I'm adjusting?
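The fix that worked (per the follow-up above) was to route the color through a CSS custom property: the stylesheet reads something like `::-webkit-scrollbar-track { background-color: var(--track-color); }`, and GSAP tweens the variable itself — GSAP can tween CSS custom properties, roughly `gsap.to("html", { "--track-color": "#000000", duration: 1 })` in GSAP 3 syntax. The standalone function below just sketches the per-frame color interpolation such a tween performs; the name `hexLerp` is mine:

```javascript
// Linearly interpolate between two "#rrggbb" colors; t runs 0..1.
// Each channel is extracted by bit-shifting the packed integer.
function hexLerp(from, to, t) {
  const a = parseInt(from.slice(1), 16);
  const b = parseInt(to.slice(1), 16);
  const mix = (shift) =>
    Math.round(((a >> shift) & 255) * (1 - t) + ((b >> shift) & 255) * t);
  return (
    "#" + [16, 8, 0].map((s) => mix(s).toString(16).padStart(2, "0")).join("")
  );
}
```

Each frame, the interpolated value would be written back with `document.documentElement.style.setProperty("--track-color", value)`, and the scrollbar picks it up through the `var()` reference.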
  9. Before DrawSVG, it was maddening trying to work with SVGs across browsers.
  10. Looks silky smooth to me, running in Chrome on my MacBook Pro (2015).
  11. There may be a better way to do this, but I'd probably skip the repeat and the yoyo and call a function onComplete. That function would set a new set of random "to" coordinates and start another tween, which calls the same function onComplete for a new set of randomized "to" coordinates. Etc. Etc. Cool CSS grain btw.
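The loop described above, sketched out. The bounding-box numbers and function names are mine, not from the original thread; the commented-out portion shows where GSAP's `onComplete` would re-enter the loop (GSAP 3 syntax):

```javascript
// Pick a new random destination inside a bounding box. The random
// source is injected so the function stays deterministic in tests.
function nextTarget(random, width = 500, height = 300) {
  return { x: random() * width, y: random() * height };
}

// In page code the recursion would look roughly like this:
// function wander(el) {
//   const { x, y } = nextTarget(Math.random);
//   gsap.to(el, { x, y, duration: 2, onComplete: () => wander(el) });
// }
```

Because each tween schedules its successor in onComplete, the motion runs indefinitely without repeat/yoyo, and every leg gets fresh coordinates.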
  12. This is my workflow as well. Although I do the same thing with Affinity Designer to avoid the Adobe tax. It works perfectly. Grouped layers in Designer get ID'd correctly in the SVG output. Some things don't output well in either Illustrator or Designer though when it comes to SVG output. Layer effects are dicey. Some fill types are quirky. Transform Matrix does odd things. Etc. The trick is to use just simple lines and fills without using more advanced vector features/effects.
  13. MindGamer

    SEO and GSAP

    CSS inside noscript... Good idea. I never thought of that. Thanks.
  14. MindGamer

    SEO and GSAP

    General question about SEO and GSAP: I frequently use GSAP to fade in elements on a page. The elements will typically start with a default CSS opacity of 0. I'll fade the elements in by tweening the opacity to 1 with GSAP once the DOM has loaded (using jQuery's .ready() or whatever framework I'm working with). I'm just starting to wonder if I'm taking a hit to SEO when I do this. Does anyone know if this technique constitutes "hidden content"? I know that Google supposedly crawls with Chrome and executes JavaScript along the way, but what I'm not clear about is whether Googlebot takes the requisite time delay / fade-in into account. Does anyone have any experience with Google penalizing a page for hidden content when the content is faded into view after the page load has completed? Thanks
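A sketch of the pattern described in this question, combined with the "CSS inside noscript" suggestion acknowledged in the earlier reply, so content stays visible for users and crawlers when JavaScript is off. The class name and markup are illustrative, not from the original thread:

```html
<style>
  /* Hidden by default; GSAP fades it in after DOM ready. */
  .fade-in { opacity: 0; }
</style>
<noscript>
  <style>
    /* No JS means no tween ever runs, so undo the hiding. */
    .fade-in { opacity: 1; }
  </style>
</noscript>

<div class="fade-in">Content that should stay indexable</div>

<script>
  // GSAP 3 syntax; the original posts used the TweenMax-era API.
  // gsap.to(".fade-in", { opacity: 1, duration: 1 });
</script>
```

The noscript override means the page never ships permanently hidden content, which is the condition the question worries about.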